
/ex2/ex2/costFunctionReg.m

https://bitbucket.org/erogol/machine-learning-coursera-assignment-codes
MATLAB
function [J, grad] = costFunctionReg(theta, X, y, lambda)
%COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
%   J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
%   theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta.

% Hypothesis for every training example
h = sigmoid(X * theta);

% Cross-entropy cost plus the L2 penalty; the bias term theta(1) is
% excluded from regularization by subtracting its square back out
J = (-1/m) * sum(y .* log(h) + (1 - y) .* log(1 - h)) ...
    + (lambda/(2*m)) * (sum(theta.^2) - theta(1)^2);

% Gradient: the bias term is not regularized
grad(1) = (1/m) * sum((h - y) .* X(:,1));
for i = 2:numel(theta)
    grad(i) = (1/m) * sum((h - y) .* X(:,i)) + (lambda/m) * theta(i);
end
% =============================================================

end
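To sanity-check the math, here is a rough NumPy port of the same computation (function and variable names are illustrative, not part of the original assignment). It vectorizes the gradient loop with a single matrix product and skips the bias term in the penalty, exactly as the Octave code does:

```python
import numpy as np

def sigmoid(z):
    """Element-wise logistic function."""
    return 1.0 / (1.0 + np.exp(-z))

def cost_function_reg(theta, X, y, lam):
    """Regularized logistic-regression cost and gradient.

    theta: (n,) parameter vector, theta[0] is the bias
    X:     (m, n) design matrix with a leading column of ones
    y:     (m,) labels in {0, 1}
    lam:   regularization strength lambda
    """
    m = len(y)
    h = sigmoid(X @ theta)  # hypothesis for all m examples

    # Cross-entropy term plus L2 penalty that excludes the bias theta[0]
    J = (-1.0 / m) * np.sum(y * np.log(h) + (1 - y) * np.log(1 - h)) \
        + (lam / (2 * m)) * np.sum(theta[1:] ** 2)

    # Vectorized gradient: (1/m) * X' * (h - y), then add the
    # regularization term to every component except the bias
    grad = (1.0 / m) * (X.T @ (h - y))
    grad[1:] += (lam / m) * theta[1:]
    return J, grad
```

With `theta = 0` the hypothesis is 0.5 for every example, so the cost should be `log(2) ≈ 0.693` regardless of the data, which is a quick way to verify an implementation like this.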