/ex2/ex2/costFunctionReg.m
https://bitbucket.org/erogol/machine-learning-coursera-assignment-codes
function [J, grad] = costFunctionReg(theta, X, y, lambda)
%COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
%   J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
%   theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. the parameters.
% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta.
% Hypothesis values for all m training examples
h = sigmoid(X * theta);

% Regularized cost: the bias parameter theta(1) is excluded from the
% penalty by subtracting its square back out of the full sum of squares.
J = (-1/m) * sum(y .* log(h) + (1 - y) .* log(1 - h)) ...
    + (lambda / (2*m)) * (sum(theta .^ 2) - theta(1)^2);
% Gradient: theta(1) gets no regularization term; every other parameter
% picks up the extra (lambda/m) * theta(i) term.
grad(1) = (1/m) * sum((h - y) .* X(:,1));
for i = 2:numel(theta)
    grad(i) = (1/m) * sum((h - y) .* X(:,i)) + (lambda/m) * theta(i);
end
% =============================================================

end
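
Written out, the cost the code implements is the standard regularized logistic-regression cost from the course, where the bias parameter \theta_0 (theta(1) in the 1-indexed Octave code) is excluded from the penalty:

J(\theta) = \frac{1}{m} \sum_{i=1}^{m} \left[ -y^{(i)} \log h_\theta(x^{(i)}) - \left(1 - y^{(i)}\right) \log\left(1 - h_\theta(x^{(i)})\right) \right] + \frac{\lambda}{2m} \sum_{j=1}^{n} \theta_j^2

with h_\theta(x) = 1 / (1 + e^{-\theta^T x}), which is what the sigmoid(X * theta) call evaluates.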
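
The loop over parameters can also be collapsed into matrix operations. The following is a sketch of an equivalent vectorized body, not the graded solution as submitted; it assumes the same inputs (theta, X, y, lambda, m) and sigmoid.m from the same exercise on the path:

h = sigmoid(X * theta);                  % hypothesis values, m x 1
reg = (lambda / m) * theta;
reg(1) = 0;                              % bias parameter is not regularized
J = (-1/m) * sum(y .* log(h) + (1 - y) .* log(1 - h)) ...
    + (lambda / (2*m)) * (theta' * theta - theta(1)^2);
grad = (1/m) * (X' * (h - y)) + reg;     % n x 1, matches the loop above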
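
A minimal calling sketch with made-up data, assuming costFunctionReg.m and sigmoid.m from the same exercise directory are on the Octave path:

X = [ones(3, 1), [1 2; 2 3; 3 4]];   % 3 examples, intercept column first
y = [0; 1; 1];
initial_theta = zeros(size(X, 2), 1);
lambda = 1;
[J, grad] = costFunctionReg(initial_theta, X, y, lambda);
fprintf('Cost at initial theta: %f\n', J);   % log(2) ~= 0.693 here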