/exercises/mlclass-ex1/gradientDescent.m
function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y);                  % number of training examples
J_history = zeros(num_iters, 1);
num_vars = size(X, 2);          % number of parameters (including intercept)

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCost) and gradient here.
    %

    % Update all parameters simultaneously: every theta(j) must be computed
    % from the same (old) theta, so the new values go into a temporary
    % vector first and are copied back only after the loop.
    theta_tmp = zeros(num_vars, 1);
    for j = 1:num_vars
        theta_tmp(j) = theta(j) - (alpha / m) * sum(((X * theta) - y) .* X(:, j));
    end
    theta = theta_tmp;

    % Equivalent vectorized update, by @dnene:
    % theta = theta - (X' * (X * theta - y) * alpha / m);

    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCost(X, y, theta);

end

end
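For reference, a minimal Octave usage sketch. The toy dataset below is made up for illustration, and computeCost from the same exercise is assumed to be on the path; the learning rate and iteration count are arbitrary choices, not values prescribed by the assignment.

% Usage sketch (hypothetical toy data; assumes computeCost.m is on the path).
X = [ones(5, 1), (1:5)'];       % design matrix with an intercept column
y = [2; 4; 6; 8; 10];           % targets generated by y = 2 * x
theta = zeros(2, 1);            % initial parameters
alpha = 0.05;                   % learning rate (assumed, not from the exercise)
num_iters = 1500;               % number of gradient steps (assumed)

[theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters);
fprintf('Learned theta: [%f, %f]\n', theta(1), theta(2));
fprintf('Final cost: %f\n', J_history(end));

On this noiseless data the run should converge toward theta approximately [0; 2], and J_history should decrease monotonically; if it grows instead, the usual fix is a smaller alpha.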