changeset 10:d5d658017ea6
Copy into *multi.m functions
author   | Jordi Gutiérrez Hermoso <jordigh@octave.org>
date     | Wed, 26 Oct 2011 07:55:25 -0700
parents  | f10c5a008bb4
children | 67de2de0e027
files    | computeCostMulti.m gradientDescentMulti.m
diffstat | 2 files changed, 18 insertions(+), 51 deletions(-)
--- a/computeCostMulti.m
+++ b/computeCostMulti.m
@@ -1,22 +1,8 @@
 function J = computeCostMulti(X, y, theta)
-%COMPUTECOSTMULTI Compute cost for linear regression with multiple variables
-%   J = COMPUTECOSTMULTI(X, y, theta) computes the cost of using theta as the
-%   parameter for linear regression to fit the data points in X and y
-
-% Initialize some useful values
-m = length(y); % number of training examples
-
-% You need to return the following variables correctly
-J = 0;
+##COMPUTECOSTMULTI Compute cost for linear regression with multiple variables
+##   J = COMPUTECOSTMULTI(X, y, theta) computes the cost of using theta as the
+##   parameter for linear regression to fit the data points in X and y
 
-% ====================== YOUR CODE HERE ======================
-% Instructions: Compute the cost of a particular choice of theta
-%               You should set J to the cost.
-
-
+  J = sum ((X*theta - y).^2)/(2*rows (y));
 
-
-% =========================================================================
-
-end
+endfunction
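The rewritten body is the standard vectorized least-squares cost, J(theta) = sum((X*theta - y).^2) / (2*m), with Octave's rows (y) standing in for the training-set size m that the deleted skeleton computed via length(y). A minimal smoke test, not part of the changeset (the toy X and y below are made up for illustration), run from the repository directory:

    X = [1 1; 1 2; 1 3];              # m = 3 examples: intercept column plus one feature
    y = [2; 3; 4];                    # exactly y = 1 + x
    computeCostMulti (X, y, [1; 1])   # perfect fit, so ans = 0
    computeCostMulti (X, y, [0; 0])   # ans = sum (y.^2)/(2*3) = 29/6, about 4.8333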
--- a/gradientDescentMulti.m
+++ b/gradientDescentMulti.m
@@ -1,37 +1,18 @@
-function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
-%GRADIENTDESCENTMULTI Performs gradient descent to learn theta
-%   theta = GRADIENTDESCENTMULTI(x, y, theta, alpha, num_iters) updates theta by
-%   taking num_iters gradient steps with learning rate alpha
-
-% Initialize some useful values
-m = length(y); % number of training examples
-J_history = zeros(num_iters, 1);
-
-for iter = 1:num_iters
-
-    % ====================== YOUR CODE HERE ======================
-    % Instructions: Perform a single gradient step on the parameter vector
-    %               theta.
-    %
-    % Hint: While debugging, it can be useful to print out the values
-    %       of the cost function (computeCostMulti) and gradient here.
-    %
+function [theta, J_history] = gradientDescentMulti(X, y,
+                                                   theta, alpha, num_iters)
+  ##GRADIENTDESCENTMULTI Performs gradient descent to learn theta
+  ##   theta = GRADIENTDESCENTMULTI(x, y, theta, alpha, num_iters)
+  ##   updates theta by taking num_iters gradient steps with learning
+  ##   rate alpha
 
-
-
+  ## Initialize some useful values
+  J_history = zeros(num_iters, 1);
 
-
-
-
+  for iter = 1:num_iters
 
-
-    % ============================================================
+    J_history(iter) = computeCost(X, y, theta);
+    theta -= alpha*X'*(X*theta - y)/rows (y);
 
-    % Save the cost J in every iteration
-    J_history(iter) = computeCostMulti(X, y, theta);
+  endfor
 
-end
-
-end
+endfunction
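The new loop body is the batch gradient step theta := theta - (alpha/m) * X' * (X*theta - y), again using rows (y) for m, and it records the cost at the start of each iteration rather than after the step. Note that the loop calls computeCost rather than computeCostMulti; this assumes the vectorized computeCost from the single-variable part of the same homework is on the Octave path, in which case the two are interchangeable. A usage sketch, with toy data and learning rate that are illustrative assumptions, not from the changeset:

    X = [1 1; 1 2; 1 3];
    y = [2; 3; 4];
    [theta, J_history] = gradientDescentMulti (X, y, zeros (2, 1), 0.1, 1500);
    theta              # converges toward [1; 1]
    plot (J_history)   # cost should decrease toward 0

With alpha = 0.1 and m = 3 the step is well inside the 2*m/lambda_max stability bound for this X, so the iteration converges; on real multi-feature data the features would be normalized first.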