changeset 1:9a9f76850dc6

Implement linearRegCostFunction
author Jordi Gutiérrez Hermoso <jordigh@octave.org>
date Sun, 20 Nov 2011 23:42:47 -0500
parents 0f14514e907f
children 882ffde0ce47
files linearRegCostFunction.m
diffstat 1 files changed, 10 insertions(+), 33 deletions(-)
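For context, the new code computes the standard regularized linear-regression cost and gradient, with the bias parameter theta(1) left unpenalized. A sketch of the math, writing h_theta(x) = theta^T x:

    J(\theta) = \frac{1}{2m}\left[\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)^2 + \lambda\sum_{j=2}^{n}\theta_j^2\right]

    \frac{\partial J}{\partial\theta_1} = \frac{1}{m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)x_1^{(i)},
    \qquad
    \frac{\partial J}{\partial\theta_j} = \frac{1}{m}\left[\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)x_j^{(i)} + \lambda\theta_j\right] \quad (j \ge 2)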
--- a/linearRegCostFunction.m
+++ b/linearRegCostFunction.m
@@ -1,37 +1,14 @@
 function [J, grad] = linearRegCostFunction(X, y, theta, lambda)
-%LINEARREGCOSTFUNCTION Compute cost and gradient for regularized linear 
-%regression with multiple variables
-%   [J, grad] = LINEARREGCOSTFUNCTION(X, y, theta, lambda) computes the 
-%   cost of using theta as the parameter for linear regression to fit the 
-%   data points in X and y. Returns the cost in J and the gradient in grad
-
-% Initialize some useful values
-m = length(y); % number of training examples
-
-% You need to return the following variables correctly 
-J = 0;
-grad = zeros(size(theta));
+  ##LINEARREGCOSTFUNCTION Compute cost and gradient for regularized linear 
+  ##regression with multiple variables
+  ##   [J, grad] = LINEARREGCOSTFUNCTION(X, y, theta, lambda) computes the 
+  ##   cost of using theta as the parameter for linear regression to fit the 
+  ##   data points in X and y. Returns the cost in J and the gradient in grad
 
-% ====================== YOUR CODE HERE ======================
-% Instructions: Compute the cost and gradient of regularized linear 
-%               regression for a particular choice of theta.
-%
-%               You should set J to the cost and grad to the gradient.
-%
-
-
-
-
-
+  m = length (y);   ## number of training examples
+  ht = X*theta;     ## hypothesis values h_theta(x) for every example
+  J = (sumsq (ht - y) + lambda*sumsq (theta(2:end)))/(2*m);  ## regularized cost; theta(1) is not penalized
 
-
-
-
-
-
+  grad = (X'*(ht - y) + [0; lambda*theta(2:end)])/m;  ## gradient; leading zero keeps the bias term unregularized
 
-% =========================================================================
-
-grad = grad(:);
-
-end
+endfunction
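
A quick way to sanity-check the new gradient code is to compare grad against a centered finite-difference approximation of J. The snippet below is a minimal sketch in Octave; the toy X, y, theta, and lambda values are invented for illustration only:

    ## Toy problem: 3 examples, a bias column plus one feature (values invented).
    X = [1 1; 1 2; 1 3];
    y = [2; 3; 4];
    theta = [0.5; 0.5];
    lambda = 1;

    [J, grad] = linearRegCostFunction (X, y, theta, lambda);

    ## Centered finite differences: perturb each theta(j) by +/- delta and
    ## compare the resulting cost slope against the analytic gradient.
    delta = 1e-4;
    numgrad = zeros (size (theta));
    for j = 1:numel (theta)
      tp = theta;  tp(j) += delta;
      tm = theta;  tm(j) -= delta;
      numgrad(j) = (linearRegCostFunction (X, y, tp, lambda) ...
                    - linearRegCostFunction (X, y, tm, lambda)) / (2*delta);
    endfor

    ## The two should agree to roughly 1e-9 if the implementation is correct.
    disp (max (abs (numgrad - grad)));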