changeset 2:882ffde0ce47

Implement learningCurve
author Jordi Gutiérrez Hermoso <jordigh@octave.org>
date Mon, 21 Nov 2011 00:36:59 -0500
parents 9a9f76850dc6
children 7f4c22114134
files learningCurve.m
diffstat 1 file changed, 31 insertions(+), 64 deletions(-)
--- a/learningCurve.m
+++ b/learningCurve.m
@@ -1,68 +1,35 @@
-function [error_train, error_val] = ...
-    learningCurve(X, y, Xval, yval, lambda)
-%LEARNINGCURVE Generates the train and cross validation set errors needed 
-%to plot a learning curve
-%   [error_train, error_val] = ...
-%       LEARNINGCURVE(X, y, Xval, yval, lambda) returns the train and
-%       cross validation set errors for a learning curve. In particular, 
-%       it returns two vectors of the same length - error_train and 
-%       error_val. Then, error_train(i) contains the training error for
-%       i examples (and similarly for error_val(i)).
-%
-%   In this function, you will compute the train and test errors for
-%   dataset sizes from 1 up to m. In practice, when working with larger
-%   datasets, you might want to do this in larger intervals.
-%
-
-% Number of training examples
-m = size(X, 1);
-
-% You need to return these values correctly
-error_train = zeros(m, 1);
-error_val   = zeros(m, 1);
+function [error_train, error_val] = learningCurve(X, y, Xval, yval, lambda)
+  ##LEARNINGCURVE Generates the train and cross validation set errors needed 
+  ##to plot a learning curve
+  ##   [error_train, error_val] = ...
+  ##       LEARNINGCURVE(X, y, Xval, yval, lambda) returns the train and
+  ##       cross validation set errors for a learning curve. In particular,
+  ##       it returns two vectors of the same length, error_train and
+  ##       error_val, where error_train(i) contains the training error for
+  ##       i examples (and similarly for error_val(i)).
+  ##
+  ##   This function computes the train and cross validation errors for
+  ##   dataset sizes from 1 up to m. In practice, when working with
+  ##   larger datasets, you might want to do this in larger intervals.
+  ##
 
-% ====================== YOUR CODE HERE ======================
-% Instructions: Fill in this function to return training errors in 
-%               error_train and the cross validation errors in error_val. 
-%               The vector numex_vec contains the number of training 
-%               examples to use for each calculation of training error and 
-%               cross validation error, i.e, error_train(i) and 
-%               error_val(i) should give you the errors
-%               obtained after training on i examples.
-%
-% Note: You should evaluate the training error on the first i training
-%       examples (i.e., X(1:i, :) and y(1:i)).
-%
-%       For the cross-validation error, you should instead evaluate on
-%       the _entire_ cross validation set (Xval and yval).
-%
-% Note: If you are using your cost function (linearRegCostFunction)
-%       to compute the training and cross validation error, you should 
-%       call the function with the lambda argument set to 0. 
-%       Do note that you will still need to use lambda when running
-%       the training to obtain the theta parameters.
-%
-% Hint: You can loop over the examples with the following:
-%
-%       for i = 1:m
-%           % Compute train/cross validation errors using training examples 
-%           % X(1:i, :) and y(1:i), storing the result in 
-%           % error_train(i) and error_val(i)
-%           ....
-%           
-%       end
-%
+  ## Number of training examples
+  m = rows (X);
 
-% ---------------------- Sample Solution ----------------------
-
-
-
-
-
+  ## Initialise outputs
+  error_train = zeros(m, 1);
+  error_val   = zeros(m, 1);
+
+  for i = 1:m
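+    ## Train on the first i examples only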
+    Xtrain = X(1:i, :);
+    ytrain = y(1:i);
+    theta = trainLinearReg (Xtrain, ytrain, lambda);
 
-
-% -------------------------------------------------------------
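+    ## Unregularised cost (lambda is used only for training): training
+    ## error on the first i examples, validation error on the full set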
+    error_train(i) = sumsq (Xtrain*theta - ytrain)/(2*i);
+    error_val(i) = sumsq (Xval*theta - yval)/(2*length (yval));
+  endfor
 
-% =========================================================================
-
-end
+endfunction
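
For reference, here is a minimal usage sketch for the new function, with learningCurve.m on the load path. The synthetic data, the lambda value of 1, and the trainLinearReg stand-in are all invented for illustration: the course's actual helper fits theta with an iterative optimiser, whereas this stand-in solves the regularised normal equations in closed form.

    1;  # mark this as a script file, not a function file

    ## Hypothetical stand-in for the trainLinearReg helper that
    ## learningCurve calls: closed-form regularised normal equations,
    ## leaving the intercept term unpenalised.
    function theta = trainLinearReg (X, y, lambda)
      n = columns (X);
      L = lambda * eye (n);
      L(1,1) = 0;
      theta = (X'*X + L) \ (X'*y);
    endfunction

    ## Synthetic linear data with a bias column
    m = 40;
    X    = [ones(m, 1), linspace(-2, 2, m)'];
    y    = 3 + 2*X(:,2) + 0.3*randn (m, 1);
    Xval = [ones(20, 1), linspace(-2, 2, 20)'];
    yval = 3 + 2*Xval(:,2) + 0.3*randn (20, 1);

    [error_train, error_val] = learningCurve (X, y, Xval, yval, 1);

    plot (1:m, error_train, 1:m, error_val);
    legend ("Train", "Cross validation");
    xlabel ("Number of training examples");
    ylabel ("Error");

As m grows, the two curves should approach one another; a training error that stays low while the validation error stays high is the usual sign of high variance.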