changeset 4:4fb05328d3cf

Implement costFunctionReg
author Jordi Gutiérrez Hermoso <jordigh@octave.org>
date Sat, 29 Oct 2011 22:29:53 -0500
parents 0c89cf3fe327
children a4c4da8f4ac0
files costFunctionReg.m
diffstat 1 files changed, 13 insertions(+), 21 deletions(-)
--- a/costFunctionReg.m
+++ b/costFunctionReg.m
@@ -1,27 +1,19 @@
 function [J, grad] = costFunctionReg(theta, X, y, lambda)
-%COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
-%   J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
-%   theta as the parameter for regularized logistic regression and the
-%   gradient of the cost w.r.t. to the parameters. 
+  ##COSTFUNCTIONREG Compute cost and gradient for logistic regression
+  ##with regularization
+  ##   J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
+  ##   theta as the parameter for regularized logistic regression and the
+  ##   gradient of the cost w.r.t. the parameters.
 
-% Initialize some useful values
-m = length(y); % number of training examples
-
-% You need to return the following variables correctly 
-J = 0;
-grad = zeros(size(theta));
+  ## Initialize some useful values
+  m = length(y); ## number of training examples
 
-% ====================== YOUR CODE HERE ======================
-% Instructions: Compute the cost of a particular choice of theta.
-%               You should set J to the cost.
-%               Compute the partial derivatives and set grad to the partial
-%               derivatives of the cost w.r.t. each parameter in theta
-
+  ## h_theta(x), the predicted probability for each training example
+  ht = sigmoid (X*theta);
 
-
-
-
+  J = -sum (y.*log (ht) + (1 - y).*log (1 - ht))/m \
+      + lambda*sum (theta(2:end).^2)/(2*m);
 
-% =============================================================
+  grad = (X'*(ht - y) + [0; lambda*theta(2:end)])/m;
 
-end
+endfunction
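
For a quick smoke test of the new implementation, something along these lines should work. This is a minimal sketch: it assumes the exercise's sigmoid.m, which computes 1 ./ (1 + exp (-z)) elementwise, is on the load path, and the toy data below is made up purely for illustration.

  X = [ones(3, 1), [1 2; 2 3; 3 4]];  ## 3 examples, intercept column prepended
  y = [0; 1; 1];                      ## binary labels
  theta = zeros (3, 1);               ## initial parameters
  lambda = 1;                         ## regularization strength
  [J, grad] = costFunctionReg (theta, X, y, lambda);
  ## At theta = 0 every prediction is 0.5 and the regularization term
  ## vanishes, so J should come out to -log (0.5), about 0.6931.

Note that both the cost and the gradient skip theta(1): the sum in J runs over theta(2:end), and the [0; lambda*theta(2:end)] term leaves the intercept unregularized, as the exercise specifies.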
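The vectorized gradient on the last line can also be validated numerically with a centered finite difference, since costFunctionReg returns J as its first output. Again a sketch, under the same assumptions as above:

  h = 1e-4;
  num_grad = zeros (size (theta));
  for j = 1:numel (theta)
    e = zeros (size (theta));
    e(j) = h;
    Jp = costFunctionReg (theta + e, X, y, lambda);
    Jm = costFunctionReg (theta - e, X, y, lambda);
    num_grad(j) = (Jp - Jm) / (2*h);
  endfor
  ## num_grad and grad should agree to within roughly 1e-9.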