linearRegCostFunction.m @ 5:eddd33e57f6a (default tip)

Justify loop in trainLinearReg
author Jordi Gutiérrez Hermoso <jordigh@octave.org>
date Sun, 27 Nov 2011 15:58:14 -0500
parents 9a9f76850dc6

function [J, grad] = linearRegCostFunction(X, y, theta, lambda)
  ## LINEARREGCOSTFUNCTION Compute cost and gradient for regularized linear
  ## regression with multiple variables.
  ##   [J, grad] = LINEARREGCOSTFUNCTION(X, y, theta, lambda) computes the
  ##   cost of using theta as the parameter for linear regression to fit the
  ##   data points in X and y.  Returns the cost in J and the gradient in grad.

  m = length (y);   # number of training examples
  ht = X*theta;     # hypothesis: predicted value for each example

  ## Regularized cost; theta(2:end) excludes the intercept term from the penalty.
  J = (sumsq (ht - y) + lambda*sumsq (theta(2:end)))/(2*m);

  ## Gradient; the leading 0 keeps the intercept theta(1) unregularized.
  grad = (X'*(ht - y) + [0; lambda*theta(2:end)])/m;

endfunction
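
In mathematical notation, the function computes the standard regularized linear-regression cost and its gradient, with hypothesis $h_\theta(x) = \theta^{\mathsf T} x$ and the intercept $\theta_0$ left out of the penalty (this corresponds to `theta(2:end)` and the leading `0` in the code; indexing here is 0-based to match the usual convention, while Octave's is 1-based):

```latex
J(\theta) = \frac{1}{2m}\left[\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)^2
          + \lambda \sum_{j=1}^{n} \theta_j^2\right]

\frac{\partial J}{\partial \theta_0} =
  \frac{1}{m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)\,x_0^{(i)}

\frac{\partial J}{\partial \theta_j} =
  \frac{1}{m}\left[\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)\,x_j^{(i)}
  + \lambda\,\theta_j\right], \qquad j \ge 1
```

The vectorized line `grad = (X'*(ht - y) + [0; lambda*theta(2:end)])/m` computes all of these partial derivatives at once: `X'*(ht - y)` is the sum over examples for every $\theta_j$ simultaneously, and the prepended zero applies the $\lambda\theta_j$ term only for $j \ge 1$.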