view linearRegCostFunction.m @ 1:9a9f76850dc6

Implement linearRegCostFunction
author Jordi Gutiérrez Hermoso <jordigh@octave.org>
date Sun, 20 Nov 2011 23:42:47 -0500
parents 0f14514e907f

function [J, grad] = linearRegCostFunction(X, y, theta, lambda)
  ## LINEARREGCOSTFUNCTION Compute cost and gradient for regularized linear
  ## regression with multiple variables.
  ##   [J, grad] = LINEARREGCOSTFUNCTION(X, y, theta, lambda) computes the
  ##   cost of using theta as the parameter for linear regression to fit the
  ##   data points in X and y.  Returns the cost in J and the gradient in grad.
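  ##
  ##   The regularized cost computed below is
  ##     J = (1/(2*m)) * sum((X*theta - y).^2) + (lambda/(2*m)) * sum(theta(2:end).^2)
  ##   and its gradient is
  ##     grad = (1/m) * X'*(X*theta - y) + (lambda/m) * [0; theta(2:end)]
  ##   The intercept term theta(1) is deliberately excluded from the penalty.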

  m = length (y);   # number of training examples
  ht = X*theta;     # hypothesis h_theta(x) evaluated at every example

  ## Cost: mean squared error plus a penalty on every parameter except theta(1).
  J = (sumsq (ht - y) + lambda*sumsq (theta(2:end)))/(2*m);

  ## Gradient: the intercept entry receives no regularization term.
  grad = (X'*(ht - y) + [0; lambda*theta(2:end)])/m;

endfunction
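
A minimal usage sketch follows; the three-example toy dataset and the theta and
lambda values are made up purely for illustration and are not part of the
original changeset.  The finite-difference comparison at the end is one common
way to sanity-check the analytic gradient.

## Illustrative call (assumption: linearRegCostFunction.m is on the Octave path).
X = [ones(3, 1), (1:3)'];   # design matrix: intercept column plus one feature
y = [2; 4; 6];              # targets
theta = [1; 1];             # current parameter guess
lambda = 1;                 # regularization strength
[J, grad] = linearRegCostFunction (X, y, theta, lambda)
## For this toy data, J = 1 and grad = [-1; -2.3333].

## Sanity check: centered finite-difference approximation of the gradient;
## the largest difference from grad should be very small.
eps_fd = 1e-4;
numgrad = zeros (size (theta));
for i = 1:numel (theta)
  e = zeros (size (theta));
  e(i) = eps_fd;
  numgrad(i) = (linearRegCostFunction (X, y, theta + e, lambda) ...
                - linearRegCostFunction (X, y, theta - e, lambda)) / (2*eps_fd);
endfor
max (abs (numgrad - grad))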