comparison linearRegCostFunction.m @ 1:9a9f76850dc6

Implement linearRegCostFunction
author Jordi Gutiérrez Hermoso <jordigh@octave.org>
date Sun, 20 Nov 2011 23:42:47 -0500
parents 0f14514e907f
diff -r 0f14514e907f -r 9a9f76850dc6 linearRegCostFunction.m
--- a/linearRegCostFunction.m
+++ b/linearRegCostFunction.m
@@ -1,37 +1,14 @@
 function [J, grad] = linearRegCostFunction(X, y, theta, lambda)
-%LINEARREGCOSTFUNCTION Compute cost and gradient for regularized linear
-%regression with multiple variables
-%   [J, grad] = LINEARREGCOSTFUNCTION(X, y, theta, lambda) computes the
-%   cost of using theta as the parameter for linear regression to fit the
-%   data points in X and y. Returns the cost in J and the gradient in grad
+##LINEARREGCOSTFUNCTION Compute cost and gradient for regularized linear
+##regression with multiple variables
+##   [J, grad] = LINEARREGCOSTFUNCTION(X, y, theta, lambda) computes the
+##   cost of using theta as the parameter for linear regression to fit the
+##   data points in X and y. Returns the cost in J and the gradient in grad
 
-% Initialize some useful values
-m = length(y); % number of training examples
+m = length (y);
+ht = X*theta;
+J = (sumsq (ht - y) + lambda*sumsq (theta(2:end)))/(2*m);
 
-% You need to return the following variables correctly
-J = 0;
-grad = zeros(size(theta));
+grad = (X'*(ht - y) + [0; lambda*theta(2:end)])/m;
 
-% ====================== YOUR CODE HERE ======================
-% Instructions: Compute the cost and gradient of regularized linear
-%               regression for a particular choice of theta.
-%
-%               You should set J to the cost and grad to the gradient.
-%
-
-
-
-
-
-
-
-
-
-
-
-
-% =========================================================================
-
-grad = grad(:);
-
-end
+endfunction
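
For reference, the replacement body computes the usual regularized linear-regression cost and gradient, with the intercept theta(1) left unregularized. In the 1-based indexing the code uses (a reading of the new code above, not part of the changeset):

    J(\theta) = \frac{1}{2m}\left( \lVert X\theta - y \rVert^2 + \lambda \sum_{j=2}^{n} \theta_j^2 \right)
    \nabla J(\theta) = \frac{1}{m}\left( X^\top (X\theta - y) + \lambda \begin{bmatrix} 0 \\ \theta_{2:n} \end{bmatrix} \right)

A minimal smoke test, assuming GNU Octave with this file on the load path; the dataset and lambda below are made up for illustration and are not part of the changeset:

    % Gradient check: compare the analytic gradient against central
    % finite differences on a tiny made-up dataset.
    X = [ones(5,1), (1:5)'];   % five examples: intercept column + one feature
    y = [2; 4; 5; 4; 6];
    theta = [1; 0.5];
    lambda = 1;

    [J, grad] = linearRegCostFunction (X, y, theta, lambda);

    h = 1e-6;
    numgrad = zeros (size (theta));
    for j = 1:numel (theta)
      d = zeros (size (theta));
      d(j) = h;
      numgrad(j) = (linearRegCostFunction (X, y, theta + d, lambda) ...
                    - linearRegCostFunction (X, y, theta - d, lambda)) / (2*h);
    endfor
    printf ("max gradient error: %g\n", max (abs (grad - numgrad)));

The reported error should be on the order of 1e-9 or smaller if the analytic gradient matches the cost.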