changeset 2:55430128adcd
Implement sigmoidGradient.
author    Jordi Gutiérrez Hermoso <jordigh@octave.org>
date      Fri, 11 Nov 2011 14:14:05 -0500
parents   42b6020b2fdb
children  8e8089d5a55b
files     sigmoidGradient.m
diffstat  1 files changed, 9 insertions(+), 30 deletions(-)
line diff
--- a/sigmoidGradient.m
+++ b/sigmoidGradient.m
@@ -1,33 +1,12 @@
 function g = sigmoidGradient(z)
-%SIGMOIDGRADIENT returns the gradient of the sigmoid function
-%evaluated at z
-%   g = SIGMOIDGRADIENT(z) computes the gradient of the sigmoid function
-%   evaluated at z. This should work regardless if z is a matrix or a
-%   vector. In particular, if z is a vector or matrix, you should return
-%   the gradient for each element.
-
-g = zeros(size(z));
-
-% ====================== YOUR CODE HERE ======================
-% Instructions: Compute the gradient of the sigmoid function evaluated at
-%               each value of z (z can be a matrix, vector or scalar).
-
-
-
+##SIGMOIDGRADIENT returns the gradient of the sigmoid function
+##evaluated at z
+##   g = SIGMOIDGRADIENT(z) computes the gradient of the sigmoid function
+##   evaluated at z. This should work regardless if z is a matrix or a
+##   vector. In particular, if z is a vector or matrix, you should return
+##   the gradient for each element.
 
-
-
-
-
-
-
+  s = @(z) 1./(1 + exp(-z));
+  g = s(z).*(1 - s(z));
 
-
-
-% =============================================================
-
-
-
-
-end
+endfunction
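
For reference, a minimal sanity check of the new code (an illustration, not part of the changeset; it assumes sigmoidGradient.m is on the Octave path): the sigmoid derivative s(z).*(1 - s(z)) peaks at 0.25 at z = 0, and a central finite difference of the sigmoid should agree with it elementwise.

  z = [-2 -1 0 1 2];
  s = @(z) 1./(1 + exp(-z));              # same sigmoid as in the changeset
  g = sigmoidGradient (z);
  h = 1e-6;
  fd = (s (z + h) - s (z - h)) / (2*h);   # central-difference estimate
  disp (g(3))                             # 0.25000, the maximum of the derivative
  disp (max (abs (g - fd)))               # on the order of 1e-10, i.e. they agree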