ex2.m @ 0:5664e0047b3e: Initial commit

author: Jordi Gutiérrez Hermoso <jordigh@octave.org>
date:   Sat, 29 Oct 2011 20:37:50 -0500
%% Machine Learning Online Class - Exercise 2: Logistic Regression
%
%  Instructions
%  ------------
%
%  This file contains code that helps you get started on the logistic
%  regression exercise. You will need to complete the following functions
%  in this exercise:
%
%     sigmoid.m
%     costFunction.m
%     predict.m
%     costFunctionReg.m
%
%  For this exercise, you will not need to change any code in this file,
%  or in any file other than those mentioned above.
%
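The first function listed above, sigmoid.m, computes the logistic function g(z) = 1 / (1 + e^(-z)). A minimal sketch of what a completed version might look like (the actual submitted file may differ):

```matlab
% sigmoid.m -- logistic function, applied element-wise so it accepts
% scalars, vectors, and matrices alike.
function g = sigmoid(z)
  g = 1 ./ (1 + exp(-z));
end
```

The element-wise `./` and `exp` are what let the rest of the script pass whole matrices such as `X * theta` through in one call.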
%% Initialization
clear; close all; clc

%% Load Data
%  The first two columns contain the exam scores and the third column
%  contains the label.

data = load('ex2data1.txt');
X = data(:, [1, 2]); y = data(:, 3);
%% ==================== Part 1: Plotting ====================
%  We start the exercise by first plotting the data to understand the
%  problem we are working with.

fprintf(['Plotting data with + indicating (y = 1) examples and o ' ...
         'indicating (y = 0) examples.\n']);

plotData(X, y);

% Labels and legend
hold on;
xlabel('Exam 1 score')
ylabel('Exam 2 score')

% Specified in plot order
legend('Admitted', 'Not admitted')
hold off;

fprintf('\nProgram paused. Press enter to continue.\n');
pause;
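plotData.m is a helper supplied with the exercise; a sketch consistent with the '+'/'o' convention announced above could look like this (marker styling is an assumption, not taken from the course files):

```matlab
% plotData.m -- scatter-plot a two-feature dataset, '+' for positive
% examples (y = 1) and 'o' for negative examples (y = 0).
function plotData(X, y)
  figure; hold on;
  pos = find(y == 1);   % row indices of admitted students
  neg = find(y == 0);   % row indices of not-admitted students
  plot(X(pos, 1), X(pos, 2), 'k+', 'LineWidth', 2, 'MarkerSize', 7);
  plot(X(neg, 1), X(neg, 2), 'ko', 'MarkerFaceColor', 'y', 'MarkerSize', 7);
  hold off;
end
```

Because the positives are plotted first, the `legend('Admitted', 'Not admitted')` call in the script labels the series in the right order.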

%% ============ Part 2: Compute Cost and Gradient ============
%  In this part of the exercise, you will implement the cost and gradient
%  for logistic regression. You need to complete the code in
%  costFunction.m

% Set up the data matrix: add a column of ones for the intercept term
[m, n] = size(X);
X = [ones(m, 1) X];

% Initialize fitting parameters
initial_theta = zeros(n + 1, 1);

% Compute and display initial cost and gradient
[cost, grad] = costFunction(initial_theta, X, y);

fprintf('Cost at initial theta (zeros): %f\n', cost);
fprintf('Gradient at initial theta (zeros): \n');
fprintf(' %f \n', grad);

fprintf('\nProgram paused. Press enter to continue.\n');
pause;
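One way to complete costFunction.m is the standard vectorized cross-entropy cost and gradient for logistic regression; this is a sketch of that textbook formulation, not necessarily the solution that was submitted:

```matlab
% costFunction.m -- cost J(theta) = (1/m) * sum(-y.*log(h) - (1-y).*log(1-h))
% and its gradient (1/m) * X' * (h - y), where h = sigmoid(X * theta).
function [J, grad] = costFunction(theta, X, y)
  m = length(y);                        % number of training examples
  h = 1 ./ (1 + exp(-(X * theta)));     % hypothesis (sigmoid inlined)
  J = (1 / m) * sum(-y .* log(h) - (1 - y) .* log(1 - h));
  grad = (1 / m) * (X' * (h - y));
end
```

Returning the gradient as a second output is what allows Part 3 to hand this function directly to fminunc with 'GradObj' set to 'on'.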

%% ============= Part 3: Optimizing using fminunc =============
%  In this part, you will use a built-in function (fminunc) to find the
%  optimal parameters theta.

% Set options for fminunc: use the supplied gradient and cap the iterations
options = optimset('GradObj', 'on', 'MaxIter', 400);

% Run fminunc to obtain the optimal theta and the cost at that theta
[theta, cost] = ...
    fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);

fprintf('Cost at theta found by fminunc: %f\n', cost);
fprintf('theta: \n');
fprintf(' %f \n', theta);

% Plot the decision boundary
plotDecisionBoundary(theta, X, y);

% Labels and legend
hold on;
xlabel('Exam 1 score')
ylabel('Exam 2 score')

% Specified in plot order
legend('Admitted', 'Not admitted')
hold off;

fprintf('\nProgram paused. Press enter to continue.\n');
pause;
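For this two-feature problem, the boundary where sigmoid(X * theta) = 0.5 is the straight line theta(1) + theta(2)*x1 + theta(3)*x2 = 0, so plotDecisionBoundary only needs two endpoints. A sketch of that core computation (ignoring the polynomial-feature case the exercise's real helper also handles; here `X` already carries the intercept column, so `X(:, 2)` is the exam 1 score):

```matlab
% Overlay the linear decision boundary on an existing scatter plot by
% solving theta(1) + theta(2)*x1 + theta(3)*x2 = 0 for x2 at two x1 values.
plot_x = [min(X(:, 2)) - 2, max(X(:, 2)) + 2];          % two exam-1 endpoints
plot_y = (-1 / theta(3)) * (theta(2) * plot_x + theta(1));
plot(plot_x, plot_y, 'b-');
```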

%% ============== Part 4: Predict and Accuracies ==============
%  After learning the parameters, you'd like to use the model to predict
%  outcomes on unseen data. In this part, you will use the logistic
%  regression model to predict the probability that a student with a score
%  of 45 on exam 1 and 85 on exam 2 will be admitted.
%
%  Furthermore, you will compute the accuracy of the model on the
%  training set.
%
%  Your task is to complete the code in predict.m

% Predict admission probability for a student with scores 45 and 85
prob = sigmoid([1 45 85] * theta);
fprintf(['For a student with scores 45 and 85, we predict an admission ' ...
         'probability of %f\n\n'], prob);

% Compute accuracy on the training set
p = predict(theta, X);

fprintf('Train Accuracy: %f\n', mean(double(p == y)) * 100);

fprintf('\nProgram paused. Press enter to continue.\n');
pause;
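predict.m, the last function this script exercises, thresholds the hypothesis at 0.5. A minimal sketch, assuming sigmoid.m is on the path:

```matlab
% predict.m -- predict label 1 when the model's probability is >= 0.5.
% Since sigmoid is monotonic, this is equivalent to X * theta >= 0.
function p = predict(theta, X)
  p = sigmoid(X * theta) >= 0.5;   % logical column vector of 0/1 labels
end
```

The comparison yields a logical vector, which the script's `double(p == y)` accuracy computation handles without further conversion.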