
ex2.m 3.7 kB

%% Machine Learning Online Class - Exercise 2: Logistic Regression
%
%  Instructions
%  ------------
%
%  This file contains code that helps you get started on the logistic
%  regression exercise. You will need to complete the following functions
%  in this exercise:
%
%     sigmoid.m
%     costFunction.m
%     predict.m
%     costFunctionReg.m
%
%  For this exercise, you will not need to change any code in this file,
%  or any other files other than those mentioned above.
%

%% Initialization
clear; close all; clc

%% Load Data
%  The first two columns contain the exam scores and the third column
%  contains the label.
data = load('ex2data1.txt');
X = data(:, [1, 2]); y = data(:, 3);

%% ==================== Part 1: Plotting ====================
%  We start the exercise by first plotting the data to understand
%  the problem we are working with.
fprintf(['Plotting data with + indicating (y = 1) examples and o ' ...
         'indicating (y = 0) examples.\n']);
plotData(X, y);

% Put some labels
hold on;
% Labels and Legend
xlabel('Exam 1 score')
ylabel('Exam 2 score')
% Specified in plot order
legend('Admitted', 'Not admitted')
hold off;

fprintf('\nProgram paused. Press enter to continue.\n');
pause;
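`plotData.m` is one of the files the course leaves to the student. As a non-authoritative reference, the positive/negative scatter it describes (+ for y = 1, o for y = 0) can be sketched in Python with NumPy and Matplotlib; the function name and the tiny synthetic data below are illustrative, not from the course files:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so the sketch runs without a display
import matplotlib.pyplot as plt

def plot_data(X, y):
    """Plot '+' markers for positive (y == 1) and 'o' for negative (y == 0) examples."""
    pos = y == 1
    neg = y == 0
    plt.plot(X[pos, 0], X[pos, 1], 'k+', label='Admitted')
    plt.plot(X[neg, 0], X[neg, 1], 'yo', label='Not admitted')
    plt.xlabel('Exam 1 score')
    plt.ylabel('Exam 2 score')
    plt.legend()

# Tiny synthetic example; in the exercise the data comes from ex2data1.txt.
X = np.array([[34.6, 78.0], [60.2, 86.3], [79.0, 75.3]])
y = np.array([0, 1, 1])
plot_data(X, y)
```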
%% ============ Part 2: Compute Cost and Gradient ============
%  In this part of the exercise, you will implement the cost and gradient
%  for logistic regression. You need to complete the code in
%  costFunction.m

% Setup the data matrix appropriately, and add ones for the intercept term
[m, n] = size(X);

% Add intercept term to X
X = [ones(m, 1) X];

% Initialize fitting parameters
initial_theta = zeros(n + 1, 1);

% Compute and display initial cost and gradient
[cost, grad] = costFunction(initial_theta, X, y);

fprintf('Cost at initial theta (zeros): %f\n', cost);
fprintf('Gradient at initial theta (zeros): \n');
fprintf(' %f \n', grad);

fprintf('\nProgram paused. Press enter to continue.\n');
pause;
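`sigmoid.m` and `costFunction.m` are also left to the student. As a rough reference, the vectorized math they are meant to implement — h = sigmoid(X*theta), J = -(1/m)(y' log h + (1-y)' log(1-h)), grad = (1/m) X'(h - y) — looks like this in NumPy (function names and the toy data are my own, not from the course files):

```python
import numpy as np

def sigmoid(z):
    """Logistic function, applied element-wise."""
    return 1.0 / (1.0 + np.exp(-z))

def cost_function(theta, X, y):
    """Cross-entropy cost and gradient for logistic regression.

    X is m x (n+1) with a leading column of ones; theta has shape (n+1,).
    """
    m = y.size
    h = sigmoid(X @ theta)
    J = -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
    grad = X.T @ (h - y) / m
    return J, grad

# At theta = 0, h = 0.5 for every example, so the cost is -log(0.5) = 0.693...
X = np.array([[1.0, 34.6, 78.0], [1.0, 60.2, 86.3], [1.0, 79.0, 75.3]])
y = np.array([0.0, 1.0, 1.0])
J, grad = cost_function(np.zeros(3), X, y)
print(round(J, 3))  # 0.693
```

This matches the sanity check in the script above: at the all-zeros initial theta the predicted probability is 0.5 everywhere, so the cost is exactly -log(0.5) regardless of the data.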
%% ============= Part 3: Optimizing using fminunc =============
%  In this exercise, you will use a built-in function (fminunc) to find the
%  optimal parameters theta.

% Set options for fminunc
options = optimset('GradObj', 'on', 'MaxIter', 400);

% Run fminunc to obtain the optimal theta
% This function will return theta and the cost
[theta, cost] = ...
    fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);

% Print theta to screen
fprintf('Cost at theta found by fminunc: %f\n', cost);
fprintf('theta: \n');
fprintf(' %f \n', theta);

% Plot Boundary
plotDecisionBoundary(theta, X, y);

% Put some labels
hold on;
% Labels and Legend
xlabel('Exam 1 score')
ylabel('Exam 2 score')
% Specified in plot order
legend('Admitted', 'Not admitted')
hold off;

fprintf('\nProgram paused. Press enter to continue.\n');
pause;
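`fminunc` with `'GradObj', 'on'` performs gradient-based unconstrained minimization using the analytic gradient returned by `costFunction`. A rough Python analogue, assuming SciPy is available and substituting BFGS for fminunc's internal algorithm (the helper names and toy data are mine, not the course's):

```python
import numpy as np
from scipy.optimize import minimize

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_and_grad(theta, X, y):
    """Return (cost, gradient), the analogue of costFunction.m's two outputs."""
    m = y.size
    h = sigmoid(X @ theta)
    J = -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
    return J, X.T @ (h - y) / m

# Non-separable toy data; in the exercise X comes from ex2data1.txt
# with the ones column already prepended.
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 4.0]])
y = np.array([0.0, 1.0, 0.0, 1.0])

# jac=True tells SciPy the objective returns (cost, gradient), mirroring
# optimset('GradObj', 'on'); MaxIter 400 maps to options={'maxiter': 400}.
res = minimize(cost_and_grad, np.zeros(2), args=(X, y),
               jac=True, method='BFGS', options={'maxiter': 400})
theta = res.x
```

As with fminunc, only the cost/gradient function is problem-specific; the optimizer itself is a library routine.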
%% ============== Part 4: Predict and Accuracies ==============
%  After learning the parameters, you'll likely want to use them to predict
%  the outcomes on unseen data. In this part, you will use the logistic
%  regression model to predict the probability that a student with a score
%  of 45 on exam 1 and a score of 85 on exam 2 will be admitted.
%
%  Furthermore, you will compute the training and test set accuracies of
%  our model.
%
%  Your task is to complete the code in predict.m

%  Predict probability for a student with score 45 on exam 1
%  and score 85 on exam 2
prob = sigmoid([1 45 85] * theta);
fprintf(['For a student with scores 45 and 85, we predict an admission ' ...
         'probability of %f\n\n'], prob);

% Compute accuracy on our training set
p = predict(theta, X);
fprintf('Train Accuracy: %f\n', mean(double(p == y)) * 100);

fprintf('\nProgram paused. Press enter to continue.\n');
pause;
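`predict.m` is the last student-completed file used here: it thresholds the sigmoid output at 0.5 to produce 0/1 labels, and accuracy is the mean agreement with the true labels. A minimal Python sketch of that logic (the theta values below are made up for illustration, not the fminunc result):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(theta, X):
    """Label 1 when the predicted probability is at least 0.5, else 0."""
    return (sigmoid(X @ theta) >= 0.5).astype(int)

theta = np.array([-6.0, 0.1, 0.05])   # illustrative parameters only
X = np.array([[1.0, 45.0, 85.0],      # scores 45 and 85, as in the script above
              [1.0, 10.0, 20.0]])
y = np.array([1, 0])

p = predict(theta, X)
accuracy = np.mean(p == y) * 100
print(accuracy)  # 100.0
```

The accuracy line mirrors `mean(double(p == y)) * 100` in the script: a boolean comparison averaged over the examples, scaled to a percentage.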

Machine Learning
