
ex3.m 2.1 kB

%% Machine Learning Online Class - Exercise 3 | Part 1: One-vs-all
% Instructions
% ------------
%
% This file contains code that helps you get started on the
% exercise. You will need to complete the following functions
% in this exercise:
%
%    lrCostFunction.m (logistic regression cost function)
%    oneVsAll.m
%    predictOneVsAll.m
%    predict.m
%
% For this exercise, you will not need to change any code in this file,
% or any other files other than those mentioned above.
%

%% Initialization
clear; close all; clc

%% Setup the parameters you will use for this part of the exercise
input_layer_size = 400;  % 20x20 Input Images of Digits
num_labels = 10;         % 10 labels, from 1 to 10
                         % (note that we have mapped "0" to label 10)
%% =========== Part 1: Loading and Visualizing Data =============
% We start the exercise by first loading and visualizing the dataset.
% You will be working with a dataset that contains handwritten digits.
%

% Load Training Data
fprintf('Loading and Visualizing Data ...\n')

load('ex3data1.mat'); % training data stored in arrays X, y
m = size(X, 1);
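% (For reference, based on the parameters above: X is expected to be an
% m x 400 matrix, one flattened 20x20 image per row, and y an m x 1 vector
% of labels in 1..10, with the digit "0" stored as label 10.)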
% Randomly select 100 data points to display
%rand_indices = randperm(m);
%sel = X(rand_indices(1:100), :);
%displayData(sel);

%fprintf('Program paused. Press enter to continue.\n');
%pause;
%% ============ Part 2: Vectorize Logistic Regression ============
% In this part of the exercise, you will reuse your logistic regression
% code from the last exercise. Your task here is to make sure that your
% regularized logistic regression implementation is vectorized. After
% that, you will implement one-vs-all classification for the handwritten
% digit dataset.
%
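% As a reference, a minimal vectorized sketch of what lrCostFunction.m is
% expected to compute (an assumption for illustration: a sigmoid.m helper is
% on the path, X already has a bias column, y holds 0/1 labels, and the bias
% term theta(1) is not regularized):
%
%   h = sigmoid(X * theta);
%   J = (1/m) * (-y' * log(h) - (1 - y)' * log(1 - h)) ...
%       + (lambda / (2*m)) * sum(theta(2:end) .^ 2);
%   grad = (1/m) * (X' * (h - y)) + (lambda / m) * [0; theta(2:end)];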
fprintf('\nTraining One-vs-All Logistic Regression...\n')

lambda = 0.1;
[all_theta] = oneVsAll(X, y, num_labels, lambda);
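% A rough, illustrative sketch of how oneVsAll.m typically trains the
% classifiers, one binary problem per label, using the fmincg.m optimizer
% distributed with the exercise (the iteration count is an assumption; the
% real implementation lives in oneVsAll.m):
%
%   [m, n] = size(X);
%   all_theta = zeros(num_labels, n + 1);
%   Xb = [ones(m, 1) X];                  % prepend the bias column
%   options = optimset('GradObj', 'on', 'MaxIter', 50);
%   for c = 1:num_labels
%     initial_theta = zeros(n + 1, 1);
%     all_theta(c, :) = fmincg(@(t) lrCostFunction(t, Xb, (y == c), lambda), ...
%                              initial_theta, options);
%   end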
fprintf('Program paused. Press enter to continue.\n');
pause;
%% ================ Part 3: Predict for One-Vs-All ================
% After training the one-vs-all classifier, we can now use it to predict
% the label for each training example and report the training set accuracy.
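% A minimal sketch of the prediction step predictOneVsAll.m is expected to
% perform: pick, for each example, the label whose classifier gives the
% largest score (the sigmoid is monotonic, so the arg-max is the same with
% or without it). Illustrative only:
%
%   scores = [ones(size(X, 1), 1) X] * all_theta';  % m x num_labels scores
%   [~, pred] = max(scores, [], 2);                 % index of best label per row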
pred = predictOneVsAll(all_theta, X);
fprintf('\nTraining Set Accuracy: %f\n', mean(double(pred == y)) * 100);

Machine Learning
