Machine Learning week 3 quiz: programming assignment-Logistic Regression
This post walks through the Logistic Regression programming assignment from Machine Learning week 3, shared here for reference.
一、ex2.m: the main script, which calls the other function files
```matlab
%% Machine Learning Online Class - Exercise 2: Logistic Regression
%
%  Instructions
%  ------------
%
%  This file contains code that helps you get started on the logistic
%  regression exercise. You will need to complete the following functions
%  in this exercise:
%
%     sigmoid.m
%     costFunction.m
%     predict.m
%     costFunctionReg.m
%
%  For this exercise, you will not need to change any code in this file,
%  or any other files other than those mentioned above.
%

%% Initialization
clear; close all; clc

%% Load Data
%  The first two columns contain the exam scores and the third column
%  contains the label.

data = load('ex2data1.txt');
X = data(:, [1, 2]);
y = data(:, 3);

%% ==================== Part 1: Plotting ====================
%  We start the exercise by first plotting the data to understand
%  the problem we are working with.

fprintf(['Plotting data with + indicating (y = 1) examples and o ' ...
         'indicating (y = 0) examples.\n']);

plotData(X, y);

% Labels and legend
hold on;
xlabel('Exam 1 score')
ylabel('Exam 2 score')
legend('Admitted', 'Not admitted')   % specified in plot order
hold off;

fprintf('\nProgram paused. Press enter to continue.\n');
pause;

%% ============ Part 2: Compute Cost and Gradient ============
%  In this part of the exercise, you will implement the cost and gradient
%  for logistic regression. You need to complete the code in
%  costFunction.m

%  Set up the data matrix appropriately, and add ones for the intercept term
[m, n] = size(X);

% Add intercept term to X
X = [ones(m, 1) X];                  % m*(n+1)

% Initialize fitting parameters
initial_theta = zeros(n + 1, 1);     % (n+1)*1

% Compute and display initial cost and gradient
[cost, grad] = costFunction(initial_theta, X, y);

fprintf('Cost at initial theta (zeros): %f\n', cost);
fprintf('Gradient at initial theta (zeros): \n');
fprintf(' %f \n', grad);

fprintf('\nProgram paused. Press enter to continue.\n');
pause;

%% ============= Part 3: Optimizing using fminunc =============
%  In this exercise, you will use a built-in function (fminunc) to find the
%  optimal parameters theta.

% Set options for fminunc
options = optimset('GradObj', 'on', 'MaxIter', 400);

% Run fminunc to obtain the optimal theta
% This function will return theta and the cost
[theta, cost] = ...
    fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);

% Print theta to screen
fprintf('Cost at theta found by fminunc: %f\n', cost);
fprintf('theta: \n');
fprintf(' %f \n', theta);

% Plot the decision boundary
plotDecisionBoundary(theta, X, y);

% Labels and legend
hold on;
xlabel('Exam 1 score')
ylabel('Exam 2 score')
legend('Admitted', 'Not admitted')   % specified in plot order
hold off;

fprintf('\nProgram paused. Press enter to continue.\n');
pause;

%% ============== Part 4: Predict and Accuracies ==============
%  After learning the parameters, you'll likely want to use the model to
%  predict outcomes on unseen data. In this part, you will use the logistic
%  regression model to predict the probability that a student with score 45
%  on exam 1 and score 85 on exam 2 will be admitted.
%
%  Furthermore, you will compute the training and test set accuracies of
%  our model.
%
%  Your task is to complete the code in predict.m

% Predict probability for a student with score 45 on exam 1
% and score 85 on exam 2
prob = sigmoid([1 45 85] * theta);
fprintf(['For a student with scores 45 and 85, we predict an admission ' ...
         'probability of %f\n\n'], prob);

% Compute accuracy on our training set
p = predict(theta, X);

fprintf('Train Accuracy: %f\n', mean(double(p == y)) * 100);

fprintf('\nProgram paused. Press enter to continue.\n');
pause;
```
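ex2.m also calls sigmoid.m, one of the files the exercise asks you to complete, but it is not reproduced in the original post. Here is a minimal sketch of the standard vectorized implementation (my own fill-in, not taken from the post):

```matlab
function g = sigmoid(z)
%SIGMOID Compute sigmoid function
%   g = SIGMOID(z) computes the sigmoid of z. z can be a matrix,
%   vector, or scalar; the operation is applied element-wise.

g = 1 ./ (1 + exp(-z));

end
```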
二、costFunction.m
```matlab
function [J, grad] = costFunction(theta, X, y)
%COSTFUNCTION Compute cost and gradient for logistic regression
%   J = COSTFUNCTION(theta, X, y) computes the cost of using theta as the
%   parameter for logistic regression and the gradient of the cost
%   w.r.t. the parameters.

% Initialize some useful values
m = length(y);               % number of training examples

% You need to return the following variables correctly
J = 0;                       % 1*1
grad = zeros(size(theta));   % (n+1)*1

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta
%
% Note: grad should have the same dimensions as theta
%

h = sigmoid(X*theta);          % m*1
part1 = y.*(log(h));           % m*1
part2 = (1-y).*(log(1-h));     % m*1

J = sum(-part1 - part2) / m;   % 1*1

diff = h - y;                  % m*1
temp = X' * diff;              % (n+1)*m x m*1 -> (n+1)*1
temp = temp / m;               % (n+1)*1

grad = temp;

% =============================================================

end
```
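A quick way to sanity-check costFunction is to evaluate it at theta = 0: since sigmoid(0) = 0.5, the cost is -log(0.5) = log(2) ≈ 0.693 regardless of the data. A minimal check on a made-up toy dataset (the variables below are illustrative, not part of the assignment):

```matlab
% Toy data: 4 examples, 1 feature plus an intercept column (illustrative only)
X_toy = [ones(4,1), [1; 2; 3; 4]];
y_toy = [0; 0; 1; 1];

J0 = costFunction(zeros(2,1), X_toy, y_toy);
fprintf('Cost at theta = zeros: %f (expected log(2) = %f)\n', J0, log(2));
```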
三、predict.m
```matlab
function p = predict(theta, X)
%PREDICT Predict whether the label is 0 or 1 using learned logistic
%regression parameters theta
%   p = PREDICT(theta, X) computes the predictions for X using a
%   threshold at 0.5 (i.e., if sigmoid(theta'*x) >= 0.5, predict 1)

m = size(X, 1);    % Number of training examples

% You need to return the following variables correctly
p = zeros(m, 1);   % m*1

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the following code to make predictions using
%               your learned logistic regression parameters.
%               You should set p to a vector of 0's and 1's
%

h = X * theta;     % m*(n+1) x (n+1)*1 -> m*1
g = sigmoid(h);    % m*1
p = g >= 0.5;

% =========================================================================

end
```
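Once theta has been learned by ex2.m, predict can be exercised directly. A small usage sketch, assuming the trained theta is still in the workspace (the second score pair below is made up for illustration); note that predict returns a logical vector, which ex2.m converts with double before averaging:

```matlab
% Predict admission (0/1); the intercept column of ones must be prepended,
% just as ex2.m does for the training matrix
X_new = [1 45 85;    % the example student from ex2.m
         1 20 30];   % a second, hypothetical score pair
p_new = predict(theta, X_new);
fprintf(' %d\n', p_new);
```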
四、costFunctionReg.m

```matlab
function [J, grad] = costFunctionReg(theta, X, y, lambda)
%COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
%   J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
%   theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. the parameters.

% Initialize some useful values
m = length(y);               % number of training examples

% You need to return the following variables correctly
J = 0;                       % 1*1
grad = zeros(size(theta));   % (n+1)*1

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta

% Reuse the unregularized cost and gradient, then add the penalty terms
[J_ori, grad_ori] = costFunction(theta, X, y);

sz_theta = size(theta, 1);
theta_temp = theta(2:sz_theta);   % theta(1), the intercept, is not regularized

% --- cost penalty: (lambda / (2*m)) * sum(theta_j^2) for j >= 2
punish_J = sum(theta_temp.^2)*lambda/2/m;
J = J_ori + punish_J;

% --- gradient penalty: (lambda / m) * theta_j for j >= 2, zero for the intercept
punish_theta = theta_temp*lambda/m;
punish_theta = [0; punish_theta];
grad = grad_ori + punish_theta;

% =============================================================

end
```
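Two properties make costFunctionReg easy to sanity-check: with lambda = 0 it must agree exactly with costFunction, and grad(1) must be identical for any lambda, since the intercept theta(1) is never penalized. A minimal sketch (the toy variables here are illustrative only):

```matlab
% Illustrative toy problem: 3 examples, 2 features plus intercept
theta_t = [1; 0.5; -0.5];
X_t = [1 2 3; 1 4 5; 1 6 7];
y_t = [1; 0; 1];

% With lambda = 0, costFunctionReg should match costFunction exactly
[J_unreg, g_unreg] = costFunction(theta_t, X_t, y_t);
J_zero = costFunctionReg(theta_t, X_t, y_t, 0);
fprintf('Cost difference at lambda = 0: %g\n', abs(J_unreg - J_zero));

% For any lambda, grad(1) is unchanged because theta(1) is not penalized
[~, g_reg] = costFunctionReg(theta_t, X_t, y_t, 10);
fprintf('grad(1): %f (lambda = 0) vs %f (lambda = 10)\n', g_unreg(1), g_reg(1));
```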
五、submit results
Summary

The above is the complete content of Machine Learning week 3 quiz: programming assignment-Logistic Regression; I hope it helps you work through the assignment.