matlab (3) Logistic Regression: computing the cost, the gradient, and the sigmoid
2015-09-21 16:56
sigmoid.m
function g = sigmoid(z)
%SIGMOID Compute sigmoid function
%   g = SIGMOID(z) computes the sigmoid of z.
g = zeros(size(z)); % initialize g; z can be a scalar, a vector, or a matrix
% ====================== YOUR CODE HERE ======================
% Instructions: Compute the sigmoid of each value of z (z can be a matrix, vector or scalar)
g = 1 ./ (1 + exp(-z)); % element-wise sigmoid
g(z) takes values in (0, 1), so it can be interpreted as a probability. When z = 0, g = 0.5; when z < 0, g < 0.5; when z > 0, g > 0.5. As z → −∞, g → 0; as z → +∞, g → 1. z can be a scalar, a vector, or a matrix.
% =============================================================
end
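As a quick sanity check of the sigmoid properties listed above, here is a minimal sketch in Python/NumPy (rather than MATLAB, purely because it is easy to run standalone); the function body is the same element-wise formula:

```python
import numpy as np

def sigmoid(z):
    """Element-wise sigmoid; z may be a scalar, a vector, or a matrix."""
    return 1.0 / (1.0 + np.exp(-z))

# Properties from the notes: g(0) = 0.5, g < 0.5 for z < 0, g > 0.5 for z > 0
print(sigmoid(0))                        # 0.5
print(sigmoid(np.array([-10.0, 10.0])))  # near 0 and near 1 at the extremes
```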
costFunction.m
function [J, grad] = costFunction(theta, X, y)
%COSTFUNCTION Compute cost and gradient for logistic regression
% J = COSTFUNCTION(theta, X, y) computes the cost of using theta as the
% parameter for logistic regression and the gradient of the cost
% w.r.t. to the parameters.
% Initialize some useful values
m = length(y); % number of training examples
% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta)); % grad has the same dimensions as theta
% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
% You should set J to the cost.
% Compute the partial derivatives and set grad to the partial
% derivatives of the cost w.r.t. each parameter in theta
%
% Note: grad should have the same dimensions as theta
%
% Cost:     J(theta) = (1/m) * sum( -y .* log(h) - (1 - y) .* log(1 - h) ), where h = sigmoid(X*theta)
% Gradient: grad(j)  = (1/m) * sum( (h - y) .* X(:, j) )
J = 1/m * (-y' * log(sigmoid(X*theta)) - (ones(1,m) - y') * log(ones(m,1) - sigmoid(X*theta))); % log and exp act element-wise on matrices
% sigmoid is defined in sigmoid.m above
grad = 1/m * (X' * (sigmoid(X*theta) - y));
% =============================================================
end
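The same vectorized cost and gradient can be sketched in Python/NumPy to check the math. The tiny dataset below is hypothetical, chosen only for illustration; a useful property is that with theta all zeros the hypothesis is 0.5 everywhere, so the cost equals log(2) ≈ 0.693 regardless of the data:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_function(theta, X, y):
    """Vectorized logistic-regression cost and gradient (mirrors costFunction.m)."""
    m = len(y)
    h = sigmoid(X @ theta)
    J = (-y @ np.log(h) - (1 - y) @ np.log(1 - h)) / m
    grad = X.T @ (h - y) / m
    return J, grad

# Hypothetical data: 3 examples, intercept column already prepended
X = np.array([[1.0, 0.5], [1.0, -1.0], [1.0, 2.0]])
y = np.array([1.0, 0.0, 1.0])
J, grad = cost_function(np.zeros(2), X, y)
print(J)  # log(2), since sigmoid(0) = 0.5 for every example
```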
%% ============ Part 2: Compute Cost and Gradient ============
% In this part of the exercise, you will implement the cost and gradient
% for logistic regression. You need to complete the code in
% costFunction.m
% Setup the data matrix appropriately, and add ones for the intercept term
[m, n] = size(X); % get the dimensions of the data matrix X
% Add intercept term to x and X_test
X = [ones(m, 1) X]; % prepend a column of ones to match the intercept term
% Initialize fitting parameters
initial_theta = zeros(n + 1, 1);
% Compute and display initial cost and gradient
[cost, grad] = costFunction(initial_theta, X, y); % see costFunction.m above
fprintf('Cost at initial theta (zeros): %f\n', cost);
fprintf('Gradient at initial theta (zeros): \n');
fprintf(' %f \n', grad);
fprintf('\nProgram paused. Press enter to continue.\n');
pause;
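A standard way to validate an implementation like costFunction.m is a finite-difference gradient check: perturb each component of theta and compare the central difference of the cost to the analytic gradient. A sketch in Python/NumPy, on randomly generated hypothetical data:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_function(theta, X, y):
    m = len(y)
    h = sigmoid(X @ theta)
    J = (-y @ np.log(h) - (1 - y) @ np.log(1 - h)) / m
    grad = X.T @ (h - y) / m
    return J, grad

# Random hypothetical data: 5 examples, intercept plus 2 features
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(5), rng.normal(size=(5, 2))])
y = (rng.random(5) > 0.5).astype(float)
theta = rng.normal(size=3)

_, grad = cost_function(theta, X, y)
eps = 1e-6
num = np.array([
    (cost_function(theta + eps * e, X, y)[0] -
     cost_function(theta - eps * e, X, y)[0]) / (2 * eps)
    for e in np.eye(3)
])
print(np.max(np.abs(grad - num)))  # should be tiny if the gradient is correct
```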