Stanford Machine Learning Coursera Course: Week 2 Assignment -- Univariate and Multivariate Linear Regression
2017-11-03 15:59
I. Univariate Linear Regression
In the Gradient Descent Algorithm lectures, we derived two formulas that are central to the algorithm: one for evaluating the cost J(θ) and one for updating θ. In univariate regression, we use these two formulas directly to plot the surface of J(θ) and to trace the path along which θ is found.
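For reference, these are the standard forms from the course, with hypothesis $h_\theta(x)$ over $m$ training examples:

$$J(\theta) = \frac{1}{2m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)^2$$

$$\theta_j := \theta_j - \alpha\,\frac{1}{m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)x_j^{(i)}$$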
The problem: we want to estimate the profit of a new store location for a restaurant chain. For each store the chain already operates, we have the local population and that store's profit. We need linear regression to model the relationship between population and profit, and then use the model to estimate the new store's profit and assess its prospects.
First we plot the company's data on a scatter chart, as shown below. The model we need is a straight line that best fits the relationship between population and profit, i.e. the univariate hypothesis:

$$h_\theta(x) = \theta_0 + \theta_1 x$$
To approach the optimal θ, we implement gradient descent as follows, running 1500 iterations (equivalent to walking 1500 steps toward the best-fit point); after those 1500 steps we obtain θ = [-3.630291, 1.166362].
The figure below shows the surface plot of J(θ):
Next is the location of the best θ we found on the contour plot; this could in fact be overlaid on the previous figure:
On top of the assignment framework, the main code to implement is as follows:
1. Cost function: computeCost.m
function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
% J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
% parameter for linear regression to fit the data points in X and y
% Initialize some useful values
m = length(y); % number of training examples
% You need to return the following variables correctly
J = 0;
% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
% You should set J to the cost.
h = X * theta; % hypothesis for all training examples
e = h - y; % error vector
J = e' * e / (2 * m); % vectorized squared-error cost
% =========================================================================
end
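A quick sanity check of computeCost, assuming the data file ex1data1.txt from the assignment (the expected cost of about 32.07 for a zero θ is the assignment's own test case):

data = load('ex1data1.txt'); % column 1: population, column 2: profit
y = data(:, 2); m = length(y);
X = [ones(m, 1), data(:, 1)]; % prepend the intercept column
J = computeCost(X, y, zeros(2, 1)); % should be about 32.07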
2. Gradient descent: gradientDescent.m
function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
% theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
% taking num_iters gradient steps with learning rate alpha
% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);
for iter = 1:num_iters
% ====================== YOUR CODE HERE ======================
% Instructions: Perform a single gradient step on the parameter vector
% theta.
%
% Hint: While debugging, it can be useful to print out the values
% of the cost function (computeCost) and gradient here.
%
h = X * theta; % hypothesis for all training examples
e = h - y; % error vector
theta = theta - alpha * (X' * e) / m; % simultaneous vectorized update
% ============================================================
% Save the cost J in every iteration
J_history(iter) = computeCost(X, y, theta);
end
end
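For reference, a sketch of how ex1.m invokes this function, with X and y prepared as above; α = 0.01 and 1500 iterations are the assignment's settings, and the resulting θ matches the value quoted earlier:

theta = zeros(2, 1); % start from the origin
iterations = 1500; alpha = 0.01; % settings used in ex1.m
[theta, J_history] = gradientDescent(X, y, theta, alpha, iterations);
% theta should come out near [-3.6303; 1.1664]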
Finally, type source ex1.m in the Octave interactive window to produce and display the figures above.
In this example, using the θ obtained from the regression, we compute Y at X = 35 and X = 75. As the plot shows, the training data are dense for X in [5, 10] and already sparse beyond 10, so the values computed at 35 and 75 are rough references only and may deviate substantially from reality.
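The prediction itself is just an inner product with the learned θ; a minimal sketch using the X values from the text above:

predict35 = [1, 35] * theta; % estimated profit at population 35
predict75 = [1, 75] * theta; % estimated profit at population 75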
II. Multivariate Linear Regression
The task: use linear regression with multiple variables to predict house prices. ex1data2.txt is the training set: the first column is the living area, the second the number of bedrooms, and the third the actual price. Predict the price for X_p = [1650 3];
1. Gradient descent method
a. First, because x1 and x2 differ greatly in scale, feature scaling is needed. Here each feature is standardized (subtract its mean, divide by its standard deviation); max-min scaling would be an alternative. featureNormalize.m:
function [X_norm, mu, sigma] = featureNormalize(X)
%FEATURENORMALIZE Normalizes the features in X
% FEATURENORMALIZE(X) returns a normalized version of X where
% the mean value of each feature is 0 and the standard deviation
% is 1. This is often a good preprocessing step to do when
% working with learning algorithms.
% You need to set these values correctly
X_norm = X;
mu = zeros(1, size(X, 2));
sigma = zeros(1, size(X, 2));
% ====================== YOUR CODE HERE ======================
% Instructions: First, for each feature dimension, compute the mean
% of the feature and subtract it from the dataset,
% storing the mean value in mu. Next, compute the
% standard deviation of each feature and divide
% each feature by its standard deviation, storing
% the standard deviation in sigma.
%
% Note that X is a matrix where each column is a
% feature and each row is an example. You need
% to perform the normalization separately for
% each feature.
%
% Hint: You might find the 'mean' and 'std' functions useful.
%
m = size(X, 1);
mu = mean(X); % row vector: mean of each feature
sigma = std(X); % row vector: standard deviation of each feature
for i = 1:m
X_norm(i, :) = (X(i, :) - mu) ./ sigma; % normalize one example at a time
end
end
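The per-example loop can also be written as a single broadcast expression with identical behavior, relying on Octave's automatic broadcasting:

X_norm = (X - mu) ./ sigma; % mu and sigma broadcast across all rows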
b. Prepend a column of ones to X to form the design matrix that is actually used, then initialize alpha = 0.01; num_iters = 400; theta = zeros(3, 1); as sketched below.
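In code this setup looks roughly like the following (a sketch following ex1_multi.m, after featureNormalize has been applied):

X = [ones(m, 1), X]; % add the intercept column
alpha = 0.01;
num_iters = 400;
theta = zeros(3, 1);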
[theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters);
gradientDescentMulti.m
function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
%GRADIENTDESCENTMULTI Performs gradient descent to learn theta
% theta = GRADIENTDESCENTMULTI(x, y, theta, alpha, num_iters) updates theta by
% taking num_iters gradient steps with learning rate alpha
% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);
for iter = 1:num_iters
% ====================== YOUR CODE HERE ======================
% Instructions: Perform a single gradient step on the parameter vector
% theta.
%
% Hint: While debugging, it can be useful to print out the values
% of the cost function (computeCostMulti) and gradient here.
%
h = X * theta; % hypothesis for all training examples
grad = zeros(size(X, 2), 1); % accumulate the gradient over examples
for i = 1:m
grad = grad + (h(i) - y(i)) .* X(i, :)';
end
theta = theta - (alpha / m) * grad; % simultaneous update
% ============================================================
% Save the cost J in every iteration
J_history(iter) = computeCostMulti(X, y, theta);
end
end
This in turn calls computeCostMulti. computeCostMulti.m:
function J = computeCostMulti(X, y, theta)
%COMPUTECOSTMULTI Compute cost for linear regression with multiple variables
% J = COMPUTECOSTMULTI(X, y, theta) computes the cost of using theta as the
% parameter for linear regression to fit the data points in X and y
% Initialize some useful values
m = length(y); % number of training examples
% You need to return the following variables correctly
J = 0;
% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
% You should set J to the cost.
h = X * theta; % hypothesis
J = (1 / (2 * m)) * sum((h - y) .^ 2);
% =========================================================================
end
Once the functions are implemented, you still have to fill in the variable price in ex1_multi.m after the call to gradient descent; the template only defines price = 0;
price = 0; % You should change this
X_P=[1650 3];
X_P=(X_P - mu)./sigma;
X_P=[1 X_P];
price = theta' * X_P'; % theta is 3x1 and X_P' is 3x1, so theta' * X_P' is a (1x3)*(3x1) product, i.e. a scalar
2. Normal equation method
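The closed-form least-squares solution taught in the course is:

$$\theta = (X^T X)^{-1} X^T y$$

No feature scaling or iteration is needed; the trade-off is the cost of inverting $X^T X$, which grows quickly with the number of features.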
normalEqn.m
function [theta] = normalEqn(X, y)
%NORMALEQN Computes the closed-form solution to linear regression
% NORMALEQN(X,y) computes the closed-form solution to linear
% regression using the normal equations.
theta = zeros(size(X, 2), 1);
% ====================== YOUR CODE HERE ======================
% Instructions: Complete the code to compute the closed form solution
% to linear regression and put the result in theta.
%
theta = pinv(X' * X) * X' * y; % pinv also copes with a singular or ill-conditioned X'X
% ---------------------- Sample Solution ----------------------
end
In ex1_multi.m, add the corresponding computation of price; the inputs are 1650 and 3.
price = 0; % You should change this
X_P=[1650 3];
X_P=[1 X_P];
disp(X_P');
price = theta' * X_P';
Run source ex1_multi.m at the Octave command line. The plot of the gradient-descent cost function against the iteration count is shown below: after roughly 300 iterations the cost changes little and the curve flattens out.
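The convergence curve can be reproduced from J_history; ex1_multi.m contains an equivalent plot:

figure;
plot(1:numel(J_history), J_history, '-b', 'LineWidth', 2);
xlabel('Number of iterations');
ylabel('Cost J');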
In this example, gradient descent and the normal equation give the following results:
Theta computed from gradient descent:
334302.063993
100087.116006
3673.548451
Predicted price of a 1650 sq-ft, 3 br house (using gradient descent):
$289314.620338
Program paused. Press enter to continue.
Solving with normal equations...
Theta computed from the normal equations:
89597.909542
139.210674
-8738.019112
Predicted price of a 1650 sq-ft, 3 br house (using normal equations):
$293081.464335
The two predictions differ by a few thousand dollars: gradient descent with alpha = 0.01 and 400 iterations has not fully converged, while the normal equation yields the exact least-squares solution. This completes the run. Execute submit at the command line and fill in the prompts as required. Attachments cannot be uploaded, so the code is not attached.