
Linear regression with one variable: a worked example (plotting the data, cost function, gradient descent, fitted line, contour plot) using matrix operations

2016-08-28 11:41
% Test data 'ex1data1.txt': column 1 is the population of a city in 10,000s, column 2 is the profit in $10,000s
6.1101,17.592
5.5277,9.1302
8.5186,13.662
7.0032,11.854
5.8598,6.8233
8.3829,11.886
7.4764,4.3483
8.5781,12
6.4862,6.5987
5.0546,3.8166
5.7107,3.2522
14.164,15.505
5.734,3.1551
8.4084,7.2258
5.6407,0.71618
5.3794,3.5129
6.3654,5.3048
5.1301,0.56077
6.4296,3.6518
7.0708,5.3893
6.1891,3.1386
20.27,21.767
5.4901,4.263
6.3261,5.1875
5.5649,3.0825
18.945,22.638
12.828,13.501
10.957,7.0467
13.176,14.692
22.203,24.147
5.2524,-1.22
6.5894,5.9966
9.2482,12.134
5.8918,1.8495
8.2111,6.5426
7.9334,4.5623
8.0959,4.1164
5.6063,3.3928
12.836,10.117
6.3534,5.4974
5.4069,0.55657
6.8825,3.9115
11.708,5.3854
5.7737,2.4406
7.8247,6.7318
7.0931,1.0463
5.0702,5.1337
5.8014,1.844
11.7,8.0043
5.5416,1.0179
7.5402,6.7504
5.3077,1.8396
7.4239,4.2885
7.6031,4.9981
6.3328,1.4233
6.3589,-1.4211
6.2742,2.4756
5.6397,4.6042
9.3102,3.9624
9.4536,5.4141
8.8254,5.1694
5.1793,-0.74279
21.279,17.929
14.908,12.054
18.959,17.054
7.2182,4.8852
8.2951,5.7442
10.236,7.7754
5.4994,1.0173
20.341,20.992
10.136,6.6799
7.3345,4.0259
6.0062,1.2784
7.2259,3.3411
5.0269,-2.6807
6.5479,0.29678
7.5386,3.8845
5.0365,5.7014
10.274,6.7526
5.1077,2.0576
5.7292,0.47953
5.1884,0.20421
6.3557,0.67861
9.7687,7.5435
6.5159,5.3436
8.5172,4.2415
9.1802,6.7981
6.002,0.92695
5.5204,0.152
5.0594,2.8214
5.7077,1.8451
7.6366,4.2959
5.8707,7.2029
5.3054,1.9869
8.2934,0.14454
13.394,9.0551
5.4369,0.61705


% Plot the raw data: the relationship between population and profit
fprintf('Plotting Data ...\n')
data = load('ex1data1.txt');
X = data(:, 1); y = data(:, 2);
m = length(y); % number of training examples

% Plot Data
% Note: You have to complete the code in plotData.m
plotData(X, y);

fprintf('Program paused. Press enter to continue.\n');
pause;


% Implementation of plotData()

function plotData(x, y)

figure;                              % open a new figure window
plot(x, y, 'rx', 'MarkerSize', 10);  % red crosses, marker size 10
ylabel('Profit in $10,000s');
xlabel('Population of City in 10,000s');

end
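For readers following along in Python, plotData translates fairly directly to matplotlib. This is a sketch under my own naming choices, not part of the original exercise; the three sample points are copied from the top of the ex1data1.txt listing, and the Agg backend keeps the script headless:

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')  # non-interactive backend; no display needed
import matplotlib.pyplot as plt

def plot_data(x, y):
    """Scatter plot of profit vs. population, mirroring plotData.m."""
    fig, ax = plt.subplots()
    ax.plot(x, y, 'rx', markersize=10)  # red crosses, size 10, as in the Octave code
    ax.set_xlabel('Population of City in 10,000s')
    ax.set_ylabel('Profit in $10,000s')
    return fig, ax

# First three rows of ex1data1.txt
x = np.array([6.1101, 5.5277, 8.5186])
y = np.array([17.592, 9.1302, 13.662])
fig, ax = plot_data(x, y)
```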


%% =================== Part 3: Gradient descent ===================

fprintf('Running Gradient Descent ...\n')

X = [ones(m, 1), data(:,1)]; % Add a column of ones to x

theta = zeros(2, 1); % initialize fitting parameters

% Some gradient descent settings
iterations = 1500;       % number of gradient descent iterations
alpha = 0.01;            % learning rate

% compute and display initial cost
computeCost(X, y, theta)      % y holds the actual target values


% Compute the cost for linear regression
% Cost function implementation, fully vectorized with matrix operations
function J = computeCost(X, y, theta)

% Initialize some useful values
m = length(y); % number of training examples
J = 0;

% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.

% X = [ones(m, 1), data(:,1)], theta = [theta1; theta2]
predictions = X * theta;             % vectorized hypothesis h(x)
sqrError = (predictions - y).^2;     % element-wise squared errors
J = sum(sqrError) / (2*m);

end
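The same vectorized cost is easy to re-state in Python with NumPy. The function name compute_cost and the three inlined data rows are my own choices for a self-contained sketch, not part of the original exercise:

```python
import numpy as np

def compute_cost(X, y, theta):
    """Vectorized cost J(theta) = sum((X*theta - y).^2) / (2*m)."""
    m = len(y)
    predictions = X @ theta             # hypothesis h(x) for all examples at once
    sqr_error = (predictions - y) ** 2  # element-wise squared errors
    return sqr_error.sum() / (2 * m)

# With theta = [0; 0] every prediction is 0, so J reduces to sum(y.^2)/(2*m)
X = np.array([[1.0, 6.1101], [1.0, 5.5277], [1.0, 8.5186]])
y = np.array([17.592, 9.1302, 13.662])
J = compute_cost(X, y, np.zeros(2))
```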


% Run gradient descent

theta = gradientDescent(X, y, theta, alpha, iterations);

% print theta to screen
fprintf('Theta found by gradient descent: ');
fprintf('%f %f \n', theta(1), theta(2));


% Gradient descent implementation: gradientDescent(X, y, theta, alpha, iterations)
% X - training examples, y - actual target values, alpha - learning rate
function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)

%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters
    predictions = X * theta;      % vectorized hypothesis h(x_i)
    errors = predictions - y;     % predicted minus actual values

    % Simultaneously update theta(j) for all j.
    % alpha - learning rate; '.*' is element-wise multiplication
    theta1 = theta(1) - alpha * (1/m) * sum(errors .* X(:,1));
    theta2 = theta(2) - alpha * (1/m) * sum(errors .* X(:,2));
    theta(1) = theta1;
    theta(2) = theta2;

    % Save the cost J in every iteration
    J_history(iter) = computeCost(X, y, theta);

    %disp(J_history);  % debug output, uncomment when needed

end

end
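The two per-parameter updates above can be collapsed into the single matrix expression theta := theta - (alpha/m) * X' * (X*theta - y). Below is a NumPy sketch of that fully vectorized form; the function and variable names are mine, and the toy data y = 1 + 2x is chosen so the answer is known in advance:

```python
import numpy as np

def gradient_descent(X, y, theta, alpha, num_iters):
    """Batch gradient descent with a fully vectorized, simultaneous update."""
    m = len(y)
    J_history = np.zeros(num_iters)
    for it in range(num_iters):
        errors = X @ theta - y                        # h(x_i) - y_i for every example
        theta = theta - (alpha / m) * (X.T @ errors)  # update all parameters at once
        J_history[it] = ((X @ theta - y) ** 2).sum() / (2 * m)
    return theta, J_history

# On exactly linear data y = 1 + 2x, descent should converge to theta = [1, 2]
X = np.column_stack([np.ones(5), np.arange(5.0)])
y = 1.0 + 2.0 * np.arange(5.0)
theta, J_hist = gradient_descent(X, y, np.zeros(2), alpha=0.1, num_iters=2000)
```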


% Plot the linear fit
hold on;              % keep previous plot visible
plot(X(:,2), X*theta, '-')
legend('Training data', 'Linear regression')    % add a legend
hold off              % don't overlay any more plots on this figure


% Predict values for population sizes of 35,000 and 70,000
% Use the fitted parameters theta to predict new values via matrix multiplication
predict1 = [1, 3.5] * theta;
fprintf('For population = 35,000, we predict a profit of %f\n',...
predict1*10000);

predict2 = [1, 7] * theta;
fprintf('For population = 70,000, we predict a profit of %f\n',...
predict2*10000);
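The predictions themselves are plain dot products with the fitted parameters. A small Python sketch; the theta values below are illustrative placeholders only (the actual values come out of gradientDescent above):

```python
import numpy as np

# Placeholder parameters standing in for the gradient descent result
theta = np.array([-3.63, 1.17])

predict1 = np.array([1.0, 3.5]) @ theta   # population = 35,000
predict2 = np.array([1.0, 7.0]) @ theta   # population = 70,000

print(f'For population = 35,000, we predict a profit of {predict1 * 10000:.2f}')
print(f'For population = 70,000, we predict a profit of {predict2 * 10000:.2f}')
```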

fprintf('Program paused. Press enter to continue.\n');
pause;


%% ============= Part 4: Visualizing J(theta_0, theta_1) =============
% Compute J(theta) for different theta values and plot the result

fprintf('Visualizing J(theta_0, theta_1) ...\n')

% Grid over which we will calculate J
% linspace(x, y, n) generates n evenly spaced points over the interval [x, y]
theta0_vals = linspace(-10, 10, 100);
theta1_vals = linspace(-1, 4, 100);

% initialize J_vals to a matrix of 0's
J_vals = zeros(length(theta0_vals), length(theta1_vals));

% Fill out J_vals
for i = 1:length(theta0_vals)
    for j = 1:length(theta1_vals)
        t = [theta0_vals(i); theta1_vals(j)];
        J_vals(i,j) = computeCost(X, y, t);
    end
end

% Because of the way meshgrids work in the surf command, we need to
% transpose J_vals before calling surf, or else the axes will be flipped

J_vals = J_vals';
% Surface plot
figure;

% surf(X,Y,Z) creates a surface plot from corresponding values in X, Y, Z (by default, color is proportional to surface height)

surf(theta0_vals, theta1_vals, J_vals)
xlabel('\theta_0'); ylabel('\theta_1');


% Contour plot
figure;

% Plot J_vals as 20 contours spaced logarithmically between 0.01 and 1000

contour(theta0_vals, theta1_vals, J_vals, logspace(-2, 3, 20))
xlabel('\theta_0'); ylabel('\theta_1');
hold on;
plot(theta(1), theta(2), 'rx', 'MarkerSize', 10, 'LineWidth', 2);
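The double loop that fills J_vals carries over to Python almost line for line. The sketch below re-states the cost inline and uses synthetic data y = 1 + 2x so the grid minimum is known in advance; the grid sizes are my own choices:

```python
import numpy as np

# Synthetic data from y = 1 + 2x, so J is minimized at theta = (1, 2)
X = np.column_stack([np.ones(5), np.arange(5.0)])
y = 1.0 + 2.0 * np.arange(5.0)

def compute_cost(X, y, theta):
    return ((X @ theta - y) ** 2).sum() / (2 * len(y))

theta0_vals = np.linspace(-10, 10, 101)  # 101 points so theta0 = 1.0 lies on the grid
theta1_vals = np.linspace(-1, 4, 101)    # likewise theta1 = 2.0

J_vals = np.zeros((len(theta0_vals), len(theta1_vals)))
for i, t0 in enumerate(theta0_vals):
    for j, t1 in enumerate(theta1_vals):
        J_vals[i, j] = compute_cost(X, y, np.array([t0, t1]))

# Locate the grid minimum; it should sit at (or next to) theta = (1, 2)
i_min, j_min = np.unravel_index(J_vals.argmin(), J_vals.shape)
```

As in the Octave code, J_vals would need a transpose before being handed to a surface or contour routine that treats the first axis as the x axis.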







The resulting plots are shown above.