Stanford Machine Learning Course Algorithms in MATLAB: Gradient Descent for Linear Regression (Part 1)
2012-12-24 16:43
Gradient descent can be used for regression as well as for classification. Below is a minimal demonstration of the algorithm on a line-fitting problem. The learning rate alpha has a large effect on convergence: if it is too large the iteration diverges, and if it is too small the number of iterations needed grows.
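For reference, the update rules the code below implements, in the notation of the Stanford course notes, where \(h_\theta(x) = \theta_0 + \theta_1 x\) is the hypothesis and \(m\) is the number of samples:

```latex
J(\theta) = \frac{1}{2}\sum_{i=1}^{m}\bigl(y^{(i)} - h_\theta(x^{(i)})\bigr)^2

% Stochastic (incremental) gradient descent: update after each sample i
\theta_j := \theta_j + \alpha\,\bigl(y^{(i)} - h_\theta(x^{(i)})\bigr)\,x_j^{(i)}

% Batch gradient descent: one update per full pass over the data
\theta_j := \theta_j + \alpha \sum_{i=1}^{m}\bigl(y^{(i)} - h_\theta(x^{(i)})\bigr)\,x_j^{(i)}
```

Both rules move \(\theta\) in the direction that decreases the squared error \(J(\theta)\); the stochastic variant trades exactness of each step for many cheap updates per pass.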
clear; clc;
% Generate 100 points scattered around the line y = 13 - 3.36*x and store
% them in data (100x3); the first column of data is all ones (intercept term).
data = genlineardata(13, -3.36, 100);

theta = [0; 1];          % initial parameters [intercept; slope]
times = 0;               % iteration (pass) counter
alpha = 0.1;             % learning rate
[row, col] = size(data);

figure;
plot(data(:,2), data(:,3), 'r.');   % scatter plot of the samples
hold on;
x = -5:5;

switch(3)
    case 1
        % Stochastic gradient descent: update theta after every sample.
        while(1)
            y = theta(2)*x + theta(1);
            plot(x, y, 'r');        % intermediate fit in red
            temp0 = theta(1);
            temp1 = theta(2);
            for i = 1:row
                h = theta(1) + theta(2)*data(i,2);
                alpha = 0.2/i;      % decaying step size
                theta(1) = theta(1) + alpha*(data(i,3) - h);
                theta(2) = theta(2) + alpha*(data(i,3) - h)*data(i,2);
            end
            if abs(theta(1)-temp0) + abs(theta(2)-temp1) < 0.001
                break;
            end
            times = times + 1;
            if times > 20
                break;
            end
        end
    case 2
        % Batch gradient descent: accumulate the gradient over all
        % samples, then update theta once per pass.
        while(1)
            y = theta(2)*x + theta(1);
            plot(x, y, 'r');        % intermediate fit in red
            temp0 = theta(1);
            temp1 = theta(2);
            sum0 = 0;
            sum1 = 0;
            times = times + 1;
            alpha = 1/(times*row);  % decaying step size
            for i = 1:row
                h = theta(1) + theta(2)*data(i,2);
                sum0 = sum0 + alpha*(data(i,3) - h);
                sum1 = sum1 + alpha*(data(i,3) - h)*data(i,2);
            end
            theta(1) = theta(1) + sum0;
            theta(2) = theta(2) + sum1;
            if abs(theta(1)-temp0) + abs(theta(2)-temp1) < 0.001
                break;
            end
            if times > 20
                break;
            end
        end
    case 3
        % Least squares revisited: solve the normal equations directly.
        y = theta(2)*x + theta(1);
        plot(x, y, 'r');
        X = data(:,1:2);
        Y = data(:,3);
        theta = (X'*X)\(X'*Y);
end

theta(1)
theta(2)
times
y = theta(2)*x + theta(1);
plot(x, y, 'g');    % final fitted line in green
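Case 3 needs no iteration because the least-squares problem has a closed-form solution. Setting the gradient of \(J(\theta)\) to zero yields the normal equations:

```latex
\nabla_\theta J(\theta) = -X^\top\bigl(Y - X\theta\bigr) = 0
\;\Longrightarrow\; X^\top X\,\theta = X^\top Y
\;\Longrightarrow\; \theta = \bigl(X^\top X\bigr)^{-1} X^\top Y
```

The MATLAB expression `(X'*X)\(X'*Y)` solves this linear system with the backslash operator rather than forming the matrix inverse explicitly, which is both faster and numerically more stable.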