
(Stanford Machine Learning Notes) Linear Regression Exercise

2016-12-12 22:13

All of the code uses Python 3.x.

The problem is:

import numpy as np
import random
import matplotlib.pyplot as plt

F64 = 'float64'

def gen_linear_dot_sample(num_point):
    # num_point x values evenly spaced in [0, 1)
    x = list(range(num_point))
    for i in range(num_point):
        x[i] = x[i] / num_point
    # y = x plus uniform noise in [-0.08, 0.08]
    y = x.copy()
    for i in range(num_point):
        y[i] = y[i] + random.uniform(-0.08, 0.08)
    return [x, y]

[x, y] = gen_linear_dot_sample(50)


This generates 50 sample points, as shown in the figure below.



Use linear regression to find the best-fit line.

================================================================
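The solution below fits a line y = kx + b by batch gradient descent on the squared-error cost. As a rough reference (standard notation from the course notes, not spelled out in the original post; m is the number of samples, alpha is the learning rate, and each input x^(i) is augmented with a leading 1 so that theta = (b, k)^T):

J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( \theta^{T} x^{(i)} - y^{(i)} \right)^{2}

\theta := \theta - \frac{\alpha}{m} \sum_{i=1}^{m} \left( \theta^{T} x^{(i)} - y^{(i)} \right) x^{(i)}

In the code, study_rate plays the role of alpha, and diff_cost_mean holds the sum above divided by m.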

The code is as follows:

import numpy as np
import random
import matplotlib.pyplot as plt

F64 = 'float64'

def gen_linear_dot_sample(num_point):
    # num_point x values evenly spaced in [0, 1); y = x plus uniform noise in [-0.08, 0.08]
    x = list(range(num_point))
    for i in range(num_point):
        x[i] = x[i] / num_point
    y = x.copy()
    for i in range(num_point):
        y[i] = y[i] + random.uniform(-0.08, 0.08)
    return [x, y]

def show_point(p_x, p_y):
    # Scatter plot of the sample points
    plt.figure(1)
    plt.plot(p_x, p_y, 'ob')
    plt.xlim(-0.2, 1.2)
    plt.ylim(-0.2, 1.2)
    plt.xlabel('x')
    plt.ylabel('y')
    plt.show()

def show_point_and_line(p_x, p_y, l_x, l_y):
    # Sample points in blue plus a line in red
    plt.figure(1)
    plt.plot(p_x, p_y, 'ob')
    plt.plot(l_x, l_y, 'r')
    plt.xlim(-0.2, 1.2)
    plt.ylim(-0.2, 1.2)
    plt.xlabel('x')
    plt.ylabel('y')
    plt.show()

def linear_re(x, y, study_rate, times_iteration):
    # Linear regression by batch gradient descent
    y = y.reshape(y.shape[0], 1).astype(F64)
    if x.ndim == 1:
        x = x.reshape((x.shape[0], 1))
    # Prepend a column of ones so theta[0] is the intercept b and theta[1] the slope k
    one_mat = np.ones((x.shape[0], 1), dtype=F64)
    x_add_one = np.hstack((one_mat, x))
    # Random initialization of theta
    theta = np.random.random((x.shape[1] + 1))
    theta = theta.astype(F64).reshape(x.shape[1] + 1, 1)
    theta_init = theta
    print('After initialization, theta is:', 'b=', theta[0, 0], 'k=', theta[1, 0])
    for i in range(times_iteration):
        # Gradient of the squared-error cost, summed over all samples
        diff_cost = (np.dot(x_add_one, theta) - y) * x_add_one
        diff_cost = diff_cost.sum(0, dtype=F64, keepdims=True).transpose()
        diff_cost_mean = diff_cost / x.shape[0]
        # Gradient descent update with learning rate study_rate
        theta = theta - diff_cost_mean * study_rate
    print('After iterating, theta is:', 'b=', theta[0, 0], 'k=', theta[1, 0])
    return [theta_init, theta]

[x, y] = gen_linear_dot_sample(50)
[theta_init, theta] = linear_re(np.asarray(x, dtype=F64), np.asarray(y, dtype=F64), 0.25, 200)
# Plot the line from the initial random theta, then the line after gradient descent
show_point_and_line(x, y, x, theta_init[0] + theta_init[1] * x)
show_point_and_line(x, y, x, theta[0] + theta[1] * x)


With the initial random theta, the line looks like this:



After linear regression, the fitted line is:



After initialization, theta is: b= 0.134382446993 k= 0.553982228195

After iterating, theta is: b= 0.00249451906385 k= 0.973867674154
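The fitted values b ≈ 0.0025 and k ≈ 0.974 are close to the noise-free line y = x that the samples were drawn from. As an optional sanity check (not part of the original exercise; it assumes x and y from the script above are still in scope), the closed-form least-squares fit from np.polyfit can be compared against the gradient-descent result:

import numpy as np

# np.polyfit with degree 1 returns [slope, intercept] from the closed-form least-squares fit
k_ls, b_ls = np.polyfit(np.asarray(x, dtype='float64'), np.asarray(y, dtype='float64'), 1)
print('Least-squares fit:', 'b=', b_ls, 'k=', k_ls)

With enough iterations and a suitable learning rate, gradient descent should land near these values.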