
[Machine Learning] Writing a Fully Connected Neural Network by Hand (Part 2): Linear Regression

2017-01-11 00:34
Let's write a linear regression neural network in Python, without regularization.

As a quick recap: ordinary linear regression minimizes the squared-error cost J(θ) = (1/2m) Σᵢ (θᵀx⁽ⁱ⁾ − y⁽ⁱ⁾)², while regularized (ridge) linear regression adds an L2 penalty term (λ/2m) Σⱼ θⱼ² to the same cost.
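Both have closed-form solutions. As a point of comparison for the network below, here is a minimal numpy sketch of the two estimators on the same kind of two-feature data (variable names are my own, not from the original post):

import numpy as np

X = np.column_stack((np.random.randn(100), np.random.randn(100)))
y = X[:, 0] * 3.0 + X[:, 1] * 2.0

# Ordinary least squares: theta = (X^T X)^-1 X^T y
theta_ols = np.linalg.solve(X.T.dot(X), X.T.dot(y))

# Ridge regression: theta = (X^T X + lam*I)^-1 X^T y
lam = 0.1
theta_ridge = np.linalg.solve(X.T.dot(X) + lam * np.eye(2), X.T.dot(y))

print(theta_ols)    # close to [3.0, 2.0]
print(theta_ridge)  # slightly shrunk toward zero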

Below is the code for an unregularized linear regression network with two fully connected layers (a 20-unit hidden layer plus a linear output):

import numpy as np
# Two input features, 100 samples each
X1 = np.mat(np.random.randn(100)*100-50).T
X2 = np.mat(np.random.randn(100)*100-50).T
X = np.column_stack((X1, X2))
y = X1 * 3.0 + X2 * 2.0                 # ground truth: y = 3*x1 + 2*x2
trainData = np.column_stack((X1, X2, y))
# Network hyperparameters
lr1 = 0.000001                          # learning rate for the output layer
lr2 = 0.000001                          # learning rate for the hidden layer
inputDim = 2
fc1Dim = 20
fc2Dim = 1
batchSize = 1
# Architecture: input (2) -> fc1 (20) -> fc2 (1), no activation
W1 = np.random.randn(inputDim, fc1Dim)  # 2x20
b1 = np.zeros((1, fc1Dim))              # 1x20
W2 = np.random.randn(fc1Dim, fc2Dim)    # 20x1
b2 = np.zeros((1, fc2Dim))              # 1x1

def forward(x):
    fc1 = np.dot(x, W1) + b1     # hidden pre-activation, 1x20
    sig1 = fc1                   # identity; you could add a sigmoid activation here
    fc2 = np.dot(sig1, W2) + b2  # linear output, 1x1
    return fc1, sig1, fc2
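
# Illustrative addition (not in the original post): a quick shape check.
# A 1x2 input should give a 1x20 hidden activation and a 1x1 output.
_fc1, _sig1, _fc2 = forward(np.mat([[1.0, 2.0]]))
assert _fc1.shape == (1, fc1Dim) and _fc2.shape == (1, fc2Dim)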

def backward(x, fc1, sig1, fc2, loss, W1, b1, W2, b2):
    # For L = 0.5*(fc2 - y)^2, 'loss' is the residual fc2 - y = dL/dfc2
    dW2 = np.dot(sig1.T, loss)   # dL/dW2 = sig1^T * dL/dfc2, 20x1
    db2 = loss                   # dL/db2, 1x1
    dfc2up = np.dot(loss, W2.T)  # dL/dsig1 = dL/dfc2 * W2^T, 1x20
    dsigup = dfc2up              # identity activation, so pass through
    dW1 = np.dot(x.T, dsigup)    # dL/dW1 = x^T * dL/dsig1, 2x20
    db1 = dsigup                 # dL/db1, 1x20
    W2 -= lr1 * dW2
    b2 -= lr1 * db2
    W1 -= lr2 * dW1
    b1 -= lr2 * db1
    return W1, b1, W2, b2
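
# Note (added): if you enable a sigmoid at sig1 in forward(), the backward
# pass must also multiply by the sigmoid derivative, e.g.:
#   s = 1.0 / (1.0 + np.exp(-fc1))
#   dsigup = np.multiply(dfc2up, np.multiply(s, 1.0 - s))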

def BuildBatch(batchSize):  # build a mini-batch of training samples
    batch = np.random.permutation(trainData)[0:batchSize, :]
    return batch
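
# e.g. BuildBatch(4) returns the first 4 rows of a shuffled copy of
# trainData; columns are x1, x2, y.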

num = 100
for i in range(num):
    batch = BuildBatch(batchSize)
    loss = 0
    for x in batch:
        fc1, sig1, fc2 = forward(np.mat(x[0:2]))
        loss += fc2 - np.mat(x[2]).T   # accumulate the residual fc2 - y

    # Caveat: backward only sees the last sample's forward results, so this
    # is a correct SGD step only for batchSize = 1 (the setting used here).
    W1, b1, W2, b2 = backward(np.mat(batch[-1, 0:2]), fc1, sig1, fc2, loss, W1, b1, W2, b2)

    if i % (num // 5) == 0:
        print('fc2:', fc2, 'true value is :', x[2])

# Evaluate on fresh random inputs
test1 = np.mat(np.random.randn(batchSize)*100-50).T
test2 = np.mat(np.random.randn(batchSize)*100-50).T
test = np.column_stack((test1, test2))
_, _, res = forward(test[0, :])
print(res)
print(" true is : ", test[:, 0]*3.0 + test[:, 1]*2.0)


The output:

fc2: [[-27.41931935]] true value is : -431.800777524

fc2: [[-12.54421975]] true value is : -12.7732552551

fc2: [[-595.96721029]] true value is : -596.054284694

fc2: [[-227.83521179]] true value is : -227.82869663

fc2: [[-472.79423214]] true value is : -472.784873801

[[ 345.35202932]] true is : [[ 345.37223623]]
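
As a sanity check on the hand-derived gradients in backward(), a finite-difference comparison is useful. Below is a minimal sketch of such a check (my addition, not part of the original post); it assumes the script above has already run, so forward() and W2 are in scope, and numerical_grad is a hypothetical helper name:

def numerical_grad(x, target, W, eps=1e-5):
    # Finite-difference gradient of L = 0.5*(fc2 - target)^2 w.r.t. W
    grad = np.zeros_like(W)
    for i in range(W.shape[0]):
        for j in range(W.shape[1]):
            old = W[i, j]
            W[i, j] = old + eps
            lp = 0.5 * float(forward(x)[2] - target) ** 2
            W[i, j] = old - eps
            lm = 0.5 * float(forward(x)[2] - target) ** 2
            W[i, j] = old
            grad[i, j] = (lp - lm) / (2 * eps)
    return grad

x = np.mat([[1.0, 2.0]])
target = 3.0 * 1.0 + 2.0 * 2.0
err = forward(x)[2] - target                 # dL/dfc2, the 'loss' used in training
analytic_dW2 = np.dot(forward(x)[1].T, err)  # matches dW2 in backward()
print(np.max(np.abs(analytic_dW2 - numerical_grad(x, target, W2))))  # should be ~0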