
A Simple Neural Network Implementation in TensorFlow

2018-01-30 15:02
import tensorflow as tf
from numpy.random import RandomState
# Define the batch size used for training
batch_size = 8
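# Network parameters: w1 maps the 2 input features to a 3-unit hidden layer,
# and w2 maps the hidden layer to a single output value.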
w1 = tf.Variable(tf.random_normal([2,3],stddev=1,seed=1))
w2 = tf.Variable(tf.random_normal([3,1],stddev=1,seed=1))
# Input placeholder: two features per example
x = tf.placeholder(tf.float32,shape=(None,2),name='x-input')
# Label placeholder: the expected output
y_ = tf.placeholder(tf.float32,shape=(None,1),name='y-input')
# Define the forward propagation
a = tf.matmul(x,w1)
y = tf.matmul(a,w2)
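# Note: no activation function is applied here, so the two matmuls collapse into
# a single linear transformation. A nonlinearity on the hidden layer, e.g.
# a = tf.nn.relu(tf.matmul(x, w1)), would make the network genuinely two-layer.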
# Define the loss function and the backpropagation (training) step
cross_entropy = -tf.reduce_mean(y_*tf.log(tf.clip_by_value(y,1e-10,1.0)))
train_step = tf.train.AdadeltaOptimizer(0.001).minimize(cross_entropy)
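# Note: y is fed into the log without a sigmoid; tf.clip_by_value only keeps
# log() away from non-positive values. A numerically stable alternative
# (not used in this post) would be:
# cross_entropy = tf.reduce_mean(
#     tf.nn.sigmoid_cross_entropy_with_logits(labels=y_, logits=y))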
# Randomly generate a simulated data set
rdm = RandomState(1)
X = rdm.rand(128,2) #data_size=128
Y = [[int(x1+x2<1)] for (x1,x2) in X]
sess = tf.Session()
init_op = tf.global_variables_initializer()
sess.run(init_op)
# Print the initial weights
print(sess.run(w1))
print(sess.run(w2))
# Set the number of training iterations
STEPS = 5000
for i in range(STEPS):
    # Select batch_size samples for each training step
    start = (i * batch_size) % 128
    end = min(start + batch_size, 128)
    sess.run(train_step, feed_dict={x: X[start:end], y_: Y[start:end]})
    if i % 1000 == 0:
        total_cross_entropy = sess.run(cross_entropy, feed_dict={x: X, y_: Y})
        print("After %d training steps, cross entropy on all data is %g" % (i, total_cross_entropy))
        print(sess.run(w1))
        print(sess.run(w2))
# Print the trained weights
print(sess.run(w1))
print(sess.run(w2))
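# Optional sanity check (not in the original post): print the network's raw
# outputs next to the true labels for the first few samples.
print(sess.run(y, feed_dict={x: X[:4]}))
print(Y[:4])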
sess.close()