Batch Gradient Descent (Python)
2017-07-21 20:48
```python
import numpy as np

def gradient_descent(x, y, theta, alpha=0.03):
    """One batch gradient-descent step on the squared-error loss."""
    m, n = x.shape  # m: number of training examples, n: number of features
    grad = np.zeros_like(theta)
    for j in range(n):
        grad[j] = np.sum([(y[i] - np.matmul(x[i, :], theta)) * x[i, j]
                          for i in range(m)])
    # update all parameters simultaneously from the same theta
    return theta + alpha / m * grad  # learning rate: 0.03

x = np.array([[1], [2], [3], [4], [5], [6]])
y = np.array([[5], [7], [9], [11], [13], [15]])

# the stopping condition
epsilon = 0.01

# add the bias feature x0 = 1 to the data
x1 = np.hstack((np.ones((6, 1)), x))
theta = np.zeros((2, 1))
print(x1.shape)

while True:
    theta = gradient_descent(x1, y, theta)
    prediction = np.matmul(x1, theta)
    loss = np.sum((prediction - y) ** 2)
    if loss < epsilon:
        break

print('prediction =', prediction.T)
print('y =', y.T)
print('loss =', loss)
```
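As a cross-check, the per-coordinate loop can be collapsed into single matrix operations: the full-batch gradient of the squared-error loss is `X.T @ (X @ theta - y) / m`. A minimal vectorized sketch (same toy data and learning rate, plain NumPy; the iteration cap is an assumption added for safety):

```python
import numpy as np

# same toy data: y = 2x + 3
x = np.array([[1.], [2.], [3.], [4.], [5.], [6.]])
y = np.array([[5.], [7.], [9.], [11.], [13.], [15.]])
X = np.hstack((np.ones((6, 1)), x))  # prepend bias column x0 = 1

theta = np.zeros((2, 1))
alpha, epsilon = 0.03, 0.01

for step in range(100_000):           # cap iterations instead of looping forever
    residual = y - X @ theta          # (m, 1) vector of prediction errors
    theta += alpha / len(X) * (X.T @ residual)  # full-batch gradient step
    if np.sum(residual ** 2) < epsilon:
        break

print('theta =', theta.ravel())  # approaches [3, 2], the true intercept and slope
```

Each step here touches every training example once, which is exactly what makes this the *batch* variant; swapping the full `X` for a single random row per step would turn it into stochastic gradient descent.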