Mini-Batch Gradient Descent
2017-12-12 01:37
1. What is Mini-Batch Gradient Descent?
Mini-batch gradient descent is an algorithm that sits between batch gradient descent and stochastic gradient descent. Concretely, each iteration uses a subset of M examples (more than one, but not all).
2. Compute Effort
The cost of one iteration depends on the batch size M: computing the gradient takes time proportional to M (times the number of parameters). In the worst case, M equals the full dataset size N and each iteration costs as much as batch gradient descent. The table below summarizes the differences among the three variants.
| Batch Gradient Descent | Mini-Batch Gradient Descent | Stochastic Gradient Descent |
|---|---|---|
| uses all N examples per iteration | uses a subset of M examples per iteration | uses 1 example per iteration |
| compute-intensive per iteration | in between | cheap per iteration |
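The three variants above differ only in how many examples feed each update. A minimal NumPy sketch (the data and `gradient_step` helper are hypothetical, for illustration):

```python
import numpy as np

def gradient_step(theta, X, y, lr=0.01):
    # One gradient descent step using only the examples in (X, y).
    M = X.shape[0]
    grad = X.T @ (X @ theta - y) / M  # (1/M) * sum of per-example gradients
    return theta - lr * grad

# hypothetical dataset: 100 examples, 2 features
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = X @ np.array([2.0, -1.0])
theta = np.zeros(2)

# Batch GD: all 100 examples per step
theta = gradient_step(theta, X, y)

# Mini-batch GD: a subset, e.g. M = 16 examples
idx = rng.choice(100, size=16, replace=False)
theta = gradient_step(theta, X[idx], y[idx])

# Stochastic GD: a single example
j = rng.integers(100)
theta = gradient_step(theta, X[j:j + 1], y[j:j + 1])
```

The update rule is identical in all three cases; only the slice of data passed to it changes.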
3. Gradient Descent Formula
For each parameter $\theta_j$:
$$\frac{\partial J(\theta)}{\partial \theta_j} = \frac{1}{M}\sum_{i=1}^{M}\left[h_\theta(x^{(i)}) - y^{(i)}\right] x_j^{(i)}$$
where $M$ is the mini-batch size.
E.g., with two parameters $\theta_0, \theta_1$, the hypothesis is $h_\theta(x) = \theta_0 + \theta_1 x_1$.
For j = 0 (with $x_0^{(i)} = 1$):
$$\frac{\partial J(\theta)}{\partial \theta_0} = \frac{1}{M}\sum_{i=1}^{M}\left[h_\theta(x^{(i)}) - y^{(i)}\right]$$
For j = 1:
$$\frac{\partial J(\theta)}{\partial \theta_1} = \frac{1}{M}\sum_{i=1}^{M}\left[h_\theta(x^{(i)}) - y^{(i)}\right] x_1^{(i)}$$
Note that the dataset needs to be shuffled before each pass, so that every mini-batch is a representative sample of the data.