
Mini-Batch Gradient Descent

2017-12-12 01:37

1. What is Mini-Batch Gradient Descent?

Mini-Batch Gradient Descent is an algorithm that sits between Batch Gradient Descent and Stochastic Gradient Descent. Concretely, each iteration uses a small subset of M examples (not one, and not all) to compute the gradient.
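The relationship among the three variants can be sketched as a single batching helper; the function name and signature below are illustrative, not from any particular library.

```python
import random

def minibatches(data, batch_size):
    """Yield successive mini-batches of `batch_size` examples.

    batch_size = 1         -> Stochastic Gradient Descent
    batch_size = len(data) -> Batch Gradient Descent
    anything in between    -> Mini-Batch Gradient Descent
    """
    random.shuffle(data)  # shuffle in place once per pass over the data
    for start in range(0, len(data), batch_size):
        yield data[start:start + batch_size]
```

With `batch_size = 3` on ten examples, this yields four mini-batches (three of size 3 and one of size 1) covering every example exactly once.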

2. Computational Cost

The per-iteration cost of this algorithm depends on the mini-batch size. It is not fixed, but in the worst case (a mini-batch containing all N examples) it matches Batch Gradient Descent: O(N²)

The table below shows the differences among these three gradient descent variants:

| | Batch Gradient Descent | Mini-Batch Gradient Descent | Stochastic Gradient Descent |
|---|---|---|---|
| examples per iteration | all N examples | some (M) examples | 1 example |
| cost per iteration | relatively expensive | somewhere in between | relatively cheap |

3. Gradient Descent Formula

For every parameter $\theta_j$:

$$\frac{\partial J(\theta)}{\partial \theta_j} = \frac{1}{M}\sum_{i=1}^{M}\left[h_\theta(x^{(i)}) - y^{(i)}\right] x_j^{(i)}$$
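This formula translates directly into code. A minimal sketch, computing the averaged gradient over one mini-batch (the function name and data layout are illustrative):

```python
def gradient(theta, batch):
    """Average gradient of the squared-error cost over one mini-batch.

    theta : parameter list [theta_0, ..., theta_n]
    batch : list of (x, y) pairs, where x = [x_0, ..., x_n] and x_0 == 1
    Returns dJ/dtheta_j for every j, i.e. (1/M) * sum_i [h(x_i) - y_i] * x_i_j.
    """
    M = len(batch)
    grad = [0.0] * len(theta)
    for x, y in batch:
        h = sum(t * xj for t, xj in zip(theta, x))  # h_theta(x), linear hypothesis
        for j in range(len(theta)):
            grad[j] += (h - y) * x[j]
    return [g / M for g in grad]
```

For a single example `x = [1, 2]`, `y = 3` and `theta = [0, 0]`, the prediction is 0, so the gradient is `[-3.0, -6.0]`.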

E.g.,

Two parameters $\theta_0, \theta_1$ give the hypothesis $h_\theta(x) = \theta_0 + \theta_1 x_1$.

For j = 0 (where $x_0^{(i)} = 1$):

$$\frac{\partial J(\theta)}{\partial \theta_0} = \frac{1}{M}\sum_{i=1}^{M}\left[h_\theta(x^{(i)}) - y^{(i)}\right] x_0^{(i)} = \frac{1}{M}\sum_{i=1}^{M}\left[h_\theta(x^{(i)}) - y^{(i)}\right]$$

For j = 1:

$$\frac{\partial J(\theta)}{\partial \theta_1} = \frac{1}{M}\sum_{i=1}^{M}\left[h_\theta(x^{(i)}) - y^{(i)}\right] x_1^{(i)}$$

Note that the dataset should be shuffled before iterating over it, so that each mini-batch is a representative sample of the data.
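Putting the two-parameter example and the shuffling step together, a minimal end-to-end sketch might look like this; the hyperparameters (learning rate, epochs, batch size, seed) are illustrative choices, not prescribed by the text.

```python
import random

def minibatch_gd(data, batch_size=2, lr=0.05, epochs=200, seed=0):
    """Fit h(x) = theta_0 + theta_1 * x with mini-batch gradient descent.

    `data` is a list of (x, y) pairs; it is shuffled in place before
    each pass, as the text recommends.
    """
    rng = random.Random(seed)
    theta0, theta1 = 0.0, 0.0
    for _ in range(epochs):
        rng.shuffle(data)                       # shuffle before each pass
        for start in range(0, len(data), batch_size):
            batch = data[start:start + batch_size]
            M = len(batch)
            # (1/M) * sum_i [h(x_i) - y_i] * x_i_j, for j = 0 and j = 1
            g0 = sum((theta0 + theta1 * x - y) for x, y in batch) / M
            g1 = sum((theta0 + theta1 * x - y) * x for x, y in batch) / M
            theta0 -= lr * g0                   # simultaneous update
            theta1 -= lr * g1
    return theta0, theta1
```

On data generated from y = 1 + 2x, the parameters converge toward $\theta_0 \approx 1$, $\theta_1 \approx 2$.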