
Brief overview of backward and forward

2015-09-03 21:04
Let's say we feed in only a single data point (x_i, y_i).

out = model:forward(xi) computes f_w(x_i), where f_w is our model with its current parameters w, and stores the result in out.

loss = criterion:forward(out, yi) computes the loss ℓ(f_w(x_i), y_i) with respect to the true target y_i.

dl_dout = criterion:backward(out, yi) computes ∂ℓ(f_w(x_i), y_i) / ∂f_w(x_i), the gradient of the loss with respect to the model's output.

model:backward(xi, dl_dout) computes ∂ℓ(f_w(x_i), y_i) / ∂w and accumulates this gradient into a buffer we keep a reference to, usually called gradParameters in our code.
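To see how these four calls fit together, here is a minimal sketch of one training step in Lua Torch. The choice of nn.Linear as the model, nn.MSECriterion as the criterion, and the learning rate are illustrative assumptions, not part of the original note; any module/criterion pair follows the same pattern.

-- Minimal sketch, assuming Torch7 with the 'nn' package installed.
require 'nn'

local model = nn.Linear(10, 1)              -- f_w: R^10 -> R (illustrative model)
local criterion = nn.MSECriterion()         -- ℓ(f_w(x_i), y_i) (illustrative criterion)

-- Flattened views of the parameters w and of their gradient buffer.
local parameters, gradParameters = model:getParameters()

local xi = torch.randn(10)                  -- one data point
local yi = torch.Tensor{1.0}                -- its true target

gradParameters:zero()                       -- clear previously accumulated gradients

local out = model:forward(xi)               -- f_w(x_i)
local loss = criterion:forward(out, yi)     -- ℓ(f_w(x_i), y_i)
local dl_dout = criterion:backward(out, yi) -- ∂ℓ / ∂f_w(x_i)
model:backward(xi, dl_dout)                 -- accumulates ∂ℓ / ∂w into gradParameters

-- A plain SGD step on w (learning rate chosen arbitrarily for this sketch).
parameters:add(-0.01, gradParameters)
print('loss on this data point:', loss)

In practice, the forward/backward pair is usually wrapped in a closure and handed to an optimizer from the optim package, but the sequence of calls is exactly the one described above.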
Tags: torch