Stochastic gradient descent


Stochastic gradient descent (often shortened to SGD), also known as incremental gradient descent, is an iterative method for optimizing a differentiable objective function. It is a stochastic approximation of gradient descent optimization: the true gradient, computed from the entire data set, is replaced at each step by an estimate computed from a single randomly selected sample (or a small batch of samples).
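As a minimal sketch of the idea, the Python code below (using NumPy; the function name, parameters, and data are illustrative, not taken from any particular library) fits a least-squares linear model by updating the parameters from one randomly chosen sample per step, instead of from the full gradient over all samples.

```python
import numpy as np

def sgd_least_squares(X, y, lr=0.01, epochs=100, seed=0):
    """Minimize (1/n) * sum_i (x_i . w - y_i)^2 by stochastic
    gradient descent: each update uses the gradient at a single
    randomly chosen sample, an unbiased estimate of the full
    gradient over the data set."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):              # one pass over the data in random order
            grad_i = 2.0 * (X[i] @ w - y[i]) * X[i]  # gradient of the i-th sample's loss
            w -= lr * grad_i                      # step against the stochastic estimate
    return w

# Usage on synthetic data: the estimate approaches the true weights.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=200)
print(sgd_least_squares(X, y))
```

Because each per-sample gradient is an unbiased estimate of the full gradient, the noisy steps still move the parameters toward a minimizer on average, at a fraction of the per-step cost of full gradient descent.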