Delta rule
In machine learning, the Delta rule is a gradient descent learning rule for updating the weights of the inputs to artificial neurons in a single-layer neural network. It is a special case of the more general backpropagation algorithm. For a neuron $j$ with activation function $g(x)$, the delta rule for $j$'s $i$-th weight $w_{ji}$ is given by

$$\Delta w_{ji} = \alpha (t_j - y_j) \, g'(h_j) \, x_i,$$

where
- $\alpha$ is a small constant called the learning rate,
- $g(x)$ is the neuron's activation function,
- $g'$ is the derivative of $g$,
- $t_j$ is the target output,
- $h_j$ is the weighted sum of the neuron's inputs,
- $y_j$ is the actual output, and
- $x_i$ is the $i$-th input.
It holds that $h_j = \sum_i x_i w_{ji}$ and $y_j = g(h_j)$.
The delta rule is commonly stated in simplified form for a neuron with a linear activation function as

$$\Delta w_{ji} = \alpha (t_j - y_j) \, x_i.$$
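As a minimal sketch, the simplified rule above can be used to train a single linear neuron on a toy regression task. The variable names (alpha, w, t, y, x) mirror the symbols in the text; the data, learning rate, and iteration count are illustrative choices, not part of the original.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy noise-free task: learn t = 1 + 2*x1 - 3*x2, with the bias
# folded in as a constant input x0 = 1 (an assumption for this sketch).
X = np.hstack([np.ones((50, 1)), rng.normal(size=(50, 2))])
true_w = np.array([1.0, 2.0, -3.0])
T = X @ true_w

alpha = 0.05            # learning rate
w = np.zeros(3)         # weights w_ji, initialised to zero

for epoch in range(200):
    for x, t in zip(X, T):
        y = w @ x                     # linear activation: y = h
        w += alpha * (t - y) * x      # delta rule: Δw_i = α (t − y) x_i

print(np.round(w, 3))   # approximately recovers true_w = [1, 2, -3]
```

Because the activation is linear, each update is an exact stochastic gradient step on the squared error, so the weights converge to the generating coefficients on this noise-free data.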
While the delta rule is similar to the perceptron's update rule, the derivation is different. The perceptron uses the Heaviside step function as the activation function $g(h)$, which means that $g'(h)$ does not exist at zero and is equal to zero elsewhere, so the delta rule cannot be applied directly.
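To make the contrast concrete, here is a hedged sketch of a single perceptron update: it has the same outer form as the simplified delta rule, but the output is thresholded through the Heaviside step function, so the update is not a gradient step. The specific input, weight, and target values are made up for illustration.

```python
import numpy as np

def heaviside(h):
    """Heaviside step function used as the perceptron's activation."""
    return 1.0 if h >= 0 else 0.0

x = np.array([1.0, 0.5, -1.0])   # inputs (bias folded in as x[0] = 1)
w = np.array([0.1, -0.2, 0.3])   # current weights (arbitrary example values)
t = 1.0                          # target class label (0 or 1)
alpha = 0.1                      # learning rate

y = heaviside(w @ x)             # thresholded output instead of y = g(h)
w = w + alpha * (t - y) * x      # perceptron rule: Δw_i = α (t − y) x_i
```

Here `w @ x` is negative, so the thresholded output is 0, the error (t − y) is 1, and every weight moves by α·x_i in one discrete jump rather than along a gradient.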