Backpropagation

Definition
Backpropagation (an abbreviation for "backward propagation of errors") is

"a common method of training artificial neural networks used in conjunction with an optimization method such as gradient descent. The method calculates the gradient of a loss function with respect to all the weights in the network."
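The definition above can be sketched in code. The example below is a minimal, hand-rolled illustration (not a reference implementation): a tiny 2-2-1 sigmoid network trained on XOR with squared-error loss. The architecture, learning rate, and epoch count are all arbitrary choices for demonstration. The backward pass applies the chain rule to compute the gradient of the loss with respect to every weight, and plain gradient descent then updates the weights.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical tiny network: 2 inputs, 2 hidden sigmoid units, 1 sigmoid output.
random.seed(0)
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b1 = [0.0, 0.0]
W2 = [random.uniform(-1, 1) for _ in range(2)]
b2 = 0.0

# XOR dataset, chosen only because a linear model cannot fit it.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

def forward(x):
    h = [sigmoid(W1[j][0] * x[0] + W1[j][1] * x[1] + b1[j]) for j in range(2)]
    y = sigmoid(W2[0] * h[0] + W2[1] * h[1] + b2)
    return h, y

def train_step(x, t, lr=0.5):
    global b2
    h, y = forward(x)
    # Backward pass: error at the output (chain rule through
    # squared loss and the output sigmoid).
    delta_out = (y - t) * y * (1 - y)
    # Propagate the error backward to each hidden unit.
    delta_hid = [delta_out * W2[j] * h[j] * (1 - h[j]) for j in range(2)]
    # Gradient-descent updates on every weight and bias.
    for j in range(2):
        W2[j] -= lr * delta_out * h[j]
        for i in range(2):
            W1[j][i] -= lr * delta_hid[j] * x[i]
        b1[j] -= lr * delta_hid[j]
    b2 -= lr * delta_out

def mean_loss():
    return sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)

loss_before = mean_loss()
for epoch in range(2000):
    for x, t in data:
        train_step(x, t)
loss_after = mean_loss()
```

After training, `loss_after` is lower than `loss_before`: the gradients computed by the backward pass point each weight in a direction that reduces the loss, which is exactly the quantity the definition says backpropagation calculates.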