One training iteration updates the weights through **backpropagation**, which can be broken down into four distinct steps:

- Forward pass

In this first step, the training data is passed through the network layer by layer, producing a prediction from the current weights.
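A forward pass through a single dense layer can be sketched as follows; the sigmoid activation and all variable names here are illustrative, not prescribed by the text:

```python
import numpy as np

def forward(x, w, b):
    """Forward pass for one dense layer: weighted sum, then sigmoid."""
    z = x @ w + b                # weighted sum of inputs plus bias
    return 1 / (1 + np.exp(-z)) # sigmoid activation squashes to (0, 1)

x = np.array([[0.5, -1.0]])     # one sample with two features
w = np.array([[0.2], [0.4]])    # layer weights
b = np.array([0.1])             # bias
out = forward(x, w, b)
print(out)                      # about 0.45
```

A deeper network simply chains such layers, feeding each layer's output into the next.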

- Loss function

The training loss is calculated with a loss function, e.g. mean squared error, which measures the difference between the predicted output and the training label. On the first iteration the loss is high; then, iteration after iteration, the neural network learns and the loss decreases.
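The mean squared error mentioned above is simple to compute directly; this minimal sketch uses made-up prediction values to show the loss shrinking as predictions approach the labels:

```python
import numpy as np

def mse_loss(y_pred, y_true):
    """Mean squared error: average squared difference between
    predictions and true labels."""
    return np.mean((y_pred - y_true) ** 2)

# Early in training, predictions are far from the labels...
early = mse_loss(np.array([0.9, 0.1]), np.array([0.0, 1.0]))  # 0.81
# ...and the loss shrinks as predictions improve.
late = mse_loss(np.array([0.1, 0.9]), np.array([0.0, 1.0]))   # 0.01
print(early, late)
```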

- Backward pass

To move toward a minimum loss, the gradient of the loss with respect to each weight is computed by applying the chain rule backwards through the network.
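For a single linear layer with MSE loss, the gradient has a closed form; a minimal sketch, using illustrative values, that checks the analytic derivative against a finite-difference approximation:

```python
import numpy as np

x = np.array([[1.0, 2.0]])       # one sample, two features
y = np.array([[1.5]])            # true label
w = np.array([[0.3], [-0.1]])    # current weights

def loss(w):
    """MSE loss of the linear prediction x @ w."""
    return np.mean((x @ w - y) ** 2)

# Backward pass via the chain rule: dL/dw = 2 * x^T (x w - y) / N
analytic = 2 * x.T @ (x @ w - y) / len(x)

# Numerical gradient via central differences, as a sanity check
eps = 1e-6
numeric = np.zeros_like(w)
for i in range(w.size):
    d = np.zeros_like(w)
    d.flat[i] = eps
    numeric.flat[i] = (loss(w + d) - loss(w - d)) / (2 * eps)

print(np.allclose(analytic, numeric, atol=1e-6))  # gradients agree
```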

- Weight update

The weights of each layer are then updated by moving them a small step in the direction opposite to their gradients, typically scaled by a learning rate (gradient descent).
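The four steps above can be sketched as one training loop; this is a minimal illustration for a single linear layer with random data, assuming plain gradient descent with a fixed learning rate (all names and values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))   # 4 samples, 3 features
y = rng.normal(size=(4, 1))   # true labels
w = rng.normal(size=(3, 1))   # layer weights
lr = 0.1                      # learning rate

losses = []
for step in range(100):
    y_pred = x @ w                           # 1. forward pass
    loss = np.mean((y_pred - y) ** 2)        # 2. loss function (MSE)
    grad = 2 * x.T @ (y_pred - y) / len(x)   # 3. backward pass: dL/dw
    w -= lr * grad                           # 4. weight update
    losses.append(loss)

print(losses[0], "->", losses[-1])  # loss decreases over the iterations
```

Running this shows the pattern the text describes: the loss starts high and falls iteration after iteration as the weights converge.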