What are Epochs, Batch Size, and Iterations?

One epoch is one complete pass of the ENTIRE dataset forward and backward through the neural network.

Since the entire dataset is usually too large to feed to the network at once, it is divided into several smaller batches of batch_size samples each.

The number of iterations is the number of batches needed to complete one epoch.

For example, a dataset of 1,000 samples can be divided into batches of 200 samples each, so it takes 5 iterations to complete 1 epoch.
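The relationship above can be sketched in plain Python. This is a minimal illustration, not a training loop from any particular framework; the names `num_samples`, `batch_size`, and `iterations_per_epoch` are chosen for this example.

```python
num_samples = 1000  # total samples in the dataset
batch_size = 200    # samples processed per forward/backward pass
num_epochs = 3      # complete passes over the dataset

# Iterations per epoch = number of batches needed to cover the dataset once.
iterations_per_epoch = num_samples // batch_size

for epoch in range(num_epochs):
    for iteration in range(iterations_per_epoch):
        start = iteration * batch_size
        end = start + batch_size
        # batch = dataset[start:end]
        # ...forward pass and backward pass on this batch would go here...
        pass
    # After iterations_per_epoch iterations, one epoch is complete.

print(iterations_per_epoch)  # 5
```

With 1,000 samples and a batch size of 200, the inner loop runs 5 times per epoch, matching the example above.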
