What is an epoch? Code examples

Example 1: Epoch vs Batch Size vs Iterations

One epoch is when the ENTIRE dataset is passed forward and backward
through the neural network exactly ONCE.

Batch size is the total number of training examples present in a
single batch.

Iterations are the number of batches needed to complete one epoch, as shown in the sketch below.
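
For example, with 2,000 training examples and a batch size of 100, one epoch takes 20 iterations. Below is a minimal Python sketch of that relationship; the numbers (dataset_size, batch_size) are hypothetical and not tied to any particular framework.

dataset_size = 2000                                 # hypothetical number of training examples
batch_size = 100                                    # examples processed in a single batch
iterations_per_epoch = dataset_size // batch_size   # batches needed to complete one epoch

print(iterations_per_epoch)                         # 20 iterations make up one epoch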

Example 2: epoch

An epoch timestamp is a way of measuring time. It is the number of
seconds that have elapsed since a fixed reference point in time, called the epoch.

Epoch start date by system:
Classic Mac OS - January 1, 1904
Windows - January 1, 1601
Unix - January 1, 1970
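
As a quick illustration, here is a small Python sketch that reads the current Unix epoch timestamp and converts it back to a human-readable UTC date; it uses only the standard-library time and datetime modules.

import time
from datetime import datetime, timezone

seconds_since_epoch = time.time()   # seconds elapsed since January 1, 1970 (the Unix epoch)
print(seconds_since_epoch)

# Convert the epoch timestamp back into a human-readable UTC datetime
print(datetime.fromtimestamp(seconds_since_epoch, tz=timezone.utc))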

Example 3: epochs

One epoch is completed when the entire dataset has been cycled forward and backward through the neural network once; in other words, the neural network has seen the whole dataset exactly once.
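
To make the idea concrete, here is a minimal, framework-free Python sketch of a training loop; dataset, train_step, and the loop sizes are hypothetical placeholders rather than a real model.

dataset = list(range(2000))   # stand-in for 2,000 training examples
batch_size = 100
num_epochs = 3

def train_step(batch):
    # placeholder for a forward pass, loss computation, and backward pass
    pass

for epoch in range(num_epochs):                       # each epoch sees the whole dataset once
    for start in range(0, len(dataset), batch_size):  # each inner step is one iteration (one batch)
        train_step(dataset[start:start + batch_size])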