Solved – Deep Learning: What happens after each epoch

Tags: deep-learning, deep-belief-networks, machine-learning, neural-networks

I am trying to understand batch size and epochs, and I found this very helpful. Each epoch is one pass over all of the data, let's say 10,000 rows, and the number of batches is the number of groups the epoch is split into. A minimizer function is then applied after each batch (let's say each batch is 100 rows), using that batch's y-values. I understand that the point is to get good weights and biases: all of the x-values are pushed through the network (which initially has random weights and biases), and the weights and biases are "optimized".
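The loop described above can be sketched in a few lines. This is a minimal illustration using plain NumPy and a linear model standing in for the network; the data, learning rate, and variable names are all hypothetical, chosen only to match the 10,000-row / 100-row-batch example:

```python
import numpy as np

# Toy data: 10,000 rows, as in the example above.
rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=10_000)

w = np.zeros(3)      # initial weights (would be random in a real network)
lr = 0.01
batch_size = 100     # 10,000 rows / 100 per batch = 100 batches per epoch

for epoch in range(5):                       # each epoch sees all the data once
    perm = rng.permutation(len(X))           # reshuffle the rows each epoch
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]
        Xb, yb = X[idx], y[idx]
        grad = 2 * Xb.T @ (Xb @ w - yb) / len(idx)  # MSE gradient on this batch
        w -= lr * grad                       # one optimizer step per batch
    # note: w is NOT reset here; the next epoch starts from these weights
```

The key point is in the last comment: nothing resets `w` between epochs, so each epoch refines the weights left by the previous one.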

What happens when one epoch is finished? Is the resulting network, which has
been set up to accept the x-values and predict the y-values, used as a starting point for the next epoch?

Best Answer

Is the resulting network, which has been set up to accept the x-values and predict the y-values, used as a starting point for the next epoch?

Yes!

Typically, you run several epochs before stopping. There are many ways to decide when to stop. Early stopping is one common method: see https://en.wikipedia.org/wiki/Early_stopping
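A minimal sketch of the early-stopping idea: keep training epoch after epoch, but stop once the validation loss has not improved for some number of epochs (`patience`). The helpers `train_one_epoch` and `val_loss` here are illustrative stand-ins, not any library's API:

```python
import numpy as np

def train_one_epoch(w):
    # stand-in for one full pass over the training data
    return w * 0.9

def val_loss(w):
    # stand-in for evaluating on a held-out validation set
    return float(np.sum(w ** 2))

w = np.ones(3)
best_loss, best_w = np.inf, w.copy()
patience, bad_epochs = 3, 0

for epoch in range(100):
    w = train_one_epoch(w)          # weights carry over between epochs
    loss = val_loss(w)
    if loss < best_loss - 1e-6:     # improvement: remember weights, reset counter
        best_loss, best_w, bad_epochs = loss, w.copy(), 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:  # no improvement for `patience` epochs: stop
            break

# best_w holds the weights from the best validation epoch
```

Keeping `best_w` (rather than the final `w`) is the usual choice, since the last few epochs before stopping were, by definition, not improving.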