Solved – What are the differences between ‘epoch’, ‘batch’, and ‘minibatch’

machine-learning, terminology

As far as I know, when using stochastic gradient descent as the learning algorithm,
some people use 'epoch' for a full pass over the dataset and 'batch' for the data used in a single update step, while others use 'batch' and 'minibatch' for those two things, and still others use 'epoch' and 'minibatch'. This causes a lot of confusion in discussions.

So which usage is correct? Or are they just dialects that are all acceptable?

Best Answer

  • Epoch means one full pass over the training set.
  • Batch means that you use all of your data to compute the gradient in one iteration.
  • Mini-batch means you use only a subset of the data in one iteration (see the sketch below).
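To make the three terms concrete, here is a minimal sketch of mini-batch SGD on a toy linear-regression problem. The data, learning rate, batch size, and loss are illustrative assumptions, not part of the original answer; the point is only where "epoch", "mini-batch", and (full) "batch" appear in the loop.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-regression data (purely illustrative)
X = rng.normal(size=(1000, 5))
true_w = rng.normal(size=5)
y = X @ true_w + 0.1 * rng.normal(size=1000)

w = np.zeros(5)
lr = 0.01
batch_size = 32          # size of one mini-batch
n_epochs = 10            # number of full passes over the data

for epoch in range(n_epochs):                  # one epoch = one full pass over X, y
    perm = rng.permutation(len(X))             # reshuffle the data each epoch
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]   # indices of one mini-batch
        Xb, yb = X[idx], y[idx]
        # Gradient of mean squared error computed on this mini-batch only
        grad = 2 * Xb.T @ (Xb @ w - yb) / len(idx)
        w -= lr * grad                         # one update step per mini-batch

# (Full-)batch gradient descent would instead use all of X, y in every update:
# grad = 2 * X.T @ (X @ w - y) / len(X)
```

With batch_size equal to the dataset size, the inner loop runs once per epoch and you recover batch gradient descent; with batch_size = 1 you get the classic one-example-at-a-time form of SGD.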