As far as I know, when adopting Stochastic Gradient Descent as the learning algorithm,
some people use 'epoch' for a full pass over the dataset and 'batch' for the data used in a single update step, while others use 'batch' and 'minibatch' respectively, and still others use 'epoch' and 'minibatch'. This causes a lot of confusion in discussions.
So which usage is correct? Or are these just dialects that are all acceptable?
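To make the terminology concrete, here is a minimal sketch of minibatch SGD for linear regression (the names `X`, `y`, `w`, `batch_size` are hypothetical, chosen only to show where each term applies):

```python
import numpy as np

# Toy linear-regression data; all names here are illustrative only.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))          # full dataset: 1000 examples
y = X @ np.array([1., 2., 3., 4., 5.]) + rng.normal(scale=0.1, size=1000)

w = np.zeros(5)                          # model parameters
lr = 0.1                                 # learning rate
batch_size = 32                          # examples used per update step

for epoch in range(10):                  # one 'epoch' = one full pass over the dataset
    perm = rng.permutation(len(X))       # reshuffle each epoch
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]
        Xb, yb = X[idx], y[idx]          # one 'minibatch' (or 'batch'): data for one update
        grad = 2 * Xb.T @ (Xb @ w - yb) / len(Xb)  # gradient of mean squared error
        w -= lr * grad                   # a single SGD update step
```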
Best Answer