Solved – Learning curve vs training (loss) curve

loss-functions, machine-learning, training-error

In machine learning, there are two commonly used plots to identify overfitting.

One is the learning curve, which plots the training and test error (y-axis) against the training set size (x-axis).

The other is the training (loss/error) curve, which plots the training and test error (y-axis) against the number of iterations/epochs for a single model (x-axis).

Why do we need both curves? Specifically, what does a learning curve tell us that a training curve does not? (If we just want to detect whether a model overfits, the training curve seems far more efficient to plot.)

Best Answer

The learning curve gives you an idea of how the model benefits from being incrementally fed more and more data observations. It focuses on inputs external to the model, quantifying the marginal benefit of each additional data point.
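For concreteness, here is a minimal sketch of a learning curve using scikit-learn's learning_curve helper; the synthetic dataset and logistic-regression estimator are illustrative placeholders, not part of the original question:

```python
# Sketch of a learning curve: error vs. training set size.
# The data and estimator below are illustrative, not prescriptive.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import learning_curve

X, y = make_classification(n_samples=2000, random_state=0)  # toy data

# Cross-validated train/test scores at increasing training-set sizes.
sizes, train_scores, test_scores = learning_curve(
    LogisticRegression(max_iter=1000), X, y,
    train_sizes=np.linspace(0.1, 1.0, 8), cv=5, scoring="accuracy",
)

# Plot error (1 - accuracy) for both curves against training set size.
plt.plot(sizes, 1 - train_scores.mean(axis=1), label="training error")
plt.plot(sizes, 1 - test_scores.mean(axis=1), label="test (CV) error")
plt.xlabel("training set size")
plt.ylabel("error")
plt.legend()
plt.show()
```

A persistent gap between the two curves at the largest training sizes suggests overfitting, while two high curves that have converged suggest more data alone will not help.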

The training curve gives you an idea of how the model benefits from having its parameters calibrated as the algorithm cycles through the data from start to finish repeatedly, typically with the number of data observations held fixed. It focuses on processes internal to the model, such as managing the bias-variance trade-off over the course of training.
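And a matching sketch of a training curve for one fixed model, tracking training and held-out loss per epoch; again, the synthetic dataset and SGD classifier are illustrative assumptions:

```python
# Sketch of a training curve: loss vs. epoch for one fixed model.
# The data and estimator below are illustrative, not prescriptive.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.metrics import log_loss
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = SGDClassifier(loss="log_loss", random_state=0)
train_err, test_err = [], []
for epoch in range(50):  # one pass over the same fixed data per epoch
    clf.partial_fit(X_tr, y_tr, classes=np.unique(y))
    train_err.append(log_loss(y_tr, clf.predict_proba(X_tr)))
    test_err.append(log_loss(y_te, clf.predict_proba(X_te)))

plt.plot(train_err, label="training loss")
plt.plot(test_err, label="test loss")
plt.xlabel("epoch")
plt.ylabel("log loss")
plt.legend()
plt.show()
```

Here the x-axis is epochs rather than sample size: training loss falling while test loss rises signals overfitting during optimization, which is exactly what the training curve detects and the learning curve does not.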