I can't seem to think of a reason why training error increases in learning curves as the number of samples increases. Would someone please explain?
Solved – Why does training error increase in learning curves
machine-learning, model-evaluation, overfitting
Best Answer
Because it is harder for a model of fixed complexity to fit (or memorize) a bigger training set. With very few samples, a flexible model can nearly interpolate every point, so training error starts close to zero. As more samples are added, the model can no longer accommodate each point individually and must compromise, so training error rises toward the irreducible error, while validation error typically falls toward that same level from above.
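A minimal sketch of this effect (my own illustration, not from the original answer): fit a fixed-capacity cubic polynomial to growing samples of noisy sine data. With only 4 points, a cubic (4 coefficients) interpolates them exactly, so training MSE is near zero; with many points it cannot, and training MSE climbs toward the noise level.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_mse(n_samples, degree=3):
    # Noisy sine data; a degree-3 polynomial has fixed capacity.
    x = rng.uniform(0.0, 2.0 * np.pi, n_samples)
    y = np.sin(x) + rng.normal(0.0, 0.3, n_samples)
    # Fit the polynomial and measure error on the SAME training data.
    coefs = np.polyfit(x, y, degree)
    pred = np.polyval(coefs, x)
    return float(np.mean((y - pred) ** 2))

# Training error grows with the training-set size at fixed complexity.
for n in [4, 10, 100, 1000]:
    print(n, train_mse(n))
```

With 4 samples the cubic passes through every point (MSE ≈ 0); with 1000 samples the training MSE approaches the noise variance (0.09) plus the cubic's bias in approximating a full sine period. Plotting these values against `n` reproduces the rising training curve in a standard learning-curve diagram.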