Solved – Mean absolute deviation

regression

Wikipedia states:

The mean absolute error (MAE) is a common measure of forecast error in time
series analysis, where the term "mean absolute deviation" is
sometimes used in confusion with the more standard definition of mean
absolute deviation. The same confusion exists more generally.

What does that mean? What exactly is the confusion?

Also, why is MAE used in time series analysis specifically, as opposed to more general measures of error such as MSE?

Best Answer

One of the reasons MAE is used in time series or forecasting is that non-scientists find it easy to understand. So if you tell your client the MAE is 1.5 units, for example, he or she can interpret that as the average amount by which the forecast is off (in absolute units). But if you tell them the MSE, you may well get a blank look, because the MSE is in squared units and has no such direct interpretation.
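For concreteness, here is a minimal sketch (with made-up numbers) showing how the two measures read on the same toy forecast: the MAE comes out in the original units of the data, while the MSE comes out in squared units.

```python
import numpy as np

# Hypothetical example: actual demand vs. forecast, in units sold.
actual   = np.array([102.0,  98.0, 110.0, 105.0,  99.0])
forecast = np.array([100.0, 101.0, 107.0, 109.0,  97.0])

errors = actual - forecast

mae = np.mean(np.abs(errors))   # mean absolute error, in the same units as the data
mse = np.mean(errors ** 2)      # mean squared error, in squared units

print(f"MAE: {mae:.2f} units")          # "on average the forecast is off by about this many units"
print(f"MSE: {mse:.2f} squared units")  # harder to explain to a non-technical audience
```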

I'm not sure what causes the confusion between MAE and mean absolute deviation, but I'd attribute it to a lack of clear definitions or explanations in the specific contexts where the terms are used.
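For what it's worth, the two quantities also differ in what they are deviations from: MAE averages absolute forecast errors (forecast vs. observed), while mean absolute deviation in its standard sense averages absolute deviations of a data set from its own mean (or median), i.e. it measures dispersion rather than forecast accuracy. A minimal sketch, again with made-up numbers:

```python
import numpy as np

# Hypothetical series and forecasts, continuing the example above.
actual   = np.array([102.0,  98.0, 110.0, 105.0,  99.0])
forecast = np.array([100.0, 101.0, 107.0, 109.0,  97.0])

# MAE: average absolute *forecast error* -- deviations of forecasts from actuals.
mae = np.mean(np.abs(actual - forecast))

# Mean absolute deviation (standard sense): average absolute deviation of the
# data from its own mean -- a measure of dispersion, with no forecast involved.
mad_about_mean = np.mean(np.abs(actual - actual.mean()))

print(f"MAE (forecast accuracy):         {mae:.2f}")
print(f"MAD about the mean (dispersion): {mad_about_mean:.2f}")
```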