Solved – the point of Root Mean Absolute Error, RMAE, when evaluating forecasting errors

Tags: error, forecasting, mae, rms

RMAE is defined as the square root of the Mean Absolute Error (MAE). Presumably this is by analogy to Root Mean Square Error (RMSE) being defined as the square root of Mean Square Error (MSE).

But what is the purpose of it, and why would anyone use it? Does it have some particularly useful interpretation or properties in certain circumstances?

It seems to me that, as a monotonic transformation of MAE, it only tells us what we could have learned from looking at the MAE anyway (e.g. "smaller RMAE indicates smaller forecast errors").

Using the square root function to transform MSE to RMSE has certain benefits, e.g. it is now on the same scale as the original data and may have more meaningful units (e.g. dollars rather than dollars squared). But I can't see a corresponding benefit to transforming MAE to RMAE, since MAE was already on the scale of the original data (and I wouldn't accept payment in root-dollars either).

RMAE doesn't seem to be a common forecast error metric — I only came across it recently in some lecture slides, and it has few search engine hits. Nevertheless it does appear to be A Thing. But does it have a purpose and does anybody ever use it, other than as yet another entry in lists of error metrics? (I hypothesise that some writers, having put down MSE, RMSE and MAE as the three main methods, feel the need to complete the quartet by analogy.)

Best Answer

I think this is a misunderstanding: AFAIK, rMAE stands for "relative Mean Absolute Error", not "root Mean Absolute Error", and as a result it is unitless (no dollars, etc.).

That makes it useful for comparing models that were tested on completely different datasets (with different units, scales, etc.), where raw MAE values would not be comparable.

See this link for more information: http://www.gepsoft.com/gxpt4kb/Chapter09/Section1/SS03/SSS5.htm
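One common way to define a relative MAE is to divide the MAE of the forecast by the MAE of a benchmark forecast, often the naive "last observed value" forecast; values below 1 then mean the forecast beats the benchmark. The sketch below illustrates that convention (the function name `rmae` and the naive benchmark are illustrative assumptions, not a definition from the linked page):

```python
import numpy as np

def rmae(actual, forecast, benchmark):
    """Relative MAE: MAE of the forecast divided by the MAE of a
    benchmark forecast. Unitless, so it is comparable across
    datasets measured in different units or on different scales."""
    actual = np.asarray(actual, dtype=float)
    mae_forecast = np.mean(np.abs(actual - np.asarray(forecast, dtype=float)))
    mae_benchmark = np.mean(np.abs(actual - np.asarray(benchmark, dtype=float)))
    return mae_forecast / mae_benchmark

# Toy example: forecasts for t = 2..5, benchmarked against the
# naive forecast (each period predicted by the previous observation).
series = np.array([10.0, 12.0, 11.0, 13.0, 14.0])
forecast = np.array([11.0, 11.5, 12.0, 13.5])
naive = series[:-1]  # previous observation as the naive forecast
print(rmae(series[1:], forecast, naive))  # < 1 means better than naive
```

Here the forecast's MAE is 0.75 versus 1.5 for the naive benchmark, so the ratio is 0.5: the forecast halves the naive error, regardless of whether the series is in dollars or megawatts.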