Solved – Mean absolute error vs sum absolute error

Tags: MATLAB, neural networks

I'm using the MATLAB Neural Network Toolbox for a regression problem (twelve inputs, one target). My target has a high dynamic range and is strongly non-Gaussian. So far I've handled this with a log-transformation, but as an alternative I'm exploring whether a different error function than the standard mean squared error would be of use.
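As an aside, the log-transform workaround can be sketched as follows (a minimal illustration in Python/NumPy rather than MATLAB, with made-up target values; the choice of natural log is an assumption):

```python
import numpy as np

# Hypothetical sketch: compress a high-dynamic-range target before
# training, then invert the transform on the network's predictions.
t = np.array([2.0, 50.0, 1.0e3, 4.0e6])  # made-up target spanning many orders of magnitude
t_log = np.log(t)                         # train the network against this instead of t
# ... network is trained and predicts in log space ...
t_hat = np.exp(t_log)                     # invert the transform on the predictions
assert np.allclose(t_hat, t)
```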

The MATLAB Neural Network Toolbox comes with four built-in "performance" functions:

>> help nnperformance
  Neural Network Toolbox Performance Functions.

    mae - Mean absolute error performance function.
    mse - Mean squared error performance function.
    sae - Sum absolute error performance function.
    sse - Sum squared error performance function.

In my understanding, such a performance function is a cost function. But since a cost function is simply something to be minimised, what is the difference in practice between mean absolute error and sum absolute error? Similarly, what is the difference between mean squared error and sum squared error? Shouldn't those be the same?

The documentation for the different functions is of no use; in fact, the documentation for mse and sse is identical except for one word.

Best Answer

You are correct, the minimizers are the same.

However, mean values can be easier to interpret because they are on a per-sample scale, while the sum is related to the complete log-likelihood and can thus serve for comparison with other models that use a non-uniform prior on your parameters. You can also use it in cross-model measures such as AIC or BIC.
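To make the first point concrete, here is a minimal sketch (in Python/NumPy rather than MATLAB, with made-up data and a single hypothetical weight `w`) showing that the mean and sum variants differ only by the constant factor 1/N, so the minimising `w` is identical:

```python
import numpy as np

# Made-up regression data: target t depends on input x with slope 3.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
t = 3.0 * x + rng.normal(scale=0.5, size=200)

ws = np.linspace(0.0, 6.0, 601)                  # candidate weights on a grid
errors = t[None, :] - ws[:, None] * x[None, :]   # residuals for every candidate w

sse = np.sum(errors**2, axis=1)    # sum squared error
mse = np.mean(errors**2, axis=1)   # mean squared error = sse / N
sae = np.sum(np.abs(errors), axis=1)
mae = np.mean(np.abs(errors), axis=1)

# The mean variants are the sum variants divided by N, a positive
# constant, so the location of the minimum cannot change.
assert np.argmin(sse) == np.argmin(mse)
assert np.argmin(sae) == np.argmin(mae)
assert np.allclose(mse * x.size, sse)
```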