[Math] How to take the average for percent error

percentages

I have roughly 39 data points, each of which is a % error value. In my reading I have found claims that it is not correct to simply take the average of the % errors. Is this true? If so, what is the correct way to average % error values? All of the data points are weighted equally. Thanks!

Best Answer

You certainly can average the percent error values; that is a well-defined operation. As Dilbert says, you can multiply them, too. Whether the result expresses what you want is the subtle part. You are probably remembering the fact that averaging the individual percentage errors will not, in general, give the same result as dividing the average error by the average true value. If you state carefully what quantity you want to express, that will lead you to the correct way to compute it.
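The distinction can be made concrete with a short sketch (hypothetical data, using simple absolute percent error):

```python
true_vals = [10.0, 50.0, 200.0]  # hypothetical true values
measured = [11.0, 48.0, 210.0]   # hypothetical measurements

# Option 1: average the per-point percent errors.
pct_errors = [abs(m - t) / t * 100 for t, m in zip(true_vals, measured)]
mean_pct_error = sum(pct_errors) / len(pct_errors)

# Option 2: divide the average absolute error by the average true value.
avg_abs_error = sum(abs(m - t) for t, m in zip(true_vals, measured)) / len(true_vals)
avg_true = sum(true_vals) / len(true_vals)
aggregate_pct = avg_abs_error / avg_true * 100

print(round(mean_pct_error, 2))  # 6.33
print(round(aggregate_pct, 2))   # 5.0
```

Option 1 treats every data point's relative error as equally important; Option 2 lets the points with large true values dominate. Neither is "the" average percent error until you decide which quantity you actually care about.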