Average – Sum of Averages vs Average of Sums

I essentially have a table of numbers: a time series of measurements. Each row in the table has 5 values for the 5 different categories, plus a sum column holding the total of all categories for that row.

If I take the average of each column and sum the averages together, should it equal the average of the rows' sums (ignoring rounding error, of course)?

(I've got a case where the two values keep coming out different by about 30% and I'm wondering just how crazy I am.)

Update: See below — I was (slightly) crazy and had an error in my code.

Best Answer

The average of the entries in a column is the sum of the entries in that column, divided by the number of entries, and the number of entries is the number of rows. So the sum of the column averages is the sum of all the entries in the table, divided by the number of rows.
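In symbols (a sketch, writing $x_{ij}$ for the entry in row $i$ and column $j$ of an $m$-row, $n$-column table; the names are just for illustration):

$$\sum_{j=1}^{n}\left(\frac{1}{m}\sum_{i=1}^{m} x_{ij}\right)=\frac{1}{m}\sum_{i=1}^{m}\sum_{j=1}^{n} x_{ij}=\frac{1}{m}\sum_{i=1}^{m}\left(\sum_{j=1}^{n} x_{ij}\right)$$

The left-hand side is the sum of the column averages; the right-hand side is the average of the row sums.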

The average of the row sums is likewise the sum of all entries in the table divided by the number of rows, so you should get the same number either way.
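As a quick sanity check, here is a minimal Python sketch; the 5-category table and the random data are hypothetical, just to mirror the setup in the question:

```python
import math
import random

# Hypothetical table mirroring the question: 1000 rows, 5 category columns.
rows = [[random.uniform(0, 100) for _ in range(5)] for _ in range(1000)]
n_rows = len(rows)

# Sum of the per-column averages.
column_averages = [sum(col) / n_rows for col in zip(*rows)]
sum_of_averages = sum(column_averages)

# Average of the per-row sums.
row_sums = [sum(row) for row in rows]
average_of_sums = sum(row_sums) / n_rows

# The two agree up to floating-point rounding error.
assert math.isclose(sum_of_averages, average_of_sums)
print(sum_of_averages, average_of_sums)
```

If the two values differ by 30%, the discrepancy is in the bookkeeping (e.g. a column left out of one total, or rows counted inconsistently), not in the math.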