I have a monthly average for a value and a standard deviation corresponding to that average. I am now computing the annual average as the sum of the monthly averages; how can I represent the standard deviation for that summed average?
For example considering output from a wind farm:
| Month     | MWh  | StdDev |
|-----------|------|--------|
| January   | 927  | 333    |
| February  | 1234 | 250    |
| March     | 1032 | 301    |
| April     | 876  | 204    |
| May       | 865  | 165    |
| June      | 750  | 263    |
| July      | 780  | 280    |
| August    | 690  | 98     |
| September | 730  | 76     |
| October   | 821  | 240    |
| November  | 803  | 178    |
| December  | 850  | 250    |
We can say that in the average year the wind farm produces 10,358 MWh, but what is the standard deviation corresponding to this figure?
Best Answer
Short answer: you average the variances; then you can take the square root to get the average standard deviation.

Example

The average of the twelve monthly variances is

(333² + 250² + 301² + … + 250²) / 12 = 647,564 / 12 ≈ 53,964

and then the average standard deviation is

sqrt(53,964) ≈ 232
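A minimal Python sketch of this averaging, using the StdDev column from the table in the question:

```python
import math

# Monthly standard deviations from the table in the question
std_devs = [333, 250, 301, 204, 165, 263, 280, 98, 76, 240, 178, 250]

# Average the variances (squared standard deviations), not the std devs themselves
avg_variance = sum(s ** 2 for s in std_devs) / len(std_devs)
avg_std_dev = math.sqrt(avg_variance)

print(round(avg_variance, 1))  # 53963.7
print(round(avg_std_dev, 1))   # 232.3
```

Note that averaging the raw standard deviations directly would give a different (and incorrect) number; the averaging has to happen on the variance scale.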
From Sum of normally distributed random variables: if X ~ N(μ_X, σ_X²) and Y ~ N(μ_Y, σ_Y²) are independent, then their sum is also normal, X + Y ~ N(μ_X + μ_Y, σ_X² + σ_Y²).

And from Wolfram Alpha's Normal Sum Distribution: the sum of n independent normal variables has mean μ = Σ μ_i and variance σ² = Σ σ_i².

For your data:

- sum of the monthly means: 10,358 MWh
- sum of the monthly variances: 647,564
- standard deviation of the annual total: sqrt(647,564) ≈ 804.71 MWh
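These sums can be verified in a few lines of Python, using both columns of the table in the question:

```python
import math

# MWh and StdDev columns from the table in the question, January..December
mwh      = [927, 1234, 1032, 876, 865, 750, 780, 690, 730, 821, 803, 850]
std_devs = [333, 250, 301, 204, 165, 263, 280, 98, 76, 240, 178, 250]

annual_mean = sum(mwh)                          # means add directly
annual_variance = sum(s ** 2 for s in std_devs) # variances add (assuming independent months)
annual_std = math.sqrt(annual_variance)

print(annual_mean)           # 10358
print(annual_variance)       # 647564
print(round(annual_std, 2))  # 804.71
```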
So to answer your question: you sum them quadratically,

s = sqrt(s₁² + s₂² + … + s₁₂²)

Conceptually, you sum the variances and then take the square root to get the standard deviation. (This assumes the monthly outputs are independent; if they are correlated, covariance terms also enter the sum.)
Because I was curious, I also wanted to know the average monthly mean power and its standard deviation. Working backwards, we need 12 identical normal distributions which sum to a distribution with:

- mean: 10,358
- variance: 647,564

That would be 12 average monthly distributions, each with:

- mean: 10,358 / 12 ≈ 863.17
- variance: 647,564 / 12 ≈ 53,963.7
- standard deviation: sqrt(53,963.7) ≈ 232.3
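The same split into 12 identical monthly distributions, sketched in Python from the annual totals above:

```python
import math

annual_mean = 10358      # sum of the monthly means
annual_variance = 647564 # sum of the monthly variances

# Dividing a normal sum into 12 identical components divides
# both the mean and the variance by 12
monthly_mean = annual_mean / 12
monthly_variance = annual_variance / 12
monthly_std = math.sqrt(monthly_variance)

print(round(monthly_mean, 2))      # 863.17
print(round(monthly_variance, 1))  # 53963.7
print(round(monthly_std, 1))       # 232.3
```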
We can check our monthly average distributions by adding them up 12 times, to see that they equal the yearly distribution:

863.17 * 12 ≈ 10,358 (correct)
53,963.7 * 12 ≈ 647,564 (correct)

Edit: I moved the short, to-the-point answer up top, because I needed to do this again today but wanted to double-check that I average the variances.