[Math] The variance in the average of a set of normally distributed random variables

Tags: normal distribution, probability, random

I have a set of $M$ normally distributed random variables $r_i$, each with an associated mean $u_i$ but the same variance $\sigma^2$. What is the variance of the average of these $M$ random variables, $\frac{1}{M}\sum_{i=1}^{M} r_i$? How does the variance change as $M$ increases? What if the $M$ variables have a uniform, rather than a normal, distribution over some interval $[A, B]$?

Best Answer

Assuming the $M$ variables are independent, their average is normally distributed with mean equal to the average of the $u_i$, as you guessed, and variance $\sigma^2/M$. The same mean and variance formulas hold in the uniform case (with $\sigma^2 = (B-A)^2/12$ for a uniform on $[A, B]$), but the average itself is no longer uniform or normal; its distribution is supported on $[A, B]$. If all the uniforms are defined over the same interval $[A, B]$, they are i.i.d., and the appropriately normalized average converges in distribution to a normal by the central limit theorem.
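As a quick check of the $\sigma^2/M$ claim, here is the standard one-line derivation. It assumes only that the $r_i$ are independent with common variance $\sigma^2$ (normality is not needed), so it covers the uniform case as well once $\sigma^2 = (B-A)^2/12$ is substituted:

$$\operatorname{Var}\!\left(\frac{1}{M}\sum_{i=1}^{M} r_i\right) = \frac{1}{M^2}\operatorname{Var}\!\left(\sum_{i=1}^{M} r_i\right) = \frac{1}{M^2}\sum_{i=1}^{M}\operatorname{Var}(r_i) = \frac{M\sigma^2}{M^2} = \frac{\sigma^2}{M}.$$

In particular, the variance of the average shrinks like $1/M$ as $M$ increases, regardless of the underlying distribution.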
