[Math] Variance of sample mean.

probability, statistics

Let $X_1, X_2, \ldots, X_n$ be independent random variables, each with mean $\mu$ and variance $\sigma^2$.
We define the sample mean $\bar{X} = \frac{1}{n} \sum_{i = 1}^n X_i$.

I am having trouble understanding why the variance of $\bar{X}$ is $\sigma^2 / n$.

Here is what I worked out thus far:

$$\begin{gather}
\sigma_{\bar{X}}^2 = E((\bar{X} - \mu)^2) = E(\bar{X}^2 - 2\bar{X}\mu + \mu^2) = E(\bar{X}^2) - 2\mu E(\bar{X}) + \mu^2 = E(\bar{X}^2) - \mu^2,
\end{gather}$$

using $E(\bar{X}) = \mu$.

However, I'm not sure what to do with the $E(\bar{X}^2)$ term.
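
For what it's worth, the direct route from here is to expand $\bar{X}^2$ as a double sum; a sketch, assuming the $X_i$ are independent as above, so that $E(X_i X_j) = E(X_i)E(X_j) = \mu^2$ for $i \neq j$:

$$\begin{gather}
E(\bar{X}^2) = \frac{1}{n^2} \sum_{i=1}^n \sum_{j=1}^n E(X_i X_j) = \frac{1}{n^2}\left( n(\sigma^2 + \mu^2) + n(n-1)\mu^2 \right) = \frac{\sigma^2}{n} + \mu^2,
\end{gather}$$

which gives $\sigma_{\bar{X}}^2 = E(\bar{X}^2) - \mu^2 = \sigma^2 / n$.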

Best Answer

You're making it too hard. Brief outline:

$$\operatorname{Var}(\bar X) = \frac{1}{n^2} \operatorname{Var}\left(\sum_i X_i \right) = \frac{1}{n^2}\sum_i \operatorname{Var}(X_i) = \frac{1}{n^2}(n\sigma^2) = \frac{\sigma^2}{n}.$$
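
To justify the $=$-signs, as a sketch: the first equality pulls the constant $\frac{1}{n}$ out of the variance, using

$$\operatorname{Var}(aY) = a^2 \operatorname{Var}(Y) \quad \text{with } a = \tfrac{1}{n},\ Y = \textstyle\sum_i X_i,$$

and the second uses the fact that the variance of a sum of independent (hence uncorrelated) random variables is the sum of their variances. The remaining two equalities just apply $\operatorname{Var}(X_i) = \sigma^2$ for each of the $n$ terms and simplify.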

Please leave a comment if anything is still unclear.
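
A quick empirical sanity check, as a sketch in Python with NumPy (the choice of standard normal $X_i$ and the sample sizes here are just illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

n = 10            # sample size
trials = 100_000  # number of simulated sample means
sigma2 = 1.0      # variance of each X_i (standard normal here)

# Draw `trials` independent samples of size n and take each sample's mean.
samples = rng.standard_normal((trials, n))
sample_means = samples.mean(axis=1)

# The empirical variance of the sample means should be close to sigma^2 / n.
print("empirical Var of sample mean:", sample_means.var())
print("theoretical sigma^2 / n:     ", sigma2 / n)
```

Running this prints an empirical variance close to $0.1$, matching $\sigma^2 / n = 1/10$.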