[Math] Sum of independent normal random variables

normal distribution, probability, probability distributions

Suppose $X_1, \dots, X_n$ are independent normal random variables with the same mean $\mu$ and standard deviation $\sigma$. Show that (1) $S = X_1 + \cdots + X_n$ is also a normal random variable, and (2) find its mean and standard deviation.

Since the variables are independent and have the same mean and standard deviation (i.e., they are i.i.d.), we can use the normal approximation based on the Central Limit Theorem.

(1) I am having difficulty showing that $S$ is normally distributed, although the theorem states that it will be.

(2) Using the theorem, the mean of $S$ is $n\mu$ and the variance of $S$ is $n\sigma^2$.
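
For (2), this follows from linearity of expectation and independence:

$$E(S) = \sum_{i=1}^n E(X_i) = n\mu, \qquad Var(S) = \sum_{i=1}^n Var(X_i) = n\sigma^2,$$

so the standard deviation of $S$ is $\sigma\sqrt{n}$.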

Best Answer

If $X_1, X_2, \dots, X_n$ are independent and identically distributed as $\mathsf{Norm}(\mu, \sigma),$ then $S = \sum_{i=1}^n X_i \sim \mathsf{Norm}(n\mu, \sigma\sqrt{n})$ and $\bar X_n = S/n \sim \mathsf{Norm}(\mu, \sigma/\sqrt{n}).$

The statements about $E(S), Var(S), E(\bar X_n),$ and $Var(\bar X_n)$ follow readily from the definitions of expectation and variance. That $S$ and $\bar X_n$ are normal can be shown using moment generating functions. These relationships are not technically part of the Central Limit Theorem (CLT) but they are usually stated or proved when the CLT is discussed.
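
A sketch of the moment generating function argument, using the normal MGF $M_{X_i}(t) = \exp\!\left(\mu t + \tfrac12 \sigma^2 t^2\right)$ and independence:

$$M_S(t) = \prod_{i=1}^n M_{X_i}(t) = \exp\!\left(n\mu t + \tfrac12 n\sigma^2 t^2\right),$$

which is the MGF of $\mathsf{Norm}(n\mu, \sigma\sqrt{n})$; by uniqueness of MGFs, $S \sim \mathsf{Norm}(n\mu, \sigma\sqrt{n})$, and rescaling by $1/n$ gives $\bar X_n \sim \mathsf{Norm}(\mu, \sigma/\sqrt{n}).$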

The CLT is a limit theorem; it states that if the distribution of the $X_i$ has finite variance $Var(X_i) = \sigma^2$ (but is not necessarily normal), then $Z_n = \frac{\bar X_n - \mu}{\sigma/\sqrt{n}}$ converges in distribution to $\mathsf{Norm}(0,1).$
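
A minimal simulation sketch of this convergence in Python (the exponential distribution, sample size $n = 50$, and number of replications are illustrative choices, not part of the question):

```python
import numpy as np

# Illustrative parameters (my own choices): Exp(1) has mean 1 and sd 1.
rng = np.random.default_rng(seed=2023)
n, reps = 50, 100_000          # sample size and number of replications
mu, sigma = 1.0, 1.0           # mean and standard deviation of Exp(1)

# Draw reps samples of size n from a non-normal distribution (exponential).
X = rng.exponential(scale=1.0, size=(reps, n))

# Standardized sample means: Z_n = (Xbar_n - mu) / (sigma / sqrt(n))
Z = (X.mean(axis=1) - mu) / (sigma / np.sqrt(n))

print(Z.mean())                # close to 0
print(Z.std(ddof=1))           # close to 1
# A histogram of Z is approximately bell-shaped, consistent with Norm(0, 1).
```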
