Sum of i.i.d. normal random variables vs. a constant multiplied by a single normal random variable

normal-distribution, probability, probability-distributions, statistics

https://online.stat.psu.edu/stat414/lesson/26/26.1

Here, the variance of a linear combination of $n$ independent normal random variables is shown to be the sum of the squared coefficients multiplied by the corresponding variances.

E.g., if $Y=\sum_{i=1}^n c_iX_i$ with the $X_i$ independent and $X_i \sim N(\mu_i,\sigma_i^2)$, then $Y \sim N\left(\sum_{i=1}^n c_i\mu_i,\,\sum_{i=1}^n c_i^2\sigma_i^2\right)$
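This can be checked by simulation. The sketch below (not from the linked lesson; coefficients and parameters are chosen arbitrarily for illustration) draws many samples of a linear combination of three independent normals and compares the empirical mean and variance with $\sum c_i\mu_i$ and $\sum c_i^2\sigma_i^2$:

```python
# Monte Carlo sanity check of the linear-combination formula for
# independent normals: E[Y] = sum(c_i * mu_i), Var(Y) = sum(c_i^2 * sigma_i^2).
import numpy as np

rng = np.random.default_rng(0)
c = np.array([2.0, -1.0, 0.5])      # arbitrary coefficients (illustrative)
mu = np.array([1.0, 3.0, -2.0])     # means mu_i
sigma = np.array([1.0, 2.0, 0.5])   # standard deviations sigma_i

n_draws = 1_000_000
# Each row is one independent draw of (X_1, X_2, X_3).
X = rng.normal(mu, sigma, size=(n_draws, 3))
Y = X @ c

print(Y.mean())  # ≈ sum(c_i * mu_i)      = 2*1 + (-1)*3 + 0.5*(-2) = -2.0
print(Y.var())   # ≈ sum(c_i^2 sigma_i^2) = 4*1 + 1*4 + 0.25*0.25   = 8.0625
```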

This is proved by multiplying the moment generating functions of the independent normal random variables.

However, several sources state that multiplying a random variable by a constant multiplies its variance by the square of that constant.

Multiplication of a random variable by a constant

I am having trouble understanding how this could be. Why is the distribution not

$Y=nX=\sum_{i=1}^n X\sim N\left(\sum_{i=1}^n 1\cdot\mu,\,\sum_{i=1}^n 1^2\sigma^2\right)=N(n\mu,n\sigma^2)$

but instead

$Y \sim N(nμ,n^2σ^2)$

?

Best Answer

When $(X_i)$ is a collection of independent and identically distributed random variables, then there is a distinction between $nX_1$ and $\sum_{i=1}^n X_i$.

$nX_1$ is a single random variable multiplied by a constant.

$$\begin{align}\mathsf{Var}(nX_1) &= \mathsf E(n^2X_1^2)-\mathsf E^2(nX_1)\\&=n^2(\mathsf E(X_1^2)-\mathsf E^2(X_1))\\& = n^2\mathsf{Var}(X_1)\end{align}$$
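A quick simulation (a sketch with illustrative parameters, not part of the original answer) confirms the $n^2$ scaling: multiplying a single draw by $n$ scales its spread by $n$, so the variance scales by $n^2$.

```python
# Empirical check that Var(n * X_1) = n^2 * Var(X_1): the SAME draw is
# scaled by n, not added to independent copies.
import numpy as np

rng = np.random.default_rng(1)
n, sigma = 5, 2.0
X1 = rng.normal(0.0, sigma, size=1_000_000)
print(np.var(n * X1))  # ≈ n^2 * sigma^2 = 25 * 4 = 100
```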

$\sum_{i=1}^n X_i$ is the sum of several different random variables (although iid).

$$\begin{align}\mathsf{Var}\left(\sum_{i=1}^nX_i\right) &=\sum_{i=1}^n\mathsf {Var}(X_i) +2\sum_{1\leqslant i<j\leqslant n}\mathsf{Cov}(X_i,X_j) \\ & = n\mathsf{Var}(X_1) + 0\end{align}$$
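The contrast with the scaled single variable can also be seen numerically. This sketch (illustrative parameters, assumed for the example) sums $n$ independent columns, where the covariance terms vanish, so the variance grows only linearly in $n$:

```python
# Empirical check that Var(X_1 + ... + X_n) = n * Var(X_1) for
# independent draws: cross-covariance terms are zero.
import numpy as np

rng = np.random.default_rng(2)
n, sigma = 5, 2.0
# Each of the n columns is an independent copy of X_1.
X = rng.normal(0.0, sigma, size=(1_000_000, n))
print(np.var(X.sum(axis=1)))  # ≈ n * sigma^2 = 5 * 4 = 20
```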
