[Math] Looking for a proof of: variance of a sum is the sum of variances.

random-variables, statistics

For independent random variables $X$ and $Y$, the variance of their sum or
difference is the sum of their variances:

$$\text{Var}(X\pm Y)=\text{Var}(X)+\text{Var}(Y).$$

I can see why the above should be true: if $x_1<X<x_2$
and $y_1<Y<y_2$, then clearly $x_1+y_1<X+Y<x_2+y_2$. But proving this seems a bit hard. Here is my attempt:

$\text{var}(X) = \sum_i [x_i - \text{mean}(X)]^2\, p_i$
$\text{var}(Y) = \sum_i [y_i - \text{mean}(Y)]^2\, q_i$,
then I guess the variance of the sum should be:
$\text{var}(X+Y) = \sum [(x_i+y_i) - \text{mean}(X+Y)]^2\, \color{red}{p_{??}}$

There is no way something like $(a+b+m)^2$ simplifies to $(a+m)^2 + (b+m)^2$. I'm kinda stuck here; any help?

Best Answer

By subtracting off the means, it is sufficient to consider the case where $X$ and $Y$ are centered (i.e., $\mathbb E X = \mathbb E Y = 0$): replacing $X$ by $X-\mathbb E X$ and $Y$ by $Y-\mathbb E Y$ changes neither side of the identity, since shifting a random variable by a constant does not change its variance. Then $$ \text{Var}(X\pm Y)=\mathbb E(X\pm Y)^2=\mathbb E X^2\pm 2\,\mathbb E(XY)+\mathbb E Y^2. $$ Now, since $X$ and $Y$ are independent and centered, $\mathbb E(XY)=(\mathbb E X)(\mathbb E Y)=0$, and therefore $$ \text{Var}(X\pm Y)=\text{Var}(X)+\text{Var}(Y). $$
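To connect this back to the discrete sums in the question: the weight marked $\color{red}{p_{??}}$ cannot carry a single index, because $X+Y$ ranges over pairs of outcomes. The sum runs over the joint pmf, which independence factors into the marginals, $P(X=x_i,\,Y=y_j)=p_i q_j$ (writing $q_j$ for the pmf of $Y$): $$ \text{Var}(X+Y)=\sum_{i,j}\big[(x_i+y_j)-\text{mean}(X)-\text{mean}(Y)\big]^2\, p_i\, q_j. $$ Expanding the square, the cross term factors as $2\big(\sum_i [x_i-\text{mean}(X)]\,p_i\big)\big(\sum_j [y_j-\text{mean}(Y)]\,q_j\big)=0$, which is the discrete counterpart of $\mathbb E(XY)=0$ above.

For a quick numerical sanity check of the identity, here is a minimal Monte Carlo sketch in Python; the particular distributions, sample size, and seed are arbitrary choices for illustration, not anything from the thread:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n = 1_000_000  # large sample so the empirical variances are stable

# Independent draws from two arbitrarily chosen distributions.
x = rng.exponential(scale=2.0, size=n)      # Var(X) = 2^2 = 4
y = rng.normal(loc=1.0, scale=3.0, size=n)  # Var(Y) = 3^2 = 9

print(np.var(x + y))          # empirical Var(X + Y), close to 13
print(np.var(x - y))          # empirical Var(X - Y), also close to 13
print(np.var(x) + np.var(y))  # Var(X) + Var(Y); both lines above match this up to sampling noise
```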
