[Math] Variance of a random variable representing the sum of two dice

Tags: dice, statistics

$\newcommand{\Var}{\operatorname{Var}}$

The formula for the variance of the sum of two independent random variables is given by $$ \Var(X + X) = \Var(2X) = 2^2\Var(X)$$

How, then, does this happen:

Rolling one die results in a variance of $\frac{35}{12}$. Rolling two dice should then give a variance of $2^2\Var(\text{one die}) = 4 \times \frac{35}{12} \approx 11.67$. Instead, my Excel spreadsheet sample (and other sources) gives me about 5.83, which is equal to only $2 \times \Var(X)$.

What am I doing wrong?
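A minimal Python sketch (plain standard library; a stand-in for the Excel experiment described above, not the original spreadsheet) that reproduces both numbers by simulation: the sum of two independent dice has sample variance near $2 \times \frac{35}{12} \approx 5.83$, while doubling a single die gives a value near $4 \times \frac{35}{12} \approx 11.67$.

```python
import random
import statistics

random.seed(0)
n = 100_000

# Sum of two independent dice: two separate draws per trial (X + Y).
sum_two_dice = [random.randint(1, 6) + random.randint(1, 6) for _ in range(n)]

# Twice a single die: one draw, doubled (the X + X = 2X case).
twice_one_die = [2 * random.randint(1, 6) for _ in range(n)]

print(statistics.variance(sum_two_dice))   # ~5.83  = 2 * 35/12
print(statistics.variance(twice_one_die))  # ~11.67 = 4 * 35/12
```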

Best Answer


The formula you give is not for two independent random variables. It's for random variables that are as far from independent as you can get. If $X, Y$ are independent, then $\Var(X+Y)=\Var(X)+\Var(Y)$. If, in addition, $X$ and $Y$ have the same distribution, this equals $2\Var(X)$, which is exactly the $2 \times \frac{35}{12} \approx 5.83$ your spreadsheet reports. It is also the case that, as you say, $\Var(X+X)=\Var(2X)=4\Var(X)$, but that involves random variables that are nowhere near independent: in general $\Var(X+Y)=\Var(X)+\Var(Y)+2\operatorname{Cov}(X,Y)$, and when $Y=X$ the covariance term equals $\Var(X)$, giving $4\Var(X)$.
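To check the distinction exactly rather than by sampling, here is a short sketch in plain Python (my own enumeration, not part of the original answer) that computes the three variances directly from the outcome lists: one die gives $\frac{35}{12}$, the sum of two independent dice gives $2 \times \frac{35}{12} = \frac{35}{6} \approx 5.83$, and twice one die gives $4 \times \frac{35}{12} = \frac{35}{3} \approx 11.67$.

```python
from fractions import Fraction
from itertools import product

faces = range(1, 7)

def variance(values):
    """Exact population variance of a uniform distribution over `values`."""
    n = len(values)
    mean = Fraction(sum(values), n)
    return sum((Fraction(v) - mean) ** 2 for v in values) / n

var_one_die  = variance(list(faces))                                # 35/12
var_x_plus_y = variance([a + b for a, b in product(faces, faces)])  # 35/6, independent dice
var_2x       = variance([2 * a for a in faces])                     # 35/3, the X + X = 2X case

print(var_one_die, var_x_plus_y, var_2x)  # 35/12 35/6 35/3
```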