I have a problem with calculating the variance of an exponential distribution. In my formulary there are these formulas for exponential distributions:
$E(X)=\frac{1}{\lambda}$
$V(X)=\frac{1}{\lambda^2}$
Where $E(X)$ is the expected value and $V(X)$ the variance.
I have a distribution function of the form $F_X(x)=C_1(1-e^{-\lambda_1x})+C_2(1-e^{-\lambda_2x})$
where the $C$s and $\lambda$s are constants.
I calculated $E(X)$ in the following way:
$E(X)=C_1\cdot\frac{1}{\lambda_1}+C_2\cdot\frac{1}{\lambda_2}$
But when I try to calculate the variance using the formula for $V(X)$ above, with the same method I used for $E(X)$, I don't get the right answer.
I do get the right answer if I use $V(X)=E(X^2)-(E(X))^2$,
but for a single exponential distribution both methods should give the same answer, since $E(X^2)=\frac{2}{\lambda^2}$ (from integration by parts), which gives $V(X)=\frac{2}{\lambda^2}-\left(\frac{1}{\lambda}\right)^2=\frac{1}{\lambda^2}$.
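A quick numerical sketch of the discrepancy, using hypothetical constants $C_1=0.4$, $C_2=0.6$, $\lambda_1=1$, $\lambda_2=2$ (a valid mixture needs $C_1+C_2=1$): the second-moment route gives the variance that a Monte Carlo simulation also produces, while carrying the $C$s straight into $\frac{1}{\lambda^2}$ does not.

```python
import numpy as np

# Hypothetical example constants; C1 + C2 = 1 so F_X is a valid mixture CDF.
C1, C2 = 0.4, 0.6
lam1, lam2 = 1.0, 2.0

# Moments of the mixture follow by linearity of expectation:
EX = C1 / lam1 + C2 / lam2                       # E(X)   = 0.7
EX2 = C1 * 2 / lam1**2 + C2 * 2 / lam2**2        # E(X^2) = 1.1
var_correct = EX2 - EX**2                        # V(X)   = 0.61

# Naive attempt: mixing the component variances 1/lambda^2 directly.
var_naive = C1 / lam1**2 + C2 / lam2**2          # 0.55, which is wrong

# Monte Carlo check: pick a component with probability C1/C2,
# then draw from that exponential (numpy's scale parameter is 1/lambda).
rng = np.random.default_rng(0)
n = 1_000_000
pick_first = rng.random(n) < C1
samples = np.where(pick_first,
                   rng.exponential(1 / lam1, n),
                   rng.exponential(1 / lam2, n))
print(var_correct, var_naive, samples.var())
```

The simulated variance agrees with $E(X^2)-(E(X))^2$, not with the naive weighted sum of the component variances.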
Can someone explain this please?
Best Answer
The variance of a sum of two random variables must be calculated with a term accounting for the covariance of those two variables:
$$ Var(aX + bY) = a^2Var(X) + b^2Var(Y) + 2ab Cov(X,Y) $$
Note that the coefficients on the variables are squared in the first two terms of that equation; this is why you cannot simply carry the $C$s through unchanged, as you did for the expected value. If $X$ and $Y$ are independent, then the covariance term is of course zero.
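The identity above can be checked numerically. This is a sketch with hypothetical coefficients $a=2$, $b=3$, where $Y=X+Z$ is constructed from $X$ so that the covariance term is genuinely nonzero:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
a, b = 2.0, 3.0  # hypothetical coefficients

X = rng.exponential(1.0, n)
Z = rng.exponential(0.5, n)
Y = X + Z  # Y depends on X, so Cov(X, Y) != 0

# Left side: sample variance of the combination aX + bY.
lhs = np.var(a * X + b * Y)

# Right side: a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y),
# using ddof=0 throughout so the sample identity holds exactly.
rhs = (a**2 * np.var(X)
       + b**2 * np.var(Y)
       + 2 * a * b * np.cov(X, Y, ddof=0)[0, 1])
print(lhs, rhs)
```

With matching normalizations (`ddof=0`), the two sides agree to floating-point precision; dropping the covariance term would leave them far apart here.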