Variance and Expectation of a biased coin toss with different distributions

expected value, probability, variance

I would love to verify if my thought process is correct regarding $E(X)$ and $Var(X)$:

Problem: A biased coin is tossed with $P(\{H\}) = p$ and $P(\{T\}) = 1-p$. Let $X$ be a random variable defined as follows.

If $H$ then $X \sim U(a,b)$ with expectation and variance $E(X_H)$ and $Var(X_H)$.

If $T$ then $X \sim Exp(\lambda)$ with expectation and variance $E(X_T)$ and $Var(X_T)$.

The expectation and variance of $X$ are then:

\begin{equation}
\begin{aligned}
E(X) &= p\,E(X_H) + (1-p)\,E(X_T) \\
\text{and} \\
Var(X) &= E(X^2) - E(X)^2 \\
&= p\,E(X_H^2) + (1-p)\,E(X_T^2) - E(X)^2
\end{aligned}
\end{equation}
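
As a quick sanity check, here is a minimal Monte Carlo sketch in Python (the parameter values `p`, `a`, `b`, `lam` below are arbitrary choices for illustration, not part of the question); the simulated mean and variance of $X$ should match the formulas above:

```python
import numpy as np

rng = np.random.default_rng(0)
p, a, b, lam = 0.3, 2.0, 5.0, 1.5   # arbitrary example parameters
n = 1_000_000

heads = rng.random(n) < p            # biased coin: True with probability p
u = rng.uniform(a, b, n)             # X | H  ~  U(a, b)
z = rng.exponential(1 / lam, n)      # X | T  ~  Exp(lam); NumPy takes the scale 1/lambda
x = np.where(heads, u, z)            # X = 1{Y=1} U + 1{Y=0} Z

# Closed-form moments of the two components
EU, VU = (a + b) / 2, (b - a) ** 2 / 12
EZ, VZ = 1 / lam, 1 / lam ** 2

EX = p * EU + (1 - p) * EZ                              # E(X)
EX2 = p * (VU + EU ** 2) + (1 - p) * (VZ + EZ ** 2)     # E(X^2)
VX = EX2 - EX ** 2                                      # Var(X)

print(f"E(X):   formula {EX:.4f}  simulation {x.mean():.4f}")
print(f"Var(X): formula {VX:.4f}  simulation {x.var():.4f}")
```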

Best Answer

Your formulae are correct. Here is a proof.

Let $Y$ denote the result of the coin toss, with heads denoted by $Y=1$ and tails by $Y=0$, and let $U \sim U(a,b)$ and $Z \sim Exp(\lambda)$ be independent of $Y$. Formally:

$$X=\mathbf 1_{Y=1}U+\mathbf 1_{Y=0}Z$$

Thus:

$$E(X)=E(\mathbf 1_{Y=1}U+\mathbf 1_{Y=0}Z)$$

$\mathbf 1_{Y=1}$ and $\mathbf 1_{Y=0}$ are independent of $U$ and $Z$: the outcome of the toss only chooses whether you look at $U$ or $Z$, but the values of these random variables are not affected. Thus, by independence and linearity:

$$E(X)=E(\mathbf 1_{Y=1})E(U)+E(\mathbf 1_{Y=0})E(Z)$$

Since $E(\mathbf 1_{Y=1}) = P(Y=1) = p$ and $E(\mathbf 1_{Y=0}) = P(Y=0) = 1-p$, we arrive at your formula. For the variance we can extend this reasoning to $X^2$.
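
To spell that extension out (same independence assumption; the only extra ingredients are that $\mathbf 1_{Y=1}\mathbf 1_{Y=0}=0$ and $\mathbf 1_{Y=1}^2=\mathbf 1_{Y=1}$):

$$X^2 = \mathbf 1_{Y=1}U^2 + \mathbf 1_{Y=0}Z^2, \qquad E(X^2) = p\,E(U^2) + (1-p)\,E(Z^2),$$

$$Var(X) = E(X^2) - E(X)^2 = p\,\bigl(Var(U) + E(U)^2\bigr) + (1-p)\,\bigl(Var(Z) + E(Z)^2\bigr) - E(X)^2.$$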
