[Math] How to understand the variance formula

probability, probability distributions, probability theory, statistics


How is the variance of the Bernoulli distribution derived from the definition of variance?

Best Answer

The PMF of the Bernoulli distribution is $$ p(x)=p^x(1-p)^{1-x}\qquad\text{for}\ x\in\{0,1\}, $$ and the $n$-th moment of a discrete random variable is $$ \text{E}[X^n]=\sum_{x\,\in\,\Omega} x^n\,p(x). $$

Let $X$ be a random variable that follows a Bernoulli distribution. Then \begin{align} \text{E}[X]&=\sum_{x\in\{0,1\}} x\ p^x(1-p)^{1-x}\\ &=0\cdot p^0(1-p)^{1-0}+1\cdot p^1(1-p)^{1-1}\\ &=0+p\\ &=p \end{align} and \begin{align} \text{E}[X^2]&=\sum_{x\in\{0,1\}} x^2\ p^x(1-p)^{1-x}\\ &=0^2\cdot p^0(1-p)^{1-0}+1^2\cdot p^1(1-p)^{1-1}\\ &=0+p\\ &=p. \end{align}

Thus \begin{align} \text{Var}[X]&=\text{E}[X^2]-\left(\text{E}[X]\right)^2\\ &=p-p^2\\ &=\color{blue}{p(1-p)}, \end{align} or, working directly from the definition, \begin{align} \text{Var}[X]&=\text{E}\left[\left(X-\text{E}[X]\right)^2\right]\\ &=\text{E}\left[\left(X-p\right)^2\right]\\ &=\sum_{x\in\{0,1\}} (x-p)^2\ p^x(1-p)^{1-x}\\ &=(0-p)^2\ p^0(1-p)^{1-0}+(1-p)^2\ p^1(1-p)^{1-1}\\ &=p^2(1-p)+p(1-p)^2\\ &=p(1-p)\bigl(p+(1-p)\bigr)\\ &=\color{blue}{p(1-p)}. \end{align}
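As a quick numerical sanity check, the two derivations can be compared against the closed form $p(1-p)$ by summing over the two outcomes directly. This is just an illustrative sketch; the variable names and the choice $p=0.3$ are mine, not from the question.

```python
# Check E[X], E[X^2] and Var[X] for a Bernoulli(p) variable
# by enumerating the two outcomes x in {0, 1}.
p = 0.3  # illustrative success probability

pmf = {0: 1 - p, 1: p}  # p(x) = p^x (1-p)^(1-x) for x in {0, 1}

mean = sum(x * pmf[x] for x in pmf)              # E[X]
second_moment = sum(x**2 * pmf[x] for x in pmf)  # E[X^2]

var_moments = second_moment - mean**2                      # E[X^2] - (E[X])^2
var_definition = sum((x - mean)**2 * pmf[x] for x in pmf)  # E[(X - E[X])^2]

print(mean, second_moment)          # both equal p
print(var_moments, var_definition)  # both agree with p*(1-p), here 0.21
print(p * (1 - p))                  # the closed form
```

Both routes give the same value, as the algebra above shows they must.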
