[Math] Mean Squared Error (MSE) of parameter $p$ in binomial distribution: intuition

Tags: binomial distribution, estimation, mean square error, parameter estimation, probability

Let $X \sim Bin(n, p)$ and let

$$\hat p = \frac{X}{n}$$
be an estimator of the probability $p$ of success in this binomial distribution, where $X$ is the number of successes and $n$ is the number of trials.

My question is: why does $\operatorname{MSE}(\hat p) \rightarrow 0$ as $p \rightarrow 0$ or $p \rightarrow 1$?

Best Answer

Since $X\sim Bin(n,p)$, we have $\mathbb E[X]=np$ and $\mathbb E[X^2]=np((n-1)p+1)$, where the latter follows from the well-known identity $\operatorname{Var}(X)=np(1-p)=\mathbb E[X^2]-\mathbb E[X]^2$. Hence \begin{align} \operatorname{MSE}\left(\hat p\right)&=\mathbb{E}\left[\left(\hat p-p\right)^{2}\right]=\mathbb E\left[\left(\frac{X}{n}-p\right)^2\right]=\mathbb E\left[\frac{X^2}{n^2}-\frac{2pX}{n}+p^2\right]\\[0.3cm]&=\frac1{n^2}\mathbb E\left[X^2\right]-\frac{2p}{n}\mathbb E[X]+p^2\\[0.3cm]&=\frac{np}{n^2}\left((n-1)p+1\right)-\frac{2}{n}np^2+p^2\\[0.3cm]&=\frac{p(1-p)}{n}\longrightarrow 0, \qquad \text{as } p\to 0 \text{ or } p\to 1.\end{align}
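As a sanity check on the closed form, here is a minimal Monte Carlo sketch (assuming NumPy is available; the choices of $n$, the number of repetitions, and the grid of $p$ values are arbitrary) that compares the empirical MSE of $\hat p = X/n$ with $p(1-p)/n$:

```python
# Monte Carlo check of MSE(p_hat) = p(1-p)/n for p_hat = X/n, X ~ Bin(n, p).
# Illustrative sketch only; n, reps and the p grid are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
n = 50            # number of Bernoulli trials per experiment
reps = 200_000    # number of simulated experiments per value of p

for p in [0.01, 0.1, 0.5, 0.9, 0.99]:
    X = rng.binomial(n, p, size=reps)   # simulated success counts
    p_hat = X / n                       # estimator for each experiment
    mse_empirical = np.mean((p_hat - p) ** 2)
    mse_exact = p * (1 - p) / n         # closed form derived above
    print(f"p={p:.2f}  empirical MSE={mse_empirical:.6f}  exact p(1-p)/n={mse_exact:.6f}")
```

The empirical values should agree with $p(1-p)/n$ up to simulation noise, and both shrink toward $0$ as $p$ approaches $0$ or $1$.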

Intuitively, as $p\to 0$ or $p\to 1$ the uncertainty in the experiment vanishes: the sample will consist almost entirely of failures ($p=0$) or of successes ($p=1$), so the estimate is highly accurate. For contrast, consider the case where the uncertainty of the outcome is maximized, i.e. $p=1/2$: there $p(1-p)$ attains its maximum, and hence so does the MSE, as the quick check below illustrates.
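To see this numerically, the short sketch below (again assuming NumPy; $n=100$ is an arbitrary illustrative choice) evaluates $p(1-p)/n$ on a grid and confirms it vanishes at the endpoints and peaks at $p=1/2$:

```python
# Shape of MSE(p_hat) = p(1-p)/n as a function of p, with n = 100 (arbitrary choice).
import numpy as np

n = 100
p = np.linspace(0.0, 1.0, 101)
mse = p * (1 - p) / n

print("max MSE:", mse.max(), "attained at p =", p[mse.argmax()])  # 0.0025 at p = 0.5
print("MSE at p=0 and p=1:", mse[0], mse[-1])                     # both exactly 0
```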
