Solved – proving consistency for a sequence of Bernoulli random variables

asymptotics, bernoulli-distribution, mathematical-statistics, self-study

Let $X_1, \dots, X_n$ be i.i.d. Bernoulli random variables with unknown parameter $p\in\left(0,1\right)$.

Let $\bar X_n$ be the sample average of the $X_i$'s.

Prove that $\bar X_n\left(1 - \bar X_n\right)$ is a consistent estimator of $p\left(1-p\right)$.

In order to prove this I was thinking of using Chebyshev's inequality somehow, but I'm not sure how to go about it.

Best Answer

One way is to use the continuous mapping theorem:

if $T_n$ is consistent for $\theta$ and $g\left(\cdot\right)$ is a real-valued function continuous at the point $\theta$, then $g\left(T_n\right)$ will be consistent for $g\left(\theta\right)$.

You now only need to prove that $\bar X_n$ is a consistent estimator for $p$, and it will imply that $\bar X_n\left(1-\bar X_n\right)$ is a consistent estimator for $p\left(1-p\right)$, because $g\left(x\right)=x\left(1-x\right)$ is a real-valued function continuous everywhere, and in particular at $p$.
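This is not part of a formal proof, but a quick numerical illustration of what consistency means here may help. A minimal sketch, assuming NumPy is available; the choice $p = 0.3$ and the seed are arbitrary:

```python
import numpy as np

# Check that the plug-in estimator Xbar_n * (1 - Xbar_n) approaches
# p * (1 - p) as n grows, illustrating consistency.
rng = np.random.default_rng(0)
p = 0.3                       # arbitrary choice for the illustration
target = p * (1 - p)          # the quantity being estimated, 0.21

for n in [100, 10_000, 1_000_000]:
    x = rng.binomial(1, p, size=n)   # n i.i.d. Bernoulli(p) draws
    xbar = x.mean()                  # sample average, consistent for p
    est = xbar * (1 - xbar)          # plug-in estimator of p(1 - p)
    print(n, est)
```

As $n$ increases, the printed estimates cluster around $0.21$, as the continuous mapping theorem predicts.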


To prove that $\bar X_n$ is a consistent estimator, you can use the theorem that states that:

an estimator $\hat \theta$ is a consistent estimator for $\theta$ if: $$\lim_{n\to\infty}\operatorname{MSE}\left(\hat\theta\right)=0$$ (This theorem is proved with exactly the inequality you had in mind: by Markov's inequality, $P\left(|\hat\theta-\theta|>\varepsilon\right)\le \operatorname{MSE}\left(\hat\theta\right)/\varepsilon^2$, which is the Chebyshev-type bound.)

In your case you can quite easily show it using the decomposition: $$\operatorname{MSE}\left(\hat\theta\right)=\operatorname{Var}\left(\hat\theta\right)+\underbrace{\operatorname{bias}^2 \left(\hat\theta,\theta \right)}_{=0\text{ in this case}}$$
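Spelling out that computation (using only the definitions above): since $\mathbb{E}\left[\bar X_n\right]=p$, the bias term vanishes, and the variance of an average of $n$ i.i.d. Bernoulli variables gives

$$\operatorname{MSE}\left(\bar X_n\right)=\operatorname{Var}\left(\bar X_n\right)=\frac{\operatorname{Var}\left(X_1\right)}{n}=\frac{p\left(1-p\right)}{n}\xrightarrow[n\to\infty]{}0.$$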
