[Math] Convergence in probability for binomial distribution

binomial distribution

If $m$ is the number of successes in $n$ independent trials, each with success probability $p$, how does one prove that $\frac{m}{n}$ converges in probability to $p$ as $n\to\infty$?

Best Answer

I will add some details and a direct proof; skip the parts that are obvious to you. We can represent $m(n)$ by a random variable $Y_n\sim \mathrm {Bin}(n,p),$ that is, $Y_n$ is the number of successes in $n$ independent trials with success probability $p.$ In other words, $$Y_n=\sum\limits_{i=1}^n X_i,$$ where $X_1,\ldots,X_n\stackrel{iid}{\sim}\mathrm{Ber}(p),$ that is, $X_i$ is $1$ with probability $p$ and $0$ with probability $1-p.$ From this, we deduce $$\mathbb{E}\left[\frac{Y_n}{n}\right]=\frac{n}{n}\mathbb{E}[X_1]=p$$ and $$\mathrm{Var}\left(\frac{1}{n}Y_n\right)=\frac{1}{n^2}\mathrm{Var}\left(Y_n\right)=\frac{n}{n^2}\mathrm{Var}\left(X_1\right)=\frac{p(1-p)}{n}.$$

Now consider a random variable $X$ with finite second moment $\mathbb{E}[X^2]<\infty,$ and take some $\varepsilon>0.$ Since $\mathbf{1}_{Z\geq 1}\leq Z$ for any nonnegative random variable $Z,$ $$\mathbb{P}[|X-\mathbb{E}[X]|\geq \varepsilon]=\mathbb{E}\left[\mathbf{1}_{|X-\mathbb{E}[X]|\geq \varepsilon}\right]=\mathbb{E}\left[\mathbf{1}_{\frac{(X-\mathbb{E}[X])^2}{\varepsilon^2}\geq 1}\right]\leq\frac{\mathbb{E}[(X-\mathbb{E}[X])^2]}{\varepsilon^2}=\frac{\mathrm{Var}(X)}{\varepsilon^2}.$$ This, by the way, is Chebyshev's inequality.
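If you want to see these formulas in action, here is a minimal Python sketch (not part of the original argument) that builds $Y_n$ as a sum of Bernoulli variables and checks the mean, the variance, and Chebyshev's bound by Monte Carlo. The values of `n`, `p`, `eps`, and `reps` are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

n, p, eps = 200, 0.3, 0.05   # arbitrary illustrative choices
reps = 100_000               # Monte Carlo replications

# Y_n as a sum of n iid Bernoulli(p) variables, replicated many times
Y = rng.binomial(n=1, p=p, size=(reps, n)).sum(axis=1)
sample_mean = (Y / n).mean()
sample_var = (Y / n).var()

print(f"E[Y_n/n]   ~ {sample_mean:.4f}   (theory: {p})")
print(f"Var(Y_n/n) ~ {sample_var:.6f}  (theory: {p*(1-p)/n:.6f})")

# Chebyshev: P(|Y_n/n - p| >= eps) <= Var(Y_n/n) / eps^2
lhs = np.mean(np.abs(Y / n - p) >= eps)
rhs = p * (1 - p) / (n * eps**2)
print(f"P(|Y_n/n - p| >= eps) ~ {lhs:.4f}  <=  bound {rhs:.4f}")
```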

Let us now go back to the original problem. To prove convergence in probability, we have to show that for all $\varepsilon>0,$ $$\mathbb{P}[|Y_n/n-p|\geq \varepsilon]\xrightarrow{n\to\infty}0.$$ Applying Chebyshev's inequality to $X=Y_n/n$ gives $$\mathbb{P}[|Y_n/n-p|\geq \varepsilon]\leq\frac{p(1-p)}{n\varepsilon^2}\xrightarrow{n\to\infty}0,$$ so we are done.
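As a sanity check, here is a short simulation sketch (again not part of the proof, with arbitrarily chosen `p`, `eps`, and `reps`) showing the deviation probability shrinking toward $0$ as $n$ grows, alongside the Chebyshev bound $\frac{p(1-p)}{n\varepsilon^2}$ from the proof.

```python
import numpy as np

rng = np.random.default_rng(1)
p, eps, reps = 0.3, 0.05, 50_000   # arbitrary illustrative choices

for n in (10, 100, 1_000, 10_000):
    # simulate Y_n ~ Bin(n, p) directly, reps times
    Y = rng.binomial(n=n, p=p, size=reps)
    prob = np.mean(np.abs(Y / n - p) >= eps)    # empirical P(|Y_n/n - p| >= eps)
    bound = p * (1 - p) / (n * eps**2)          # Chebyshev bound from the proof
    print(f"n={n:6d}  P ~ {prob:.4f}   bound {min(bound, 1.0):.4f}")
```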

Note that all of this applies to any variable of the form $Y_n=\sum\limits_{i=1}^n X_i$ with $X_1,\ldots,X_n$ iid and $\mathbb{E}[X_i^2]<\infty,$ not just to binomial random variables. This is the weak law of large numbers, which states that the average of iid random variables with finite expectation converges in probability to that expectation. We can even lift the assumption of a finite second moment; this follows from Etemadi's law of large numbers. However, that proof is a bit lengthy, so I will leave you with the classical weak law of large numbers (a short simulation sketch of the general case follows the list below). All in all, you have

  • A detailed answer to your question
  • A proof of Chebyshev's inequality
  • A proof of the weak law of large numbers
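To illustrate that the argument is not specific to the binomial case, here is a hedged Python sketch using iid exponential variables (any distribution with finite variance would do; the rate `lam`, `eps`, and `reps` are arbitrary choices). The sample mean plays the role of $Y_n/n$, and its deviation probability from the true mean $1/\lambda$ again vanishes as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(2)
lam, eps, reps = 2.0, 0.05, 50_000   # arbitrary choices; E[X_i] = 1/lam = 0.5

for n in (10, 100, 1_000, 10_000):
    X = rng.exponential(scale=1 / lam, size=(reps, n))   # iid X_i with finite variance
    means = X.mean(axis=1)                                # Y_n / n for each replication
    prob = np.mean(np.abs(means - 1 / lam) >= eps)        # empirical deviation probability
    print(f"n={n:6d}  P(|mean - {1/lam}| >= {eps}) ~ {prob:.4f}")
```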