[Math] Proof of Weak Law of Large Numbers using Chebyshev’s Inequality

law-of-large-numbers, probability, probability-theory, statistics

In a university statistics course, we were presented with a "proof" of the Weak Law of Large Numbers (as it applies to population samples) based on Chebyshev's inequality. The steps the professor took are nearly identical to those in the following Wikipedia article:
https://en.wikipedia.org/wiki/Law_of_large_numbers#Proof_using_Chebyshev.27s_inequality

The main difference in the proofs appeared when my professor began with Chebyshev's inequality:

$P(|\bar{X_n}-\mu|< k\sigma)\geq1-\frac{1}{k^2}$

Next, she replaced $\sigma$ with $\frac{\sigma}{\sqrt{n}}$ (the standard deviation of $\bar{X_n}$) and set $c=\frac{k\sigma}{\sqrt{n}}$, so

$P(\mu-c<\bar{X_n}< \mu+c)\geq1-\frac{\sigma^2}{nc^2}$

This matches the proof on Wikipedia. We then examine what happens as the sample size $n$ approaches infinity and claim that the probability that the sample average $\bar{X_n}$ lies within $c$ of the expected value $\mu$ approaches 1.
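For reference, the substitution step can be written out in full. This is just a sketch of the algebra as I understand it, expressing $k$ in terms of $c$:

```latex
% Chebyshev applied to \bar{X}_n, whose standard deviation is \sigma/\sqrt{n}.
% Setting c = k\sigma/\sqrt{n}, i.e. k = c\sqrt{n}/\sigma, turns 1 - 1/k^2
% into 1 - \sigma^2/(n c^2):
\begin{align}
P\!\left(\left|\bar{X}_n-\mu\right| < k\,\frac{\sigma}{\sqrt{n}}\right)
  &\geq 1-\frac{1}{k^2} \\
P\!\left(\left|\bar{X}_n-\mu\right| < c\right)
  &\geq 1-\frac{\sigma^2}{nc^2},
  \qquad k=\frac{c\sqrt{n}}{\sigma}
\end{align}
```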

Here is my problem: as $n$ approaches infinity, $c$ approaches $0$, so
$P(\mu-c<\bar{X_n}< \mu+c)$ becomes $P(\mu<\bar{X_n}< \mu)$. This doesn't make sense, since $\mu$ is not less than $\mu$.

Given this question, does this proof have any merit? Or am I perhaps misunderstanding something algebraically or conceptually? Thanks for your time!

Best Answer

In the proof, $c$ is held constant as $n\to\infty$; it doesn't tend to zero. (It is $k=\frac{c\sqrt n}{\sigma}$, not $c$, that varies with $n$.) The argument shows that $$ \text{for every $c>0$, }\lim_{n\to\infty}P(\mu-c<\bar X_n<\mu+c)=1\tag1 $$ By taking the complement of the event $\{\mu-c<\bar X_n<\mu+c\}$ in (1), we get $$ \text{for every $c>0$, }\lim_{n\to\infty}P(|\bar X_n-\mu|\ge c)=0\tag2 $$ which is precisely what the weak law of large numbers states. (So the proof has merit!)
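The limit in (1) can also be checked numerically. Here is a quick sketch of my own (not part of the original proof), using fair coin flips: for a fixed $c$, the empirical tail probability $P(|\bar X_n-\mu|\ge c)$ shrinks as $n$ grows, and stays within the Chebyshev bound $\frac{\sigma^2}{nc^2}$:

```python
import random

random.seed(0)

def empirical_tail(n, c, trials=500, p=0.5):
    """Estimate P(|X_bar_n - mu| >= c) for the mean of n fair coin flips (mu = p)."""
    hits = 0
    for _ in range(trials):
        xbar = sum(random.random() < p for _ in range(n)) / n
        if abs(xbar - p) >= c:
            hits += 1
    return hits / trials

c = 0.05
sigma2 = 0.25  # variance of a single fair coin flip
for n in (100, 1000, 10000):
    bound = min(sigma2 / (n * c * c), 1.0)  # Chebyshev bound, capped at 1
    print(f"n={n:6d}  empirical tail={empirical_tail(n, c):.4f}  Chebyshev bound={bound:.4f}")
```

Note that the Chebyshev bound is quite loose here (for $n=100$ it is vacuous, since it exceeds 1), but it still forces the tail probability to 0 as $n\to\infty$, which is all the proof needs.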
