[Math] Central Limit Theorem implies Law of Large Numbers

probability theory

Let $X_i$ be i.i.d. random variables and let $\overline{X}_n=(X_1+\cdots+X_n)/n$. If $EX_i=\mu$ and $\operatorname{Var}X_i = \sigma^2<\infty$, then the central limit theorem says that we have convergence in distribution:

$$\frac{\overline{X}_n -\mu}{\sigma/\sqrt{n}}\to N(0,1)$$

On the other hand, the (weak) law of large numbers says that if $X_i$ are i.i.d. random variables with $EX_i=\mu$, then for every $\epsilon>0$,

$$\lim_{n\to\infty}P(|\overline{X}_n-\mu|<\epsilon)=1$$

Can the central limit theorem be used to prove a form of the law of large numbers? (It would obviously be a weakened form, since it assumes $\operatorname{Var} X_i$ exists.) Since we expect $\frac{\overline{X}_n -\mu}{\sigma/\sqrt{n}}$ to look like $N(0,1)$ for large $n$, shouldn't we expect $\overline{X}_n -\mu$ to look like $N(0,\sigma^2/n)$? The variance $\sigma^2/n$ then becomes very small as $n$ grows, so $\overline{X}_n -\mu$ concentrates around $0$, and something like the law of large numbers should hold.
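This concentration is easy to see numerically. Here is a small sketch (the choice of Exponential(1) variables, so that $\mu=\sigma^2=1$, and all names and parameters are illustrative, not from the question):

```python
import numpy as np

# Illustrative setup: X_i ~ Exponential(1), so mu = 1 and sigma^2 = 1.
rng = np.random.default_rng(0)
mu, eps, trials = 1.0, 0.1, 2000

def tail_prob(n):
    # Empirical estimate of P(|X_bar_n - mu| > eps) over many repetitions.
    means = rng.exponential(1.0, size=(trials, n)).mean(axis=1)
    return np.mean(np.abs(means - mu) > eps)

probs = [tail_prob(n) for n in (10, 100, 1000)]
print(probs)  # the estimated tail probabilities shrink toward 0 as n grows
```

The estimated probability that $\overline{X}_n$ misses $\mu$ by more than $\epsilon$ visibly decays as $n$ increases, matching the heuristic above.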

Best Answer

This argument works, but in a sense it's overkill. You have a finite variance $\sigma^2$ for each observation, so $\operatorname{Var}\left(\overline{X}_n\right)=\sigma^2/n$. Chebyshev's inequality tells you that $$ \Pr\left(\left|\overline{X}_n - \mu\right|>\varepsilon\right) \le \frac{\sigma^2}{\varepsilon^2 n} \to 0\text{ as }n\to\infty. $$ And Chebyshev's inequality follows quickly from Markov's inequality, which is quite easy to prove.
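For completeness, here is that short derivation spelled out (a standard step, included as a sketch). Markov's inequality says that for $Y\ge 0$ and $a>0$, $\Pr(Y\ge a)\le EY/a$. Applying it to $Y=\left(\overline{X}_n-\mu\right)^2$ with $a=\varepsilon^2$ gives

$$\Pr\left(\left|\overline{X}_n-\mu\right|>\varepsilon\right)=\Pr\left(\left(\overline{X}_n-\mu\right)^2>\varepsilon^2\right)\le \frac{E\left(\overline{X}_n-\mu\right)^2}{\varepsilon^2}=\frac{\operatorname{Var}\left(\overline{X}_n\right)}{\varepsilon^2}=\frac{\sigma^2}{\varepsilon^2 n},$$

which is exactly the Chebyshev bound used above.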

But the proof of the central limit theorem takes a lot more work than that.
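That said, the CLT route in the question can be made rigorous directly (a sketch, assuming the CLT and Slutsky's theorem): write

$$\overline{X}_n-\mu=\frac{\sigma}{\sqrt{n}}\cdot\frac{\overline{X}_n-\mu}{\sigma/\sqrt{n}}.$$

The second factor converges in distribution to $N(0,1)$, while $\sigma/\sqrt{n}\to 0$, so by Slutsky's theorem the product converges in distribution to the constant $0$; and convergence in distribution to a constant implies convergence in probability, which is the weak law of large numbers.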
