Modifications of Weak Law of Large Numbers

probability theory, statistics

Let $X_1,…,X_n$ be random variables with finite variances $\text{Var}[X_i]=\sigma_i^2$ and expectations $\mathbb{E}[X_i]=\mu_i$ for all $i=1,…,n$.

The weak law of large numbers that I know states that, for

$\overline{X}_n = \frac{1}{n}\sum_{i=1}^{n}\bigl(X_i - \mathbb{E}[X_i]\bigr),$

we have

$\lim_{n\to\infty}\operatorname{P}\bigl(\bigl|\overline{X}_n\bigr|>\varepsilon\bigr)=0 \quad\text{for every } \varepsilon>0.$

If we have i.i.d. random variables with variance $\sigma^2$ and expectation $\mu$, we get as a consequence

$\frac{1}{n}\sum_{i=1}^{n} X_i \ \xrightarrow{p}\ \mu.$

Is it true that for i.i.d. random variables it holds that

$\frac{1}{n}\sum_{i=1}^{n}(X_i-\mu)^2 \ \xrightarrow{p}\ \sigma^2$ ?

How can it be derived from the version stated in the beginning?

Is there an analogous statement if the random variables are not i.i.d.?

Best Answer

The statement that you cite as the weak law of large numbers is wrong. Without additional assumptions it is not true that $\lim_{n\to\infty}\operatorname{P}\bigl(\bigl|\overline{X}_n\bigr|>\varepsilon\bigr)=0$ for arbitrary random variables $X_1,\ldots,X_n$ with finite variances $\operatorname{Var}[X_i]=\sigma_i^2$ and expectations $\mathbb{E}[X_i]=\mu_i$, $i=1,\ldots,n$.

For example, take $X_1=\ldots=X_n=X$ with $\mathbb{E}[X]=\mu$ and $\operatorname{Var}[X]=\sigma^2>0$; then $\overline{X}_n=X-\mu$. This value does not depend on $n$ and does not converge in probability to $0$.
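A tiny simulation illustrates this counterexample (a sketch; the particular choice $X\sim N(\mu,1)$ with $\mu=1$ and $\varepsilon=0.5$ is arbitrary). Since every $X_i$ equals the same draw $X$, the centered average is exactly $X-\mu$ for all $n$, so $\operatorname{P}(|\overline{X}_n|>\varepsilon)$ stays near $\operatorname{P}(|N(0,1)|>0.5)\approx 0.617$ instead of shrinking:

```python
import random

random.seed(1)
mu, eps = 1.0, 0.5
trials = 10_000

def exceeds(n):
    """Monte Carlo estimate of P(|X_bar_n| > eps) when X_1 = ... = X_n = X ~ N(mu, 1)."""
    count = 0
    for _ in range(trials):
        x = random.gauss(mu, 1.0)                  # one draw shared by all n copies
        x_bar = sum(x - mu for _ in range(n)) / n  # equals x - mu exactly
        if abs(x_bar) > eps:
            count += 1
    return count / trials

# The probability does not decay as n grows:
print(exceeds(10), exceeds(100))
```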

When the random variables are i.i.d., the statement is true. There are also a number of other sufficient conditions for the convergence $\frac{1}{n}\sum_{i=1}^{n}\bigl(X_i-\mathbb{E}[X_i]\bigr)\xrightarrow{p} 0$. For example, if $X_i$, $i=1,2,\ldots$ are pairwise independent and $\frac{\sigma_1^2+\ldots+\sigma_n^2}{n^2}\to 0$, the convergence holds. This is a consequence of Chebyshev's inequality.
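The Chebyshev step can be spelled out: pairwise independence makes all covariance terms vanish, so

$$\operatorname{P}\bigl(\bigl|\overline{X}_n\bigr|>\varepsilon\bigr)\;\le\;\frac{\operatorname{Var}\bigl[\overline{X}_n\bigr]}{\varepsilon^2}\;=\;\frac{1}{n^2\varepsilon^2}\sum_{i=1}^{n}\sigma_i^2\;\longrightarrow\;0$$

whenever $\frac{\sigma_1^2+\ldots+\sigma_n^2}{n^2}\to 0$.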

In the i.i.d. case, it is sufficient that the first moment of $X_1$ exists. Indeed, the variables $Y_i=(X_i-\mu)^2$ are again i.i.d., so applying the weak law of large numbers to $(Y_i)$ gives $$\frac{1}{n}\sum_{i=1}^{n}(X_i-\mu)^2\ \xrightarrow{p}\ \mathbb{E}\bigl[(X_1-\mu)^2\bigr]=\sigma^2,$$ provided that $\mathbb{E}(X_1-\mu)^2=\sigma^2<\infty$.
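A quick simulation illustrates this convergence (a sketch, assuming $X_i\sim N(0,1)$ so that $\mu=0$ and $\sigma^2=1$; the sample sizes are arbitrary):

```python
import random

random.seed(0)
mu = 0.0  # X_i ~ N(0, 1), so Var[X_i] = 1

def centered_second_moment(n):
    """(1/n) * sum_{i=1}^n (X_i - mu)^2 for n i.i.d. N(0, 1) samples."""
    xs = [random.gauss(0.0, 1.0) for _ in range(n)]
    return sum((x - mu) ** 2 for x in xs) / n

# The estimates settle near sigma^2 = 1 as n grows:
for n in (100, 10_000, 1_000_000):
    print(n, centered_second_moment(n))
```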
