[Math] Strong and weak laws of large numbers

law-of-large-numbers, probability, probability-theory

Let $X_1,X_2,\ldots$ be a sequence of random variables.

The weak (strong) law of large numbers states:

If $X_1,X_2,\ldots$ are i.i.d. random variables with finite
expectation $m$, then $\frac{X_1+\dots+X_n}{n}\rightarrow m$
in probability (almost surely).
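As a quick empirical sketch of the statement above (my own illustration, not part of the question): for i.i.d. Exponential(1) variables the mean is $m = 1$, and the sample average should settle near $1$ as $n$ grows.

```python
import numpy as np

# Illustration: law of large numbers for i.i.d. Exponential(1) draws (m = 1).
rng = np.random.default_rng(0)

def sample_mean(n: int) -> float:
    """Average of n i.i.d. Exponential(1) draws."""
    return rng.exponential(scale=1.0, size=n).mean()

for n in (10, 1_000, 100_000):
    print(n, sample_mean(n))
```

The printed averages drift toward $1$, with fluctuations of order $1/\sqrt{n}$.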

I wonder whether these laws hold without the independence/identical-distribution assumptions, or whether one assumption can be exchanged for another. Thanks for any input.

Best Answer

A theorem due to Markov states that if a sequence of random variables $X_1, X_2, \ldots$ with finite variances fulfills one of the following conditions:

  • $\lim_{n \to \infty} \frac{1}{n^2} \mathrm{Var}\left(\sum_{i = 1}^n X_i\right) = 0$;
  • $X_1, X_2, \ldots$ are independent and $\lim_{n \to \infty}\frac{1}{n^2}\sum_{i = 1}^n \mathrm{Var} X_i = 0$;

then the sequence $Y_n = \frac{1}{n}\sum_{i=1}^n (X_i - \mathsf{E} X_i)$ converges to $0$ in probability as $n \to \infty$.
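A small numerical sketch of the second condition (the specific distributions are my own choice, not from the theorem): take independent $X_i \sim N(0, \sqrt{i})$, so $\mathrm{Var}\, X_i = \sqrt{i}$ and $\frac{1}{n^2}\sum_{i=1}^n \sqrt{i} \sim \frac{2}{3} n^{-1/2} \to 0$, even though the $X_i$ are not identically distributed.

```python
import numpy as np

# Sketch: independent X_i ~ N(0, sqrt(i)) satisfy Markov's second condition,
# so Y_n = (1/n) * sum (X_i - E X_i) should concentrate at 0.
rng = np.random.default_rng(1)

def centered_mean(n: int) -> float:
    """Y_n for X_i ~ N(0, sqrt(i)); here E X_i = 0."""
    sds = np.arange(1, n + 1) ** 0.25   # sd_i = i^(1/4), so Var X_i = sqrt(i)
    return rng.normal(loc=0.0, scale=sds).mean()

def markov_bound(n: int) -> float:
    """(1/n^2) * Var(sum X_i) for independent X_i with Var X_i = sqrt(i)."""
    return np.sqrt(np.arange(1, n + 1)).sum() / n**2

for n in (100, 10_000, 100_000):
    print(n, markov_bound(n), centered_mean(n))
```

The bound shrinks like $n^{-1/2}$, and $Y_n$ shrinks with it, matching the conclusion of the theorem.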

In addition, if random variables $X_1, X_2, \ldots$ are identically distributed, have finite variance and are uncorrelated (instead of independent), then the proof of the weak law of large numbers using Chebyshev's inequality still holds.
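To spell out that Chebyshev argument in one line: for uncorrelated, identically distributed $X_i$ with common variance $\sigma^2$, variances add, so

$$\mathsf{P}(|Y_n| \ge \varepsilon) \le \frac{\mathrm{Var}\, Y_n}{\varepsilon^2} = \frac{1}{n^2 \varepsilon^2} \sum_{i=1}^n \mathrm{Var}\, X_i = \frac{\sigma^2}{n \varepsilon^2} \xrightarrow[n \to \infty]{} 0,$$

which is exactly convergence of $Y_n$ to $0$ in probability; full independence is never used, only $\mathrm{Cov}(X_i, X_j) = 0$ for $i \neq j$.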

EDIT: Corrected the first condition, thanks to @Michael.
