Weak Law for Dependent Sequence with bounded variance

analysis, law-of-large-numbers, measure-theory, probability, probability-theory

I was wondering if the Weak Law of Large Numbers also holds true for dependent identically distributed random variables with constant mean and bounded variance.

Let $X_1, X_2, \dots, X_n$ be a sequence of dependent but identically distributed random variables with $E(X_i)=\mu$ and $Var(X_i) \le B_i$. Is it then true that $$n^{-1}\sum\limits_{i=1}^n X_i \to \mu \text{ in probability}?$$

Using Chebyshev's inequality for some $\epsilon > 0$, $$P \left( \left | n^{-1}\sum\limits_{i=1}^n X_i -\mu \right | \ge \epsilon \right) \le \frac{Var \left(n^{-1}\sum\limits_{i=1}^n X_i \right )}{\epsilon^2}= \frac{\sum\limits_{i,j}Cov(X_i,X_j)}{n^2\epsilon^2}.$$

But by the Cauchy-Schwarz inequality, $$Cov(X_i,X_j)\le \sqrt{Var(X_i)}\sqrt{Var(X_j)}\le \sqrt{B_iB_j},$$
and this seems to ensure $$0 \le P \left( \left | n^{-1}\sum\limits_{i=1}^n X_i -\mu \right | \ge \epsilon \right)\le \frac{\sqrt{B_iB_j}}{n^2\epsilon^2},$$ and hence by the sandwich theorem we have convergence in probability.
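A quick numerical sanity check of the variance identity $Var\left(n^{-1}\sum_i X_i\right)=n^{-2}\sum_{i,j}Cov(X_i,X_j)$ used above (a minimal sketch; the common-factor construction below is just one arbitrary dependent example, not part of the question):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dependent example: each X_i loads on a common standard-normal
# factor Z, so Cov(X_i, X_j) != 0 for i != j while Var(X_i) = 1.
n, reps = 5, 200_000
Z = rng.standard_normal((reps, 1))      # shared factor
E = rng.standard_normal((reps, n))      # idiosyncratic noise
X = 0.8 * Z + 0.6 * E                   # 0.8^2 + 0.6^2 = 1, so each X_i ~ N(0, 1)

sample_mean = X.mean(axis=1)            # n^{-1} * sum_i X_i, one value per replication

lhs = sample_mean.var()                         # Var of the sample mean (Monte Carlo)
rhs = np.cov(X, rowvar=False).sum() / n**2      # n^{-2} * sum_{i,j} Cov(X_i, X_j)
print(lhs, rhs)                                 # the two estimates agree up to Monte Carlo error
```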

Are these arguments valid? Is my conclusion correct that constant mean and bounded variance are all that is needed for the WLLN to hold for a dependent sequence of random variables?

Thanks

Best Answer

Here is a counterexample. Let $X_1\sim N(0,1)$ and set $X_n=X_1$ for all $n\in\mathbb{N}$. Then the $X_n$ are dependent and identically distributed (standard normal), with the same finite mean and variance. However, the sample mean clearly does not converge in probability to $0$: it is always equal to $X_1$, and $P(|X_1|\ge \epsilon)>0$ for every $\epsilon>0$.
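A small simulation (just a sketch to illustrate the counterexample) shows that $P\left(\left|n^{-1}\sum_{i=1}^n X_i\right| \ge \epsilon\right)$ does not shrink as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(1)

# Counterexample: X_n = X_1 for every n, so the sample mean is always X_1.
eps, reps = 0.5, 100_000
x1 = rng.standard_normal(reps)          # one draw of X_1 per replication

for n in (10, 100, 1000, 10_000):
    sample_mean = x1                    # (X_1 + ... + X_1) / n = X_1, regardless of n
    prob = np.mean(np.abs(sample_mean) >= eps)
    print(n, prob)                      # stays near P(|N(0,1)| >= 0.5) ~ 0.617 for every n
```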

The problem in your proof is that you replace the sum of covariances by a bound on a single covariance. If instead we have $Var(S_n)\to 0$, where $S_n$ denotes the sample mean, then your argument does go through.
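To make this concrete (a standard sufficient condition, added here as an illustration): if the covariances are absolutely summable, say $\sum_{j}|Cov(X_i,X_j)|\le C$ uniformly in $i$, then $$Var(S_n)=\frac{1}{n^2}\sum_{i,j}Cov(X_i,X_j)\le \frac{1}{n^2}\sum_{i=1}^n\sum_{j=1}^n|Cov(X_i,X_j)|\le \frac{nC}{n^2}=\frac{C}{n}\to 0,$$ and Chebyshev's inequality then gives the WLLN.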

Intuitively, this means that the covariances must die out quickly enough. That is, we need some kind of fading memory property. Some examples are weak dependence, ergodicity, mixing and martingale difference sequences.
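For contrast with the counterexample, here is a minimal simulation sketch using an AR(1) process (my choice of fading-memory example; its covariances decay geometrically in the lag, so $Var(S_n)\to 0$ and the sample mean does converge):

```python
import numpy as np

rng = np.random.default_rng(2)

# AR(1): X_t = phi * X_{t-1} + e_t with |phi| < 1, so Cov(X_s, X_t) decays
# geometrically in |s - t| and Var(sample mean) -> 0.
phi, eps, reps = 0.7, 0.5, 5_000

def sample_mean_ar1(n):
    e = rng.standard_normal((reps, n)) * np.sqrt(1 - phi**2)  # scaled so Var(X_t) = 1
    x = np.zeros((reps, n))
    x[:, 0] = rng.standard_normal(reps)                       # start in the stationary law N(0, 1)
    for t in range(1, n):
        x[:, t] = phi * x[:, t - 1] + e[:, t]
    return x.mean(axis=1)

for n in (10, 100, 1000):
    m = sample_mean_ar1(n)
    print(n, np.mean(np.abs(m) >= eps))   # P(|sample mean| >= eps) shrinks toward 0
```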
