Probability Theory – Weak Law of Large Numbers for Dependent Random Variables with Bounded Covariance

covariance, law-of-large-numbers, measure-theory, probability-theory, probability-limit-theorems

I'm currently stuck on the following problem, which involves proving the weak law of large numbers for a sequence of dependent but identically distributed random variables. Here's the full statement:

  • Let $(X_n)$ be a sequence of dependent identically distributed random variables with finite variance.

  • Let $\displaystyle S_n = \sum_{i=1}^n X_i $ denote the $n^\text{th}$ partial sum of the random variables $(X_n)$.

  • Assume that $\text{Cov}(X_i,X_j) \leq c^{|i-j|}$ for all $i, j \in \mathbb{N}$, where $|c| \leq 1$.

Is it possible to show that $\displaystyle \frac{S_n}{n} \rightarrow \mathbb{E}[X_1]$ in probability? In other words, is it true that given any $\epsilon>0$,

$$ \lim_{n\rightarrow \infty} \mathbb{P}\bigg[\Big|\frac{S_n}{n} - \mathbb{E}[X_1]\Big| > \epsilon\bigg] = 0$$
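As a quick sanity check (not a proof), here is a minimal simulation sketch of a concrete process satisfying the hypothesis: a stationary AR(1) sequence $X_{k+1} = c X_k + \varepsilon_k$ with $\varepsilon_k \sim N(0,\,1-c^2)$, for which $\text{Cov}(X_i, X_j) = c^{|i-j|}$ exactly and $\mathbb{E}[X_1] = 0$. All parameter values below are illustrative choices.

```python
import numpy as np

# Sanity check: a stationary AR(1) process X_{k+1} = c*X_k + eps_k with
# eps_k ~ N(0, 1 - c^2) has Cov(X_i, X_j) = c^{|i-j|} and E[X_1] = 0,
# so it satisfies the covariance hypothesis. We watch S_n / n.
rng = np.random.default_rng(0)
c = 0.9
n = 1_000_000
x = np.empty(n)
x[0] = rng.normal(0.0, 1.0)                 # start in the stationary distribution
eps = rng.normal(0.0, np.sqrt(1 - c**2), n - 1)
for k in range(n - 1):
    x[k + 1] = c * x[k] + eps[k]

means = np.cumsum(x) / np.arange(1, n + 1)  # S_n / n for every n
for m in (10**3, 10**4, 10**5, 10**6):
    print(f"n = {m:>7}:  S_n/n = {means[m - 1]: .5f}")
# The running mean drifts toward E[X_1] = 0, as the weak LLN would predict.
```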

EDIT: Following some comments, it turns out that I had the right approach so I've gone ahead and answered my own question below.

Best Answer

Fix $\epsilon > 0$ and $n \in \mathbb{N}$. Since the $X_i$ are identically distributed, $\mathbb{E}\big[\frac{S_n}{n}\big] = \frac{1}{n}\sum_{i=1}^n \mathbb{E}[X_i] = \mathbb{E}[X_1]$, so Chebyshev's inequality gives

$$\mathbb{P}\bigg[\Big|\frac{S_n}{n} - \mathbb{E}[X_1]\Big| > \epsilon\bigg] \leq \frac{\text{Var}\Big(\frac{S_n}{n}\Big)}{\epsilon^2}$$

where

$$\text{Var}\Big(\frac{S_n}{n}\Big) = \frac{\text{Var}(S_n)}{n^2} = \frac{\sum_{i=1}^n\sum_{j=1}^n \text{Cov}(X_i,X_j)}{n^2} \leq \frac{\sum_{i=1}^n\sum_{j=1}^n c^{|i-j|}}{n^2} $$
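As a minimal Monte Carlo sketch (reusing the illustrative AR(1) process from the question, for which the covariance bound holds with equality), the empirical tail probability should indeed sit below the Chebyshev bound; the parameters are again arbitrary choices.

```python
import numpy as np

# Monte Carlo check of the Chebyshev bound for the AR(1) example above
# (where Cov(X_i, X_j) = c^{|i-j|} exactly and E[X_1] = 0): estimate
# P[|S_n/n| > eps] over many replicates and compare with Var(S_n/n)/eps^2.
rng = np.random.default_rng(1)
c, n, eps, reps = 0.9, 1000, 0.3, 5000
x = np.empty((reps, n))
x[:, 0] = rng.normal(0.0, 1.0, reps)
noise = rng.normal(0.0, np.sqrt(1 - c**2), (reps, n - 1))
for k in range(n - 1):
    x[:, k + 1] = c * x[:, k] + noise[:, k]

sample_means = x.mean(axis=1)                # one S_n / n per replicate
tail = np.mean(np.abs(sample_means) > eps)   # empirical P[|S_n/n| > eps]
bound = sample_means.var() / eps**2          # Chebyshev upper bound
print(f"empirical tail = {tail:.4f}  <=  Chebyshev bound = {bound:.4f}")
```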

We can then explicitly calculate the double sum $\sum_{i=1}^n\sum_{j=1}^n c^{|i-j|}$ (assuming $c \neq 1$; the case $c = 1$ is discussed at the end) as follows:

$$\begin{align} \sum_{i=1}^n\sum_{j=1}^n c^{|i-j|} &= \sum_{i=1}^n c^{0} + 2\sum_{i=1}^n\sum_{j=1}^{i-1} c^{i-j} \\ &= n + 2\sum_{i=1}^n\sum_{k=1}^{i-1} c^{k} \qquad (k = i - j) \\ &= n + 2\sum_{i=1}^n \frac{c - c^{i}}{1-c} \\ &= n + \frac{2cn}{1-c} - \frac{2}{1-c}\sum_{i=1}^n c^{i} \\ &= n\,\frac{1+c}{1-c} - \frac{2c(1-c^{n})}{(1-c)^2} \end{align}$$
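A quick numerical check of this closed form (purely illustrative): compare a brute-force double sum of $c^{|i-j|}$ against $n\frac{1+c}{1-c} - \frac{2c(1-c^n)}{(1-c)^2}$ for a few values of $n$.

```python
import numpy as np

# Compare the brute-force double sum of c^{|i-j|} with the closed form
# n(1+c)/(1-c) - 2c(1-c^n)/(1-c)^2 derived above.
c = 0.7
for n in (1, 5, 50, 500):
    i, j = np.indices((n, n)) + 1
    brute = (c ** np.abs(i - j)).sum()
    closed = n * (1 + c) / (1 - c) - 2 * c * (1 - c**n) / (1 - c) ** 2
    print(f"n = {n:>3}:  brute = {brute:.6f}   closed = {closed:.6f}")
```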

Thus,

$$\lim_{n\rightarrow\infty} \mathbb{P}\bigg[\Big|\frac{S_n}{n} - \mathbb{E}[X_1]\Big| > \epsilon\bigg] \leq \lim_{n\rightarrow\infty} \frac{\text{Var}\Big(\frac{S_n}{n}\Big)}{\epsilon^2} \leq \lim_{n\rightarrow\infty} \frac{n\,\frac{1+c}{1-c} - \frac{2c(1-c^{n})}{(1-c)^2}}{n^2 \epsilon^2} = 0, $$

since for $|c| < 1$ the term $c^n$ stays bounded and the numerator grows only linearly in $n$.

Since probabilities are nonnegative, the limit is exactly $0$; and since our choice of $\epsilon > 0$ was arbitrary, this shows that $\frac{S_n}{n} \rightarrow \mathbb{E}[X_1]$ in probability, as desired.

This proves the theorem for $|c| < 1$, but not for $c = 1$. The argument extends easily to every case in which $|\text{Cov}(X_i,X_j)| \le f_{|i-j|}$ where $\lim_{k\to\infty} f_k = 0$. Indeed, for each $k \geq 1$ there are exactly $2(n-k)$ pairs $(i,j)$ with $|i-j| = k$, so $$\frac{1}{n^2}\sum_{i=1}^n\sum_{j=1}^n \text{Cov}(X_i,X_j) \le \frac{1}{n^2}\sum_{i=1}^n\sum_{j=1}^n f_{|i-j|} = \frac{n f_0 + 2\sum_{k=1}^{n-1}(n-k) f_k}{n^2} \le \frac{f_0}{n} + \frac{2}{n}\sum_{k=1}^{n-1} f_k \xrightarrow[n\to\infty]{} 0,$$ where the last limit holds because the Cesàro means of a sequence tending to $0$ also tend to $0$.
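To illustrate the generalization numerically (an arbitrary example, not part of the proof), take $f_k = \frac{1}{1+k}$, which tends to $0$ but is not even summable; the normalized double sum still vanishes. The sketch below uses the pair count from the argument above.

```python
import numpy as np

# Take f_k = 1/(1+k): it tends to 0 but is not summable. Using the pair
# count (n pairs with k = 0 and 2(n-k) pairs with |i-j| = k >= 1), the
# normalized double sum (1/n^2) * sum_{i,j} f_{|i-j|} still vanishes.
def normalized_double_sum(n: int) -> float:
    k = np.arange(1, n)
    f = 1.0 / (1.0 + k)
    total = n * 1.0 + 2.0 * ((n - k) * f).sum()   # n * f_0 with f_0 = 1
    return total / n**2

for n in (10, 1_000, 100_000, 1_000_000):
    print(f"n = {n:>8}:  {normalized_double_sum(n):.6f}")
```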