Proof of more general strong law of large numbers for dependent random variables

law-of-large-numbers, limits, probability, probability-theory, probability-limit-theorems

I want to show the following version of the strong law of large numbers. This is the longest proof I've ever attempted, and I feel a bit overwhelmed.

Let $X_i$ be a sequence of real-valued
random variables with $E(X_i^2)<\infty$ for all $i$. If there is a sequence $c_k\subseteq[0,\infty)$ such that $C=\sum_{k=0}^\infty c_k <\infty$ and $\operatorname{Cov}(X_i,X_j)\le c_{|i-j|}$, then
$$\lim_{n\rightarrow\infty}\frac1n\sum_{i=1}^n(X_i-E(X_i))=0$$

I have the following proof outline provided to me:

    1. Show $V(\sum_{i=a+1}^bX_i)\le 2(b-a)C$ (Done)
    2. Subsequence: Show
       $$\lim_{k\rightarrow\infty}\frac1{k^2}\sum_{i=1}^{k^2}(X_i-E(X_i))=0\text{ a.s.}$$
    3. Fluctuation: Show
       $$\lim_{k\rightarrow\infty}\frac1{k^2}\max\left\{\left\lvert\sum_{i=k^2+1}^m(X_i-E(X_i))\right\rvert:m\in\{k^2,\dots,k^2+2k\}\right\} = 0\text{ a.s.}$$
    4. Deduce the claim

I have successfully shown (1). I'm not sure it's correct, but I think I've shown (2) like so: For sufficiently small $\delta$ we have
$$\begin{aligned}P\left(\left\lvert\lim_{k\rightarrow\infty}\frac1{k^2}\sum_{i=1}^{k^2}(X_i-E(X_i))\right\rvert > 0\right)
&= P\left(\left\lvert\lim_{k\rightarrow\infty}\frac1{k^2}\sum_{i=1}^{k^2}(X_i-E(X_i))\right\rvert \ge \delta\right)\\
&\le \lim_{k\rightarrow\infty}\frac1{k^4}\frac{V\left(\sum_{i=1}^{k^2}(X_i-E(X_i))\right)}{\delta^2} = 0
\end{aligned}$$

Is this valid? I'm not sure about the way I rewrote the inequality and pulled the limit out of the variance.

I have no ideas for how to show (3), I don't know how to "break" that maximum. Also, the final step is not obvious to me, again because the max-statement confuses me.

How can I close out this proof? Does anyone know a reference for a proof of this law?

Best Answer

First of all, let us make a simplification. Set $Y_i=X_i-EX_i$ and note that $EY_i=0$ and $\operatorname{Cov}(Y_i, Y_j)=\operatorname{Cov}(X_i, X_j)$. The conclusion of your theorem can then be recast as $\frac{1}{n}\sum_{i=1}^n Y_i\to 0$ almost surely.

The first step in such proofs is to establish a weaker result, namely that $S_n:=\frac{1}{n}\sum_{i=1}^n Y_i\to 0$ in probability. The usual tool for showing convergence in probability is Chebyshev's inequality. In fact, once you have a quantitative estimate from Chebyshev's inequality, you can write down an explicit sequence $k_n$ such that $S_{k_n}\to 0$ almost surely (this argument usually requires Borel–Cantelli). In your case, as we will see, it suffices to take $k_n=n^2.$

Once you have the a.s. convergence along the subsequence $k_n,$ you want to strengthen it to the full sequence. The argument for this part usually goes as follows. For any $m,$ let $n(m)$ be such that $k_{n(m)}\le m< k_{n(m)+1}.$ By the triangle inequality, $|S_m|\le |S_m-S_{k_{n(m)}}|+|S_{k_{n(m)}}|.$ We already know that $S_{k_{n(m)}}\to 0$ a.s., so it suffices to show that $|S_m-S_{k_{n(m)}}|\to 0$ almost surely. This is the fluctuation term. In simple cases like the present one, it is often handled by Chebyshev's inequality together with Borel–Cantelli.

Let us now see the above principles in action. You already have $V(\sum_{i=m+1}^{n}Y_i)\le 2(n-m)C.$ Use Chebyshev's inequality to obtain $$P\left(\frac{1}{n}\left|\sum_{i=1}^{n}Y_i\right|\ge \epsilon\right)\le \frac{V(\sum_{i=1}^{n}Y_i)}{n^2\epsilon^2}\le\frac{2nC}{n^2\epsilon^2}=\frac{2C}{n\epsilon^2}.$$

This already tells us that $S_n\to 0$ in probability. It also tells us that if we take $k_n=n^2,$ then $P(|S_{k_n}|\ge \epsilon)\le \frac{2C}{n^2\epsilon^2}.$ Since this bound is summable in $n,$ Borel–Cantelli gives $S_{k_n}\to 0$ almost surely. (Note that any sequence $k_n$ with $\sum_n k_n^{-1}<\infty$ would work.) This corresponds to Step 2 in your outline.
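Spelled out for completeness, the Borel–Cantelli argument here is the standard one:
$$\sum_{n=1}^\infty P\left(|S_{n^2}|\ge\epsilon\right)\le\sum_{n=1}^\infty\frac{2C}{n^2\epsilon^2}<\infty
\quad\Longrightarrow\quad P\left(|S_{n^2}|\ge\epsilon\text{ for infinitely many }n\right)=0.$$
Taking $\epsilon=1/j$ for $j\in\mathbb N$ and a countable union of null sets, we conclude $S_{n^2}\to 0$ almost surely.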

Before going to the third step, first note that $$|S_m|\le |S_m-S_{k^2}|+|S_{k^2}|$$ where $k$ is such that $k^2\le m<(k+1)^2.$

The second term, $S_{k^2},$ tends to $0$ almost surely, so we are concerned with the first term. Let us look at it more closely. Observe that $$|S_m-S_{k^2}|\le \left|\sum_{i=1}^{k^2}Y_i\left(\frac{1}{k^2}-\frac{1}{m}\right)\right|+\frac{1}{m}\left|\sum_{i=k^2+1}^{m}Y_i\right|\le |S_{k^2}|+\frac{1}{m}\left|\sum_{i=k^2+1}^{m}Y_i\right|,$$ where the last step uses $0\le \frac{1}{k^2}-\frac{1}{m}\le\frac{1}{k^2}.$

Once again $S_{k^2}\to 0$ almost surely. Therefore, we now focus on the 'fluctuation' $\frac{1}{m}\left|\sum_{i=k^2+1}^{m}Y_i\right|.$ Apply Chebyshev's inequality to obtain $$P\left(\frac{1}{m}\left|\sum_{i=k^2+1}^{m}Y_i\right|>\epsilon\right)\le \frac{V(\sum_{i=k^2+1}^{m}Y_i)}{m^2\epsilon^2}\le \frac{2C(m-k^2)}{m^2\epsilon^2}\le \frac{4Ck}{k^4\epsilon^2}\le \frac{C_{\epsilon}}{k^3},$$ using $m-k^2\le 2k$ and $m\ge k^2.$
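To connect this to the maximum in Step 3 of your outline: a union bound over the at most $2k+1$ values of $m$ with $k^2\le m\le k^2+2k$ gives
$$P\left(\max_{k^2\le m\le k^2+2k}\frac{1}{m}\left|\sum_{i=k^2+1}^{m}Y_i\right|>\epsilon\right)\le\sum_{m=k^2}^{k^2+2k}\frac{2C(m-k^2)}{m^2\epsilon^2}\le (2k+1)\cdot\frac{4C}{k^3\epsilon^2}\le\frac{C'_\epsilon}{k^2},$$
which is summable in $k,$ so Borel–Cantelli applies along $k$ and the maximum tends to $0$ almost surely. This is how the maximum "breaks".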

These bounds are summable over $m$ (each $k$ contributes at most $2k+1$ values of $m,$ and $\sum_k (2k+1)/k^3<\infty$), so the Borel–Cantelli lemma gives $\frac{1}{m}\left|\sum_{i=k^2+1}^{m}Y_i\right|\to 0$ almost surely. This completes the proof.
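Not part of the proof, but a quick numerical sanity check under assumed parameters: an MA(1) sequence $X_i = Z_i + \theta Z_{i-1}$ with i.i.d. standard normal $Z_i$ (the choice $\theta=0.7$ is arbitrary) satisfies your hypothesis with $c_0=1+\theta^2,$ $c_1=|\theta|,$ and $c_k=0$ otherwise, so the running mean should tend to $0$:

```python
import numpy as np

# Sanity check (not part of the proof): simulate an MA(1) sequence
# X_i = Z_i + theta * Z_{i-1} with iid N(0,1) innovations Z_i.
# Cov(X_i, X_j) = 0 for |i-j| > 1, so the covariance-summability
# hypothesis holds with c_0 = 1 + theta^2, c_1 = |theta|, c_k = 0 otherwise,
# and E X_i = 0, so S_n = (1/n) * sum_{i<=n} X_i should tend to 0.
rng = np.random.default_rng(0)
theta = 0.7                      # assumed MA(1) coefficient, chosen arbitrarily
n = 1_000_000
z = rng.standard_normal(n + 1)   # innovations Z_0, ..., Z_n
x = z[1:] + theta * z[:-1]       # X_1, ..., X_n
running_mean = np.cumsum(x) / np.arange(1, n + 1)
for n_check in (100, 10_000, 1_000_000):
    print(n_check, running_mean[n_check - 1])
```

The running mean visibly shrinks toward $0$ as $n$ grows, consistent with the theorem.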
