Central limit theorem for random variables with exponentially decaying covariance

Tags: central-limit-theorem, covariance, probability, probability-theory, probability-limit-theorems

Let $X_1,X_2,\dots$ be i.i.d. bounded random variables with $\mathbb{E}[X_1]=0$. In addition, let $C_1,C_2>0$ be constants and define $\{d_{i,j}\}_{i,j\in \mathbb{N}}$ by
$$d_{i,j} = C_1e^{-C_2|i-j|}.$$

Now, define
$$Y_n = \sum_{j=1}^n d_{n,j} X_n\cdot X_j $$

Does
$$\frac{Y_1+…+Y_n}{\sqrt{n}} \overset{d}{\longrightarrow} N(0,\sigma^2)?$$

In general, exponential decay of the covariance of the $Y_i$ is not enough on its own. However, I think that for this particular type of random variable it should work, but I am not sure how to prove it.
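(A quick side computation, not in the original question, may be worth recording here: by independence and $\mathbb{E}[X_1]=0$, the $Y_n$ as written are not centered, which is why the answer below works with centered sums:
$$\mathbb{E}[Y_n]=\sum_{j=1}^n d_{n,j}\,\mathbb{E}[X_nX_j]=d_{n,n}\,\mathbb{E}[X_n^2]=C_1\,\mathbb{E}[X_1^2].)$$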

Thanks!

Best Answer

Under the conditions stated above, the following conclusion holds: \begin{equation*} \frac{1}{\sqrt{n}}\sum_{k=1}^{n}\bigl(Y_k-C_1\mathsf{E}[X_1^2]\bigr)\overset{d}{\longrightarrow} N(0,\sigma^2). \tag{1} \end{equation*} This can be proved with the CLT for arrays of martingale differences (MD); cf. P. Hall and C. C. Heyde, Martingale Limit Theory and Its Application, Academic Press (1980), Theorem 3.2, pp. 58 ff. The following is an outline of the proof.

Denote \begin{equation*} W_0=0, \qquad W_{k-1}=\sum_{j=1}^{k-1}d_{k,j}X_j , \quad k\ge 2. \end{equation*} Then, since $d_{k,k}=C_1$, \begin{gather*} Y_k=\Big(\sum_{j=1}^{k}d_{k,j}X_j \Big)X_k =W_{k-1}X_k+d_{k,k}X_k^2,\\ \mathsf{E}[Y_k]=d_{k,k}\mathsf{E}[X_k^2]=C_1\mathsf{E}[X_1^2] \overset{\triangle}{=}m. \end{gather*} Let \begin{gather*} Z_{n,k}=\frac{1}{\sqrt{n}}(Y_k-m), \quad 1\le k\le n,\ n\ge 1,\\ \mathscr{F}_{k}=\sigma\{ X_j,\,1\le j\le k\}\vee\mathscr{N},\quad 1\le k\le n,\ n\ge 1. \end{gather*} Since $W_{k-1}$ is $\mathscr{F}_{k-1}$-measurable and $X_k$ is independent of $\mathscr{F}_{k-1}$, \begin{equation*} \mathsf{E}[Z_{n,k}|\mathscr{F}_{k-1}] =\frac{1}{\sqrt{n}}\bigl(W_{k-1}\mathsf{E}[X_k]+ d_{k,k}\mathsf{E}[X_k^2]-m\bigr)=0, \end{equation*} so $ Z=\{ Z_{n,k}, \mathscr{F}_{k},\ 1\le k\le n,\ n\ge 1\}$ is an MD array (array of martingale differences). We now verify that $Z$ satisfies the conditions under which $S_n=\sum_{k\le n}Z_{n,k} \overset{d}{\to} N(0,\sigma^2)$.
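For reference, one commonly quoted set of sufficient conditions for an MD-array CLT with nested $\sigma$-fields is the following (a paraphrase, not necessarily the exact hypotheses of the cited Theorem 3.2): \begin{gather*} \sum_{k=1}^{n}\mathsf{E}\bigl[Z_{n,k}^2\,\mathbf{1}\{|Z_{n,k}|>\varepsilon\}\,\big|\,\mathscr{F}_{k-1}\bigr]\overset{P}{\longrightarrow}0 \quad\text{for every } \varepsilon>0 \quad\text{(conditional Lindeberg condition)},\\ \sum_{k=1}^{n}\mathsf{E}\bigl[Z_{n,k}^2\,\big|\,\mathscr{F}_{k-1}\bigr]\overset{P}{\longrightarrow}\sigma^2 \quad\text{(convergence of conditional variances)}. \end{gather*} Under these two conditions, $S_n=\sum_{k\le n}Z_{n,k}\overset{d}{\longrightarrow}N(0,\sigma^2)$. Bound (2) below takes care of the first condition, and (5) establishes the second.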

First, since the $\{X_i,\,i\ge 1\}$ are bounded and $\sum_{j\le k}d_{k,j}$ is bounded uniformly in $k$, the $\{W_i,\,i\ge 1\}$ and $\{Y_i,\,i\ge 1\}$ are bounded too, and \begin{equation*} \max_{1\le k\le n}|Z_{n,k}|\le \frac{C}{\sqrt{n}}, \tag{2} \end{equation*} where, here and below, $C$ denotes a constant independent of $k$ and $n$ whose value may change from one expression to the next.
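To spell this out (an elementary bound; the symbol $M$ for a bound on $|X_i|$ is introduced here and is not in the original), \begin{equation*} |W_{k-1}|\le C_1 M\sum_{i=1}^{\infty}e^{-C_2 i}=\frac{C_1 M e^{-C_2}}{1-e^{-C_2}},\qquad |Y_k|\le M\,|W_{k-1}|+C_1M^2 , \end{equation*} so indeed $|Z_{n,k}|\le C/\sqrt{n}$. Consequently, for any $\varepsilon>0$ and all $n>(C/\varepsilon)^2$ the indicators $\mathbf{1}\{|Z_{n,k}|>\varepsilon\}$ vanish, and the conditional Lindeberg condition stated above holds trivially.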

Secondly, a direct calculation shows that \begin{align*} \lim_{n\to\infty}\frac1n\sum_{j=1}^{n}W_j&=0, \quad \text{a.s.}, \tag{3}\\ \lim_{n\to\infty}\frac1n\sum_{j=1}^{n}W_j^2&=b>0,\quad\text{a.s.} \tag{4} \end{align*}
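For what it is worth, the second-moment computation behind (4) is straightforward (using independence and $\mathsf{E}[X_j]=0$) and suggests an explicit value of $b$, although that value is not needed in what follows: \begin{equation*} \mathsf{E}[W_m^2]=C_1^2\,\mathsf{E}[X_1^2]\sum_{j=1}^{m}e^{-2C_2(m+1-j)} = C_1^2\,\mathsf{E}[X_1^2]\sum_{i=1}^{m}e^{-2C_2 i} \longrightarrow \frac{C_1^2\,\mathsf{E}[X_1^2]\,e^{-2C_2}}{1-e^{-2C_2}}, \end{equation*} so one expects $b=\dfrac{C_1^2\,\mathsf{E}[X_1^2]\,e^{-2C_2}}{1-e^{-2C_2}}$ in (4).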

Hence, \begin{align*} &\mathsf{E}[Z_{n,k}^2|\mathscr{F}_{k-1}]\\ &\quad =\frac1n\mathsf{E}[(W_{k-1}X_k+C_1(X_k^2-\mathsf{E}[X_k^2]))^2 \mid \mathscr{F}_{k-1}]\\ &\quad =\frac1n\Bigl\{W_{k-1}^2\mathsf{E}[X_k^2]+C_1^2\mathsf{E}[(X_k^2-\mathsf{E}[X_k^2])^2]\\ &\qquad\qquad +2C_1W_{k-1}\mathsf{E}[(X_k^2-\mathsf{E}[X_k^2])X_k]\Bigr\},\\ &\sum_{k=1}^{n}\mathsf{E}[Z_{n,k}^2|\mathscr{F}_{k-1}]\\ &\quad = \frac{\mathsf{E}[X_1^2]}n \sum_{k=1}^{n}W_{k-1}^2 + C_1^2 \mathsf{E}[(X_1^2-\mathsf{E}[X_1^2])^2] \\ &\qquad + \frac{2C_1\mathsf{E}[(X_1^2-\mathsf{E}[X_1^2])X_1]}n \sum_{k=1}^{n}W_{k-1}\\ &\quad \to b\,\mathsf{E}[X_1^2]+C_1^2 \mathsf{E}[(X_1^2-\mathsf{E}[X_1^2])^2] \overset{\triangle}{=}\sigma^2, \quad\text{a.s.,} \tag{5} \end{align*} by (3) and (4). Finally, from (2) and (5), the MD-array CLT gives \begin{align*} \frac{1}{\sqrt{n}}\sum_{k=1}^{n}(Y_k-C_1\mathsf{E}[X_1^2]) =\sum_{k=1}^{n}Z_{n,k}=S_n\overset{d}{\longrightarrow}N(0,\sigma^2), \end{align*} i.e. (1) holds.
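As an optional numerical sanity check of (1) (not part of the proof), here is a simulation sketch with Rademacher $X_i$; the constants, sample size, and number of Monte Carlo paths are arbitrary illustration choices of mine.

```python
import numpy as np

# Monte Carlo sanity check of (1): simulate S_n = n^{-1/2} * sum_{k<=n} (Y_k - C1*E[X_1^2])
# for bounded i.i.d. X_i (Rademacher, so E[X]=0 and E[X^2]=1) and inspect mean/variance.
# Uses the decomposition Y_k = W_{k-1} X_k + C1 X_k^2 together with the recursion
# W_k = exp(-C2) * (W_{k-1} + C1 X_k), which follows from d_{k,j} = C1*exp(-C2*(k-j)).
# All parameter values are arbitrary illustration choices.
rng = np.random.default_rng(0)
C1, C2 = 1.0, 0.5
n, n_paths = 2000, 4000                   # terms per path, number of Monte Carlo paths

X = rng.choice([-1.0, 1.0], size=(n_paths, n))
m = C1 * 1.0                              # C1 * E[X_1^2] (= C1 for Rademacher X)

W = np.zeros(n_paths)                     # W_0 = 0
S = np.zeros(n_paths)                     # running sums of (Y_k - m)
for k in range(n):
    Xk = X[:, k]
    S += W * Xk + C1 * Xk**2 - m          # add Y_k - m
    W = np.exp(-C2) * (W + C1 * Xk)       # update W_k from W_{k-1}

S /= np.sqrt(n)
print("mean over paths:", S.mean())       # should be close to 0
print("variance over paths:", S.var())    # Monte Carlo estimate of sigma^2
# If the identification of b sketched above is right, then for Rademacher X
# sigma^2 = C1**2 * exp(-2*C2) / (1 - exp(-2*C2)), roughly 0.58 for these constants.
```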
