Central Limit Theorem – Understanding for Asymptotically i.i.d. Random Variables

Tags: asymptotics, central limit theorem, convergence, iid

I observe a sequence of random variables $X_1, X_2, \dots$ where each $X_i$ depends on the sample size $n$.

When $n \rightarrow \infty$ I have the following result: $X_1 \stackrel{d}{\rightarrow} E_1, X_2 \stackrel{d}{\rightarrow} E_2, \dots$, where the $E_i$ are i.i.d. random variables with mean $\mu$, variance $\sigma^2 < \infty$, and common density $f_E$. Moreover,

  • $$\lim_{n \rightarrow \infty}E(X_i) = \mu$$
  • $$\lim_{n \rightarrow \infty}V(X_i) = \sigma^2$$
  • $$\textrm{Cov}(X_i, X_j) = O(n^{-1})$$

Denoting $\bar X_n = n^{-1}\sum_{i = 1}^n X_i$, can I claim (a Lindeberg-Lévy-type CLT) that

$$
\sqrt{n}\left(\bar{X}_{n}-\mu\right) \stackrel{d}{\rightarrow} \mathcal{N}\left(0, \sigma^{2}\right) ?
$$

In other words: if the dependence between the elements of the summation fades only asymptotically, does the CLT still hold?

Best Answer

The problem here is that convergence in distribution is too weak and provides no insurance against correlations between the $X_i$: it constrains only the marginal law of each $X_i$. The condition that the $E_i$ are independent does not help. In fact, the $X_i$ could all be the exact same random variable and still converge in distribution to the i.i.d. $E_i$. You will need a stronger form of convergence to get the $X_i$ to become less dependent as $n$ increases: convergence in probability at minimum, or perhaps something stronger.
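To make that point concrete, here is a minimal simulation sketch (my construction, not the answer's) of the extreme case: every $X_i$ is literally the same draw $Z \sim \mathcal{N}(\mu, \sigma^2)$, so each $X_i$ has exactly the law of the $E_i$, yet $\sqrt{n}(\bar X_n - \mu) = \sqrt{n}(Z - \mu)$ spreads out like $\sqrt{n}\,\sigma$ instead of converging to $\mathcal{N}(0, \sigma^2)$. (This extreme case also violates the $O(n^{-1})$ covariance condition, which the next paragraph takes up.) The values of $\mu$ and $\sigma$ are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 2.0, 1.5       # arbitrary values for the demo
n_reps = 20_000            # Monte Carlo replications

for n in (10, 100, 1_000, 10_000):
    # Every X_i equals the same draw Z ~ N(mu, sigma^2),
    # so the sample mean xbar is just Z itself.
    Z = rng.normal(mu, sigma, size=n_reps)
    xbar = Z
    stat = np.sqrt(n) * (xbar - mu)
    # Under the hoped-for CLT this sd would settle near sigma = 1.5;
    # instead it grows like sqrt(n) * sigma.
    print(f"n={n:6d}  sd of sqrt(n)*(xbar - mu) = {stat.std():.2f}")
```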

The condition of $O(n^{-1})$-decaying correlation unfortunately doesn't help either, as the central limit theorem does not hold for merely uncorrelated random variables.
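For a construction that does respect all three stated conditions (again mine, not the answer's), take $X_i = E_i + W/\sqrt{n}$ with $E_i$ i.i.d. $\mathcal{N}(\mu, \sigma^2)$ and an independent common shock $W \sim \mathcal{N}(0, \tau^2)$. Then $X_i \stackrel{d}{\rightarrow} E_i$, $E(X_i) = \mu$, $V(X_i) = \sigma^2 + \tau^2/n \rightarrow \sigma^2$ and $\textrm{Cov}(X_i, X_j) = \tau^2/n = O(n^{-1})$, yet $\sqrt{n}(\bar X_n - \mu) = \sqrt{n}(\bar E_n - \mu) + W \stackrel{d}{\rightarrow} \mathcal{N}(0, \sigma^2 + \tau^2)$, not $\mathcal{N}(0, \sigma^2)$. A quick numerical check:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, tau = 0.0, 1.0, 2.0   # arbitrary values for the demo
n, n_reps = 2_000, 5_000

# X_i = E_i + W / sqrt(n): the common shock W is shared by every term,
# but its covariance contribution is tau^2 / n = O(1/n).
E = rng.normal(mu, sigma, size=(n_reps, n))
W = rng.normal(0.0, tau, size=n_reps)
xbar = E.mean(axis=1) + W / np.sqrt(n)

stat = np.sqrt(n) * (xbar - mu)
print(f"empirical var of sqrt(n)*(xbar - mu): {stat.var():.2f}")
print(f"sigma^2 = {sigma**2:.1f}, sigma^2 + tau^2 = {sigma**2 + tau**2:.1f}")
# The empirical variance sits near sigma^2 + tau^2 = 5, not sigma^2 = 1.
```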

For what it's worth, some central limit theorems work under different assumptions than Lindeberg-Lévy, such as the martingale CLT.
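For a flavour of the latter, here is a small sketch (my example, not the answer's): with i.i.d. standard normal $\varepsilon_i$, the products $d_i = \varepsilon_i \varepsilon_{i-1}$ are dependent but form a martingale difference sequence, and a martingale CLT gives $\sqrt{n}\,\bar d_n \stackrel{d}{\rightarrow} \mathcal{N}(0, 1)$.

```python
import numpy as np

rng = np.random.default_rng(2)
n, n_reps = 2_000, 2_000

# d_i = eps_i * eps_{i-1}: uncorrelated but dependent, with
# E[d_i | past] = eps_{i-1} * E[eps_i] = 0 (martingale differences).
eps = rng.standard_normal(size=(n_reps, n + 1))
d = eps[:, 1:] * eps[:, :-1]

stat = np.sqrt(n) * d.mean(axis=1)
print(f"mean = {stat.mean():.3f}, var = {stat.var():.3f}   (limit: 0 and 1)")
```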
