[Math] Checking the Lindeberg condition (central limit theorem)

Tags: central limit theorem, normal distribution, probability distributions, probability theory, probability limit theorems

Problem. Let $W_1, W_2,\dots$ be independent and identically distributed random variables such that $E(W_1)=0$ and $\sigma^2 := V(W_1) \in (0,\infty)$. Let $T_n = \frac{1}{\sqrt{n}} \sum_{j=1}^n a_j W_j$, where $a_j\neq 0$ for all $j\in \Bbb{N}$. If $$\lim_{n\to \infty} \frac{\max_{j=1,\dots,n}|a_j|}{\sqrt{\sum_{j=1}^na_j^2}}=0,$$ then $$\frac{T_n}{\sqrt{V(T_n)}} \longrightarrow_d N(0,1) \quad\text{(convergence in distribution)}.$$
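Before attempting a proof, the claim can be sanity-checked numerically. The following is a minimal Monte Carlo sketch under assumed illustrative choices that are not part of the problem statement: $a_j = \sqrt{j}$ (which satisfies the condition, since $\max_j |a_j| / \sqrt{\sum_j a_j^2} = \sqrt{n}/\sqrt{n(n+1)/2} \to 0$) and Rademacher $W_j$ (so $E(W_1)=0$, $\sigma^2 = 1$). It simulates $T_n/\sqrt{V(T_n)}$ and checks that its sample mean, variance, and median behave like those of $N(0,1)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 500, 20000

# Illustrative choice: a_j = sqrt(j); max|a_j|/s_n = sqrt(n)/sqrt(n(n+1)/2) -> 0
a = np.sqrt(np.arange(1, n + 1))
sigma2 = 1.0  # Rademacher W_j: E(W) = 0, V(W) = 1

# Simulate T_n = (1/sqrt(n)) * sum_j a_j W_j, many independent replications
W = rng.choice([-1.0, 1.0], size=(reps, n))
T = (W * a).sum(axis=1) / np.sqrt(n)

# Exact variance: V(T_n) = sigma^2 * sum_j a_j^2 / n
var_T = sigma2 * (a ** 2).sum() / n
Z = T / np.sqrt(var_T)

print(Z.mean(), Z.var(), (Z <= 0).mean())  # should be close to 0, 1, 0.5
```

The sample statistics land close to their standard normal targets, consistent with the claimed convergence.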

Here is my attempt:
If we define $X_{nj}= \frac{a_j}{\sqrt{n}}W_j$ for $j=1,…,n$, then $T_n=\sum_{j=1}^n X_{nj}$. So if we can check that the Lindeberg condition holds for this triangular array, then the central limit theorem of Lindeberg-Feller implies the claim that $\frac{T_n}{\sqrt{V(T_n)}} \longrightarrow_d N(0,1)$.
To this end, we need to show that for any $\varepsilon>0$, $$\lim_{n\to \infty} \frac{1}{\sigma_n^2} \sum_{j=1}^n E( X_{nj}^2 \cdot \mathbf 1\{ |X_{nj}|\gt \varepsilon \sigma_n \})=0,$$ where $\sigma_n^2=\sum_{j=1}^n V(X_{nj}) = \frac{\sigma^2}{n}\sum_{j=1}^n a_j^2$.
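The Lindeberg sum above can be computed exactly in a toy case, which helps build intuition for why it vanishes. A sketch under assumed illustrative choices (not from the problem): $a_j = \sqrt{j}$ and Rademacher $W_j$, for which $E(W^2 \cdot \mathbf 1\{|W|>t\})$ is simply $1$ if $t<1$ and $0$ otherwise, so every term of the sum is available in closed form:

```python
import numpy as np

def lindeberg_ratio(n, eps=0.1):
    # Illustrative case: a_j = sqrt(j), W_j Rademacher (|W| = 1, sigma^2 = 1)
    a2 = np.arange(1, n + 1).astype(float)      # a_j^2 = j
    sigma_n2 = a2.sum() / n                     # sigma_n^2 = (1/n) sum a_j^2
    # Truncation threshold for the j-th term: eps * sigma_n * sqrt(n) / |a_j|
    thresh = eps * np.sqrt(sigma_n2 * n) / np.sqrt(a2)
    # For Rademacher W: E(W^2 * 1{|W| > t}) = 1 if t < 1 else 0
    terms = (a2 / n) * (thresh < 1.0)
    return terms.sum() / sigma_n2

for n in (10, 100, 1000):
    print(n, lindeberg_ratio(n))
```

For small $n$ every indicator fires and the ratio equals $1$; as $n$ grows the thresholds $\varepsilon\sigma_n\sqrt n/|a_j|$ exceed $1$ and the ratio drops to $0$ exactly, illustrating the limit we need to prove.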

I have not been able to prove this and would be very thankful for any help.

Best Answer

Let $m_n=\max\limits_{1\le j\le n}|a_j|$ and $s_n^2=\sum\limits_{j=1}^na_j^2$, and let $W$ be a random variable with the same distribution as each $W_j$. We bound the quantity in Lindeberg's condition as follows: $$\begin{align} \sum_{j=1}^nE( X_{nj}^2 \cdot \mathbf 1\{ |X_{nj}|\gt \varepsilon \sigma_n \}) &= \sum_{j=1}^n \frac{a_j^2}{n} E\left( W^2 \cdot \mathbf 1\{ |W|\gt \frac{\varepsilon \sigma_n \sqrt n}{|a_j|} \}\right) \\&\leq\frac{s_n^2}{n}E\left( W^2 \cdot \mathbf 1\{ |W|\gt \frac{\varepsilon \sigma_n \sqrt n}{m_n} \}\right) \\&=\frac{s_n^2}{n}E\left( W^2 \cdot \mathbf 1\{ \frac{|W|}{\sigma}\gt \frac{\varepsilon s_n}{m_n} \}\right), \end{align}$$ where the inequality holds because $|a_j|\leq m_n$ enlarges each indicator, and the last equality uses $\sigma_n\sqrt n=\sigma s_n$. Since $1/\sigma_n^2=n/(\sigma^2 s_n^2)$, this yields $$\begin{align} \frac{1}{\sigma_n^2} \sum_{j=1}^n E( X_{nj}^2 \cdot \mathbf 1\{ |X_{nj}|\gt \varepsilon \sigma_n \}) &\leq \frac{1}{\sigma^2} E\left( W^2 \cdot \mathbf 1\{ \frac{|W|}{\sigma}\gt \frac{\varepsilon s_n}{m_n} \}\right) \\&= E\left( \frac{W^2}{\sigma^2} \cdot \mathbf 1\{ \frac{|W|}{\sigma}\gt \frac{\varepsilon s_n}{m_n} \}\right). \quad(*) \end{align}$$ Since $m_n/s_n\to0$ as $n\to\infty$, the ratio $\varepsilon s_n/m_n$ tends to infinity, hence $\mathbf 1\{ |W|/\sigma\gt \varepsilon s_n/m_n \}\to0$ almost surely. Since $W$ is square integrable, Lebesgue's dominated convergence theorem (with dominating function $W^2/\sigma^2$) shows that the expression $(*)$ converges to $0$ as $n\to \infty$, QED.
