Proving a “Central Limit Theorem” for an iid sequence with infinite variance

Tags: central limit theorem, characteristic functions, probability distributions, probability theory

I'm working on the following exercise:

Let $X_1, X_2, \ldots$ be i.i.d. random variables with density $$
f(x) = \frac 1{|x|^3} \mathbb 1_{\mathbb R \setminus [-1,1]}(x).$$
Then $\mathbb E[X_1^2] = \infty$, but there are numbers $A_1, A_2, \ldots$ such that the distribution of $(X_1 + \cdots + X_n)/A_n$ converges weakly to the standard normal distribution $\mathcal N_{0,1}$ with density $\frac 1{\sqrt{2\pi}}e^{-x^2/2}$. Determine one such sequence $(A_n)$ explicitly.
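Both standing claims about $f$ can be verified by direct integration (note that $f$ is symmetric):

```latex
\int_{\mathbb R} f(x)\,dx = 2\int_1^\infty \frac{dx}{x^3} = 1,
\qquad
\mathbb E[X_1^2] = 2\int_1^\infty \frac{x^2}{x^3}\,dx = 2\int_1^\infty \frac{dx}{x} = \infty,
```

and the tail probability, which is used repeatedly below, is

```latex
\mathbb P(|X_1| > t) = 2\int_t^\infty \frac{dx}{x^3} = \frac{1}{t^2}, \qquad t \ge 1.
```

The $t^{-2}$ tail is exactly the borderline case: slightly lighter tails give finite variance and the classical CLT, slightly heavier ones give an $\alpha$-stable limit.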

What I've tried: My strategy has been to follow the steps of the proof of the Lindeberg-Feller Central Limit Theorem. Let $\varphi_{n,l}(t)$ denote the characteristic function of $X_{n,l} := \frac{X_l}{A_n}$, and $\varphi_{n}(t)$ the characteristic function of $(X_1 + \cdots + X_n)/A_n =: S^*_n$. To show the distribution $\mathbb P_{S^*_n}$ converges weakly to $\mathcal N_{0,1}$, we want to show:
$$
\lim_{n \to \infty} \log\varphi_n(t) = -\frac{t^2}2
$$

and then apply Lévy’s Continuity Theorem.

I was able to show that $(X_{n,l})$ forms a null array whenever $A_n \to \infty$ as $n \to \infty$, and so was able to prove the following:

Lemma: $$\lim_{n \to \infty}\left| \log\varphi_n(t) - \sum_{l=1}^n \mathbb E\left[e^{itX_{n,l}} - 1\right]\right| = 0.$$

With this lemma, it's enough to show that
$$
\lim_{n \to \infty}\sum_{l=1}^n \mathbb E\left[e^{itX_{n,l}} - 1\right] = -\frac{t^2}{2}.
$$

If $d\mathbb P_{X_l}/dx = |x|^{-3}\mathbb 1_{\mathbb R \setminus[-1,1]}(x)$, and $X_{n,l} = \frac{X_l}{A_n}$, one can show
$$
\frac{d\mathbb P_{X_{n,l}}}{dx} = \frac{1}{A_n^2 |x|^3} \mathbb 1_{\mathbb R\setminus[-\frac{1}{A_n},\frac{1}{A_n}]}(x).
$$
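This is the usual scaling rule for densities: if $X$ has density $f$ and $A > 0$, then $X/A$ has density $x \mapsto A f(Ax)$. Here,

```latex
A_n f(A_n x) = \frac{A_n}{|A_n x|^3}\,\mathbb 1_{\{|A_n x| \ge 1\}}
= \frac{1}{A_n^2\,|x|^3}\,\mathbb 1_{\mathbb R \setminus [-\frac{1}{A_n},\frac{1}{A_n}]}(x).
```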

It follows that
$$ \sum_{l=1}^n \mathbb E\left[e^{itX_{n,l}} - 1\right] = \sum_{l=1}^n \int_{\mathbb R} (e^{itx}-1)\,\mathbb P_{X_{n,l}}(dx) =\frac{n}{A_n^2} \int_{\mathbb R \setminus[-\frac{1}{A_n},\frac{1}{A_n}]} \frac{e^{itx}-1}{|x|^3} \, dx
$$

My problem: From here, I'm not sure how to choose the $A_n$ so that these integrals converge to $-t^2/2$. All the proofs I know of the Central Limit Theorem and related results use the Taylor expansion of $e^{itx} - 1$, and these rely strongly on the fact that $\mathbb E[X_{n,l}^2] < \infty$ in those cases. Is there a better way to prove the existence of these $A_n$?
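One way to explore this numerically: guess a candidate normalization, plug it into the last display, and evaluate the integral by quadrature. The sketch below (assuming `numpy` and `scipy` are available) tests $A_n = \sqrt{n\log n}$, the choice used in the answer below. By symmetry only the real part $\cos(tx)-1$ of the integrand survives, and writing $1-\cos u = 2\sin^2(u/2)$ avoids catastrophic cancellation for tiny $u$:

```python
import numpy as np
from scipy.integrate import quad

def tail_integral(a):
    """I(a) = \int_a^infty (1 - cos u)/u^3 du for 0 < a < 1, split at u = 1."""
    # On (a, 1): substitute u = e^v, so the ~1/(2u) singularity becomes a
    # bounded integrand (it tends to 1/2 as v -> -infinity).
    low, _ = quad(lambda v: 2.0 * np.sin(np.exp(v) / 2.0) ** 2 * np.exp(-2.0 * v),
                  np.log(a), 0.0)
    # On (1, infinity): truncate at 300; the tail is below \int_300^infty 2/u^3 du ~ 1e-5.
    high, _ = quad(lambda u: 2.0 * np.sin(u / 2.0) ** 2 / u ** 3,
                   1.0, 300.0, limit=400)
    return low + high

def cf_log_sum(n, t):
    """(n/A_n^2) * \int_{|x| >= 1/A_n} (e^{itx} - 1)/|x|^3 dx with A_n = sqrt(n log n)."""
    A = np.sqrt(n * np.log(n))
    # Real part only (imaginary part vanishes by symmetry); substitute u = t x.
    return -2.0 * (n / A ** 2) * t ** 2 * tail_integral(t / A)

for n in (1e4, 1e8, 1e16):
    print(n, cf_log_sum(n, t=1.0))   # creeps toward -t^2/2 = -0.5
```

The values approach $-t^2/2$ only logarithmically in $n$, which already hints at the $\log n$ factor hiding inside $A_n$.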


Best Answer

The following answers your question; it uses general results on limit theorems for sums of independent summands.

Let \begin{gather*} Y_{nk}=\frac{X_k}{A_n}, \qquad 1\le k\le n, \quad n\ge 1,\\ S_n=\sum_{k=1}^{n}Y_{nk}, \end{gather*} where $ \{X_n, n\ge 1 \} $ are i.i.d. random variables with density \begin{equation*} f(x)=\frac{1_{\{|x|\ge 1\}}(x)}{|x|^3}. \end{equation*} Since the $ X_i $ are symmetrically distributed, the relevant general result on convergence of $ S_n $ to the normal law is the following (cf. B. V. Gnedenko & A. N. Kolmogorov, Limit Distributions for Sums of Independent Random Variables, Addison-Wesley (1968), p. 128, Theorem 5.2):

Theorem. In order that the distributions of $ S_n $ converge as $ n\to\infty $ to the normal law $ N(0,1) $ and that the summands $ Y_{nk}\ (1\le k\le n) $ be infinitesimal, it is necessary and sufficient that the following two conditions be satisfied for every $ \epsilon>0 $, as $ n\to\infty $:

(1) $ \displaystyle \sum_{k=1}^n\mathsf{P}(|Y_{nk}|>\epsilon)\to 0, $

(2) $ \displaystyle\sum_{k=1}^n\mathsf{Var}[Y_{nk}1_{|Y_{nk}|\le \epsilon}]\to 1. $

Now let \begin{equation*} A_{n}=\sqrt{n\log n}. \tag{1} \end{equation*} Then $ A_n\to\infty $ as $n\to\infty$, so for each fixed $\epsilon>0$ we have $\epsilon A_n\ge 1$ for all large $n$. For condition (1), \begin{align*} \sum_{k=1}^n\mathsf{P}(|Y_{nk}|>\epsilon)&= n \mathsf{P}\Big(\frac{|X_1|}{A_n} > \epsilon\Big)=n\int_{(\epsilon A_n) \vee 1 }^\infty \frac{2}{x^3}\,\mathrm{d}x\\ &=\frac{n}{\epsilon^2A_n^2}=\frac{1}{\epsilon^2\log n}\to 0 \quad \text{as }n\to\infty,\quad \forall \epsilon>0. \end{align*} For condition (2), note that the $ Y_{nk} $ are symmetric, so $ \mathsf{E}[Y_{nk}1_{|Y_{nk}|\le \epsilon}]=0 $ and the variance equals the truncated second moment: \begin{align*} \sum_{k=1}^n\mathsf{Var}[Y_{nk}1_{|Y_{nk}|\le \epsilon}] &=\frac{n}{A_n^2}\mathsf{E}[X_1^2 1_{\{|X_1|\le \epsilon A_n\}} ] =\frac{n}{A_n^2}\int_{1}^{1\vee (\epsilon A_n)}\frac2x\,\mathrm{d}x\\ &=\frac{2n \log (\epsilon A_n)}{A_n^2} = \frac{n(\log n+\log\log n+2\log\epsilon)}{n\log n}\to 1, \quad \text{as }n\to\infty,\quad \forall \epsilon>0. \end{align*} Hence, for $A_n$ as in (1), $S_n\stackrel{d}{\longrightarrow}N(0,1)$.
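As a quick sanity check (not part of the proof), one can simulate: since $\mathsf P(|X_1|>t)=t^{-2}$ for $t\ge 1$, the inverse-transform method gives $|X_1| \stackrel{d}{=} U^{-1/2}$ with $U$ uniform on $(0,1)$, combined with an independent symmetric sign. A minimal sketch, assuming `numpy` is available:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_X(size):
    # P(|X| > t) = t^{-2} for t >= 1, so |X| = U^{-1/2} by inverse transform;
    # attach an independent symmetric random sign.
    u = rng.random(size)
    signs = rng.choice([-1.0, 1.0], size=size)
    return signs / np.sqrt(u)

n, m = 20_000, 400                    # n terms per sum, m Monte Carlo replicates
A_n = np.sqrt(n * np.log(n))          # the normalization from (1)
S_star = np.array([sample_X(n).sum() / A_n for _ in range(m)])

frac = np.mean(np.abs(S_star) <= 1.0)
print(f"P(|S*_n| <= 1) ~ {frac:.3f}   (N(0,1) gives 0.683)")
```

Because the convergence here is only logarithmic (the truncated variance is roughly $(\log n + \log\log n)/\log n$ times the target), the empirical fraction in $[-1,1]$ sits somewhat below the Gaussian value $0.683$ at any feasible $n$.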
