[Math] Does the convergence of $S_n / \sqrt{n}$ in distribution imply $EX_i = 0$?

central limit theorem, expectation, probability, random variables

Let $X_1, X_2, \dots$ be i.i.d. and let $S_n = X_1 + \cdots + X_n$. If $S_n / \sqrt{n}$ converges in distribution, can I conclude that $EX_i = 0$ without the assumption $EX_i^2 < \infty$?

If $EX_i^2 < \infty$, then it easily follows from the central limit theorem. But in this case I cannot use the central limit theorem.
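To spell out that easy direction, here is a short sketch in my own notation (with $\mu := EX_1$ and $\sigma^2 := \operatorname{Var}(X_1)$ assumed finite):

```latex
% Easy direction, assuming E[X_1^2] < \infty.  Decompose:
\[
  \frac{S_n}{\sqrt n} \;=\; \frac{S_n - n\mu}{\sqrt n} \;+\; \mu\sqrt n .
\]
% By the CLT, (S_n - n*mu)/sqrt(n) converges in distribution to sigma*N with
% N standard normal, while the deterministic term mu*sqrt(n) diverges unless
% mu = 0.  A sequence converging in distribution is tight, so S_n/sqrt(n)
% cannot carry a divergent drift; hence mu = 0.
```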

In fact, according to Exercise 3.4.3 in [Probability: Theory and Examples, Durrett], convergence of $S_n / \sqrt{n}$ in distribution implies $EX_i^2 < \infty$, and thus $EX_i = 0$. However, I'm trying to solve this exercise using the sketch of the proof given in the book: I assume $EX_i^2 = \infty$ and try to find a contradiction, and along the way I need to derive $EX_i = 0$ from the convergence of $S_n / \sqrt{n}$ in distribution.

Best Answer

In the second edition of Durrett's book, the exercise asks to show that, necessarily, $X_1$ has a finite second moment. The idea given in Durrett's book is to symmetrise and truncate. More explicitly, let $\left(X'_i\right)_{i\geqslant 1}$ be an independent copy of $\left(X_i\right)_{i\geqslant 1}$ and let $Y_i:=X_i-X'_i$. Fix $A\geqslant 0$ and let $$U_i:=Y_i\mathbb 1\left\{\left\lvert Y_i\right\rvert\leqslant A \right\} \quad\mbox{and}\quad V_i:=Y_i\mathbb 1\left\{\left\lvert Y_i\right\rvert\gt A \right\}. $$
Then we have, for any $R$, $$\mathbb P\left\{\sum_{i= 1}^n Y_i\gt R\sqrt n \right\} \geqslant \mathbb P\left( \left\{\sum_{i= 1}^n U_i\gt R\sqrt n \right\}\cap \left\{\sum_{i=1}^nV_i\geqslant 0\right\} \right) \geqslant \frac 12\,\mathbb P\left\{\sum_{i= 1}^n U_i\gt R\sqrt n \right\}, $$ where the last inequality holds by symmetry: writing $U=\sum_{i= 1}^n U_i$ and $V=\sum_{i= 1}^n V_i$, the pairs $(U,V)$ and $(U,-V)$ have the same distribution (flipping the sign of $Y_i$ on $\left\{\lvert Y_i\rvert\gt A\right\}$ preserves the law of the symmetric $Y_i$, leaves $U_i$ unchanged and turns $V_i$ into $-V_i$), hence $\mathbb P\left(\{U\gt R\sqrt n\}\cap\{V\geqslant 0\}\right)=\mathbb P\left(\{U\gt R\sqrt n\}\cap\{V\leqslant 0\}\right)\geqslant \frac 12\mathbb P\left(U\gt R\sqrt n\right)$. Let us denote by $Z$ the limit in distribution of the sequence $\left(n^{-1/2}\sum_{i=1}^nY_i\right)_{n\geqslant 1}$; this limit exists because the characteristic function of $n^{-1/2}\sum_{i=1}^nY_i$ is $\lvert\varphi_n\rvert^2$, where $\varphi_n$ is the characteristic function of $S_n/\sqrt n$, so Lévy's continuity theorem applies. If $R$ is a continuity point of the cumulative distribution function of $Z$, then letting $n\to\infty$ and applying the central limit theorem to the bounded, symmetric (hence centred) variables $U_i$ gives $$\mathbb P\left(Z\gt R\right)\geqslant \frac 12\,\mathbb P\left(\sqrt{\mathbb E\left [U_1^2\right]}\,N\gt R\right)=\frac 12\,\mathbb P\left(\sqrt{\mathbb E\left [Y_1^2\mathbb 1\left\{\left\lvert Y_1\right\rvert\leqslant A \right\} \right]}\,N\gt R\right) ,$$ where $N$ denotes a random variable following a standard normal distribution. If $\mathbb E\left[Y_1^2\right]$ were infinite, then by monotone convergence $\mathbb E\left[Y_1^2\mathbb 1\left\{\lvert Y_1\rvert\leqslant A\right\}\right]\uparrow\infty$ as $A\to\infty$, so the right-hand side would tend to $\frac 12$ for every fixed $R$; this forces $\mathbb P\left(Z\gt R\right)\geqslant\frac 12$ at all continuity points, contradicting $\mathbb P\left(Z\gt R\right)\to 0$ as $R\to\infty$. Hence $\mathbb E\left[Y_1^2\right]$ is finite, and to conclude it suffices to show that
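As a purely illustrative aside (not part of the proof), one can see numerically how the truncated second moment $\mathbb E\left[U_1^2\right]$ blows up as $A\to\infty$ when $Y_1$ has infinite variance. A minimal Python sketch, with the distribution and sample size chosen arbitrarily: the inputs are Pareto-type with tail index $1.5$, so the mean is finite but the second moment is not.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 10**6

# X has tail index 1.5: E[X] < infinity but E[X^2] = infinity.
x = rng.pareto(1.5, size=n_samples)
x_prime = rng.pareto(1.5, size=n_samples)  # independent copy
y = x - x_prime                            # symmetrised variable Y = X - X'

# The truncated second moment E[Y^2 1{|Y| <= A}] grows without bound in A,
# which is what drives the contradiction in the argument above.
for A in [10, 100, 1_000, 10_000]:
    u = np.where(np.abs(y) <= A, y, 0.0)   # U = Y 1{|Y| <= A}
    print(f"A = {A:6d}:  E[U^2] ~= {np.mean(u**2):10.1f}")
```

For tail index $1.5$ one expects the printed values to grow roughly like $A^{1/2}$, since $\mathbb E\left[Y^2\mathbb 1\{\lvert Y\rvert\leqslant A\}\right]\approx\int_0^A 2t\,\mathbb P\left(\lvert Y\rvert\gt t\right)\mathrm dt$ and the tail decays like $t^{-1.5}$.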

if $X$ and $Y$ are two independent random variables with the same distribution such that $\mathbb E\left [\left(X-Y\right)^2\right]$ is finite, then $\mathbb E\left[X^2\right]$ is finite.
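One standard route to this last claim (a sketch of my own choosing, not necessarily the one Durrett intends) uses a median $m$ of $Y$ together with the weak symmetrisation inequality:

```latex
% Proof sketch.  Let m be a median of Y, so P(Y <= m) >= 1/2 and
% P(Y >= m) >= 1/2.  For every t >= 0,
\[
  \mathbb P\left(\lvert X-m\rvert \geqslant t\right)
  \leqslant 2\,\mathbb P\left(\lvert X-Y\rvert \geqslant t\right),
\]
% since on {X - m >= t} the independent event {Y <= m} has probability
% >= 1/2 and forces X - Y >= t (and symmetrically on {X - m <= -t}).
% Integrating 2t * P(|X - m| >= t) over t >= 0 then gives
% E[(X-m)^2] <= 2 E[(X-Y)^2] < infinity, hence E[X^2] < infinity.
```

Combining this with the finiteness of $\mathbb E\left[Y_1^2\right]$ obtained above yields $\mathbb E\left[X_1^2\right]<\infty$, and then the CLT argument from the question gives $\mathbb E X_1=0$.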
