[Math] Convergence to infinity of a sum of independent random variables

central limit theorem, convergence-divergence, probability, probability theory

I am doing an exercise which says:

Suppose $(X_n)$ is a sequence of independent random variables (not necessarily identically distributed) with finite variances. Write $S_n := \sum_{j=1}^n X_j$ for the partial sums. Suppose

  1. $(X_n)$ is uniformly bounded (i.e. there exists some $M > 0$ such that $|X_n| \leq M$ a.s. for all $n \in \mathbb{N}$)
  2. $\textrm{var} (S_n) \rightarrow \infty$ as $n \rightarrow \infty$.

Show that:

$$\forall A>0 \quad \lim_{n \rightarrow \infty} \mathbb{P}(|S_n| \leq A) = 0. $$

Progress so far: Write $\alpha_n:= \mathbb{E} (S_n)$. Then using the central limit theorem it is easy to show that

$$\forall A>0 \quad \lim_{n \rightarrow \infty} \mathbb{P}(|S_n - \alpha_n| \leq A) = 0.$$
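
(One way to carry out that step: write $\sigma_n := \sqrt{\textrm{var}(S_n)}$. Since $|X_j - \mathbb{E}(X_j)| \leq 2M$ a.s. and $\sigma_n \rightarrow \infty$, the Lindeberg condition holds trivially, so $(S_n - \alpha_n)/\sigma_n$ converges in distribution to a standard normal $Z$. For any fixed $\varepsilon > 0$ we eventually have $A/\sigma_n \leq \varepsilon$, hence $$\limsup_{n \rightarrow \infty} \mathbb{P}(|S_n - \alpha_n| \leq A) \leq \limsup_{n \rightarrow \infty} \mathbb{P}\left(\frac{|S_n - \alpha_n|}{\sigma_n} \leq \varepsilon\right) \leq \mathbb{P}(|Z| \leq \varepsilon),$$ and letting $\varepsilon \downarrow 0$ gives the stated limit.)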

However, I cannot see how to use this to prove the original result, or what else to do.

Many thanks for your help.

Best Answer

We can use a symmetrization trick: let $(X'_j)_{j \geqslant 1}$ and $(X''_j)_{j \geqslant 1}$ be independent copies of the sequence $(X_j)_{j\geqslant 1}$. The differences $X'_j - X''_j$ are independent, centered, and bounded: $|X'_j-X''_j|\leqslant 2M$ almost surely, while $\textrm{var}\left(\sum_{j=1}^n (X'_j-X''_j)\right) = 2\,\textrm{var}(S_n) \to \infty$. (Symmetrizing removes the unknown centering $\alpha_n$, which is what blocks the direct approach.) The Lindeberg central limit theorem therefore applies and shows that for each positive $A$, $$\lim_{n\to\infty}\mathbb P\left\{\left|\sum_{j=1}^n (X'_j-X''_j) \right|  \leqslant A \right\} =0.$$

To conclude, notice that for each $A\gt 0$ the inclusion $$\left\{\left|\sum_{j=1}^n X'_j \right| \leqslant \frac A2 \right\}\cap\left\{\left|\sum_{j=1}^n X''_j \right| \leqslant \frac A2 \right\}\subset \left\{\left|\sum_{j=1}^n (X'_j-X''_j) \right| \leqslant A \right\}$$ holds; the events on the left-hand side are independent, and each has probability $\mathbb P\left\{\left|\sum_{j=1}^n X_j \right| \leqslant \frac A2 \right\}$.
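
Spelling out the final step: taking probabilities in this inclusion and using independence gives $$\mathbb P\left\{|S_n| \leqslant \frac A2 \right\}^2 \leqslant \mathbb P\left\{\left|\sum_{j=1}^n (X'_j-X''_j) \right| \leqslant A \right\} \longrightarrow 0,$$ so $\mathbb P\left\{|S_n| \leqslant A/2\right\} \to 0$ as $n \to \infty$; since $A > 0$ was arbitrary, this is exactly the desired conclusion.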
