Kolmogorov’s three-series theorem: what can be said about the distribution of $X_1$?

probability theory

We assume that the $X_n$ are i.i.d. and that $\sum_{n=1}^{\infty} X_n$ converges almost surely.
The question is: what can be said about the distribution of $X_1$?

I wanted to use Kolmogorov's three-series theorem. Then I know that:

  • $\sum_{n=1}^{\infty} P(|X_n|>C)$ converges
  • $\sum_{n=1}^{\infty} Var(Y_n)$ converges
  • $\sum_{n=1}^{\infty} \mathbb{E} (Y_n)$ converges

where $Y_n = X_n \mathbb{1}_{\{|X_n| \le C\}}$ for some fixed $C > 0$.
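The key observation is that for i.i.d. variables the terms $P(|X_n|>C)$ are all equal, so the first series is a constant-term series and can only converge if that constant is $0$. A minimal numerical sketch (assuming, purely for illustration, a hypothetical nondegenerate example $X_n \sim N(0,1)$):

```python
# Sketch: for i.i.d. X_n, every term of sum_n P(|X_n| > C) equals
# P(|X_1| > C), so the series diverges unless P(|X_1| > C) = 0.
# Assumption for illustration only: X_n ~ N(0,1), a nondegenerate choice.
import numpy as np

rng = np.random.default_rng(0)
C = 2.0
samples = rng.standard_normal(100_000)
p = np.mean(np.abs(samples) > C)  # Monte Carlo estimate of P(|X_1| > C)

# Partial sums of the constant-term series grow linearly: n * p.
partial = [n * p for n in (10, 100, 1000)]
print(p, partial)  # p > 0, so the partial sums blow up
```

Since $p > 0$ here, $\sum_n P(|X_n|>C) = \infty$, so a nondegenerate i.i.d. distribution cannot satisfy the first condition of the theorem.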

But what can be concluded about the distribution of $X_1$ from here?

Best Answer

If $\sum X_n$ converges almost surely, then $X_n \to 0$ almost surely, and hence $X_n \to 0$ in probability. Since the $X_n$ are identically distributed, $P(|X_1| >\epsilon) =P(|X_n| >\epsilon) \to 0$, so $P(|X_1| >\epsilon) =0$ for every $\epsilon >0$. This implies $P(|X_1| >0)=0$, which means $X_1=0$ with probability $1$.
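The contrapositive of the first step can be seen numerically: for a nondegenerate i.i.d. sequence the partial sums never settle down. A Monte Carlo sketch (again assuming, for illustration only, $X_n \sim N(0,1)$):

```python
# Sketch of the contrapositive: if X_n is i.i.d. and nondegenerate,
# the partial sums S_n behave like a random walk and do not converge.
# Assumption for illustration only: X_n ~ N(0,1).
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
x = rng.standard_normal(n)
s = np.cumsum(x)  # partial sums S_1, ..., S_n

# If S_n converged a.s., its oscillation over the tail would vanish;
# for an N(0,1) walk it stays large (Var(S_n) = n grows without bound).
tail_osc = s[n // 2:].max() - s[n // 2:].min()
print(tail_osc)
```

The oscillation over the second half of the trajectory stays far from $0$, consistent with the fact that only the degenerate distribution $X_1 \equiv 0$ yields an almost surely convergent i.i.d. series.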

If you are keen on using the Three Series Theorem you can argue as follows:

Let $N>0$ and $Y_n=X_nI_{\{|X_n| \leq N\}}$. By the Three Series Theorem, $\sum \operatorname{Var}(Y_n) <\infty$. But the $Y_n$ are identically distributed, so all the terms in this sum are equal. This forces $\operatorname{Var}(Y_1)=0$, which means $Y_1$ is a constant random variable. Hence $X_1I_{\{|X_1| \leq N\}}$ is a constant for each $N$. I will let you verify that $X_1$ itself must be a constant. But then the convergence of $\sum X_n$ shows that the constant must be $0$.
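One way the verification step can be sketched (this case split is my own fleshing-out of the exercise, not part of the original answer): write $c_N$ for the a.s. value of $X_1 I_{\{|X_1| \le N\}}$.

```latex
% For each N > 0 we know X_1 \mathbb{1}_{\{|X_1| \le N\}} = c_N almost surely.
\begin{itemize}
\item If $c_N \neq 0$ for some $N$: almost surely $X_1\mathbb{1}_{\{|X_1|\le N\}} \neq 0$,
      hence $|X_1| \le N$ and $X_1 = c_N$, so $X_1$ is constant.
\item If $c_N = 0$ for every $N$: for each integer $N$, almost surely $X_1 = 0$
      or $|X_1| > N$. Intersecting these countably many a.s. events and using
      that $X_1$ is finite a.s. gives $X_1 = 0$ almost surely.
\end{itemize}
```

In either case $X_1$ is a.s. constant, and the convergence of $\sum X_n$ then forces that constant to be $0$, matching the first argument.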
