[Math] $(X_n)_{n\in\mathbb{N}}$ independent Cauchy-distributed random variables. Convergence of $n^{-\gamma}(X_1+\cdots+X_n)$

characteristic-functions, probability-theory, proof-verification, weak-convergence

I want to solve the following exercise, but I am unsure whether my ideas are correct.

Let $(X_n)_{n\in\mathbb{N}}$ be i.i.d. random variables with probability density
$$
f_a(x)=\frac{1}{a\pi}\frac{1}{1+(x/a)^2},
$$ i.e. each $X_n$ is Cauchy-distributed.

a) For which $\gamma$ does $n^{-\gamma}(X_1 + \cdots + X_n)$ converge in distribution?

b) Does $( n^{-1}(X_1 + \dots + X_n) )_{n\in\mathbb{N}}$ converge almost surely? If not, why not?


My ideas so far:

a) Define $S_n := X_1 + \dots + X_n$. Since the $X_i$ are independent, $S_n$ is Cauchy-distributed with parameter $na$. I tried to use Lévy's continuity theorem and characteristic functions.
The characteristic function of $S_n$ is $\varphi_{S_n}(t) = e^{-na|t|}$, and the characteristic function of $n^{-\gamma}S_n$ is $\varphi_n(t) := \varphi_{n^{-\gamma}S_n}(t) = \varphi_{S_n}(n^{-\gamma}t) = e^{-a n^{1-\gamma} |t|}$.
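
As a quick sanity check on this formula, here is a minimal NumPy sketch (the seed, sample sizes, and parameter values are my own arbitrary choices) comparing a Monte Carlo estimate of $\mathrm{E}[e^{itY}]$ for $Y = n^{-\gamma}S_n$ against $e^{-a n^{1-\gamma}|t|}$:

```python
import numpy as np

rng = np.random.default_rng(0)
a, gamma, n, N = 1.0, 0.9, 100, 100_000

# N samples of Y = n^{-gamma} * (X_1 + ... + X_n), each X_i ~ Cauchy with scale a
X = a * rng.standard_cauchy((N, n))
Y = n ** (-gamma) * X.sum(axis=1)

for t in (0.25, 0.5, 1.0):
    emp = np.exp(1j * t * Y).mean()                  # Monte Carlo estimate of E[e^{itY}]
    exact = np.exp(-a * n ** (1 - gamma) * abs(t))   # claimed closed form
    print(t, round(float(emp.real), 3), round(float(exact), 3))
```

The Monte Carlo average of $e^{itY}$ is bounded, so it converges at the usual $N^{-1/2}$ rate even though the Cauchy samples themselves have heavy tails.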

Now I look at several cases:

Let $\gamma = 1$. Then $\varphi_n(t) = e^{-a|t|}$, so $n^{-1}S_n$ is Cauchy-distributed with parameter $a$ (independent of $n$), which implies that the sequence converges in distribution.
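
A short simulation illustrating this invariance (again just a hedged sketch with arbitrary parameters): the empirical quartiles of $S_n/n$ should stay near $-a$, $0$, $a$ no matter how large $n$ is, since the Cauchy quartiles sit at $\pm a$.

```python
import numpy as np

rng = np.random.default_rng(1)
a, N = 1.0, 100_000

for n in (1, 10, 100):
    means = a * rng.standard_cauchy((N, n)).mean(axis=1)   # N samples of S_n / n
    print(n, np.percentile(means, [25, 50, 75]))           # ~ [-a, 0, a] for every n
```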

Now let $\gamma > 1$. Then
$$
\lim_{n\rightarrow \infty} \varphi_n (t) = \lim_{n\rightarrow \infty} e^{-a n^{1-\gamma} |t|} \underset{1-\gamma < 0}{=} 1
$$
for all $t\in\mathbb{R}$.
I'm not sure about this, though. What kind of distribution has a constant characteristic function? Based on my knowledge of the Fourier transform, I guess this has something to do with the Dirac delta.
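
One way to convince yourself numerically: $n^{-\gamma}S_n$ is Cauchy with scale $a n^{1-\gamma}$, so $P(|n^{-\gamma}S_n| > \varepsilon) = 1 - \tfrac{2}{\pi}\arctan\!\big(\varepsilon/(a n^{1-\gamma})\big)$ can be computed exactly. A standard-library sketch (parameter choices are mine) showing these probabilities shrink to $0$, consistent with a point mass (Dirac delta) at $0$:

```python
import math

a, gamma, eps = 1.0, 1.5, 0.1
for n in (10, 100, 1000, 10_000):
    scale = a * n ** (1 - gamma)                     # scale of n^{-gamma} S_n
    p = 1 - (2 / math.pi) * math.atan(eps / scale)   # P(|n^{-gamma} S_n| > eps)
    print(n, p)
```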

Finally let $\gamma < 1$. Then
$$
\lim_{n\rightarrow \infty} \varphi_n(0) = \lim_{n\rightarrow \infty} e^0 = 1
$$
and for $t\neq 0$
$$
\lim_{n\rightarrow \infty} \varphi_n(t) = \lim_{n\rightarrow \infty} e^{-a n^{1-\gamma} |t|} \underset{1-\gamma > 0}{=} 0.
$$

The characteristic functions converge pointwise, but the limit function isn't continuous at $0$, so by Lévy's continuity theorem the sequence doesn't converge in distribution.
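
The same exact formula makes the failure of tightness visible for $\gamma < 1$: the scale $a n^{1-\gamma}$ blows up, so $P(|n^{-\gamma}S_n| \le M) = \tfrac{2}{\pi}\arctan\!\big(M/(a n^{1-\gamma})\big) \to 0$ for every fixed $M$, i.e. the mass escapes to infinity. A sketch (parameters arbitrary):

```python
import math

a, gamma, M = 1.0, 0.5, 10.0
for n in (10, 100, 1000, 10_000):
    scale = a * n ** (1 - gamma)                     # scale of n^{-gamma} S_n
    print(n, (2 / math.pi) * math.atan(M / scale))   # P(|n^{-gamma} S_n| <= M) -> 0
```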


Now part b):
My intuition says that the sequence does not converge ($n^{-1} S_n$ is the arithmetic mean) and that this has something to do with the fact that a Cauchy-distributed random variable does not have an expected value.

Thanks in advance for any tips or help in general.

Best Answer

Once you've shown that $n^{-1} S_n$ has the same distribution as $X_1$, it's pretty much immediate that $n^{-\gamma} S_n$ does not converge in distribution for $\gamma < 1$: writing $n^{-\gamma} S_n = n^{1-\gamma} \cdot n^{-1} S_n$, we get $P(|n^{-\gamma} S_n| \leq M) = P(|X_1| \leq M n^{\gamma - 1}) \to 0$ for every $M$, so the sequence is not tight. If $\gamma > 1$ the characteristic function argument shows that $n^{- \gamma} S_n \stackrel{\text{d}}{\to} 0$. So $n^{-\gamma} S_n$ converges in distribution exactly for $\gamma \geq 1$.

For the second part: since $\text{E}(|X_1| / k) = \infty$ for every $k \in \mathbb{N}$, the tail-sum bound $\sum_{n=1}^{\infty} P(|Z| > n) \geq \text{E}|Z| - 1$, applied to $Z = |X_1| / k$ (the $X_n$ are identically distributed), gives

\begin{align} \sum_{n=1}^{\infty} P(|X_n| / k > n) &= \sum_{n=1}^{\infty} P(|X_n| / n > k) \\ &= \infty , \end{align}

which by the second Borel–Cantelli lemma (this is where independence is needed) implies $P(|X_n| / n > k \,\, \text{i.o.}) = 1$ for every $k$. Therefore $\limsup_{n \to \infty} |X_n| / n \stackrel{\text{a.s.}}{=} \infty$. Since $|X_n| / n \leq |S_n| / n + |S_{n-1}| / n$, this in turn implies $\limsup_{n \to \infty} |S_n| / n \stackrel{\text{a.s.}}{=} \infty$, so $n^{-1} S_n$ does not converge almost surely.
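
For intuition, one can watch a single simulated path of the running means: occasional huge values of $X_n$ keep relocating $S_n / n$, so the path never settles. A minimal NumPy sketch (seed and sample size are arbitrary; this illustrates, but of course does not prove, the almost-sure divergence):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 1_000_000
X = rng.standard_cauchy(N)            # one path of X_1, X_2, ... with a = 1
idx = np.arange(1, N + 1)
running_mean = np.cumsum(X) / idx     # S_n / n for n = 1, ..., N

for n in (10, 1000, 100_000, 1_000_000):
    print(n, running_mean[n - 1])     # keeps jumping instead of settling

# Borel-Cantelli in action: the event |X_n|/n > 1 keeps occurring at large n
hits = np.flatnonzero(np.abs(X) / idx > 1) + 1
print("largest n observed with |X_n|/n > 1:", hits[-3:])
```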