[Math] Limit of Sum of Cauchy Random Variables

measure-theory, normal-distribution, probability, probability-distributions, probability-theory

I'm investigating the behaviour of some random variables obtained from standard Cauchy random variables $X_n$. Suppose $Y_n=\textrm{sgn}(X_n)|X_n|^{\alpha}$ for $\alpha\in[0,1]$. Let $S_n=Y_1+\dots+Y_n$. It's fairly easy to see that $S_n/n$ converges in distribution to another Cauchy random variable for $\alpha=1$, by simply considering the characteristic function.
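For completeness, the computation I have in mind for $\alpha=1$: the standard Cauchy characteristic function is $e^{-|t|}$, so by independence
$$\mathbb{E}\left[e^{itS_n/n}\right]=\prod_{k=1}^{n}\mathbb{E}\left[e^{i(t/n)X_k}\right]=\left(e^{-|t|/n}\right)^{n}=e^{-|t|},$$
i.e. $S_n/n$ is itself standard Cauchy for every $n$.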

My question is, how would I compute the limit of $S_n/n$ for $\alpha<1$? The characteristic functions of the $Y_n$ seem a bit awkward here, and the Central Limit Theorem (which only applies when $\alpha<1/2$, where the variance is finite) only tells me about $S_n/\sqrt{n}$. The only thing I've gleaned thus far is that the $Y_n$ have mean $0$ in the case $\alpha<1$, and are obviously iid since the $X_n$ are. Any help would be much appreciated!
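Concretely, the reason the mean exists: by symmetry $\mathbb{E}[Y_1]=0$ whenever $\mathbb{E}|Y_1|<\infty$, and
$$\mathbb{E}|Y_1|=\mathbb{E}|X_1|^{\alpha}=\frac{2}{\pi}\int_{0}^{\infty}\frac{x^{\alpha}}{1+x^{2}}\,dx,$$
which converges precisely when $\alpha<1$ (the integrand behaves like $x^{\alpha-2}$ at infinity).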

Best Answer

For the sake of having an answer:

When $\alpha\lt1$, the random variables $Y_n$ are integrable (the Cauchy distribution has finite absolute moments of every order strictly less than $1$), and by symmetry $\mathbb{E}[Y_1]=0$. The strong law of large numbers therefore gives $S_n/n\to0$ almost surely and in $L^1$, hence also in probability and in distribution.
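A quick simulation is consistent with this. A minimal sketch, assuming NumPy; the sample sizes are arbitrary and purely illustrative:

```python
# Monte Carlo sketch comparing S_n / n for alpha < 1 (should concentrate near 0)
# with alpha = 1 (should remain standard Cauchy, quartiles roughly -1 and 1).
import numpy as np

rng = np.random.default_rng(0)

def s_n_over_n(alpha, n, reps=1_000):
    """Return `reps` independent simulated values of S_n / n."""
    x = rng.standard_cauchy(size=(reps, n))   # X_1, ..., X_n for each replication
    y = np.sign(x) * np.abs(x) ** alpha       # Y_k = sgn(X_k) |X_k|^alpha
    return y.sum(axis=1) / n                  # S_n / n

for alpha in (0.5, 1.0):
    for n in (1_000, 10_000):
        vals = s_n_over_n(alpha, n)
        # Quantiles rather than the sample mean: for alpha = 1 the mean does not exist.
        q25, q50, q75 = np.quantile(vals, [0.25, 0.5, 0.75])
        print(f"alpha={alpha}, n={n}: quartiles ({q25:+.3f}, {q50:+.3f}, {q75:+.3f})")
# Expected pattern: for alpha = 0.5 the quartiles shrink towards 0 as n grows;
# for alpha = 1.0 they stay near the standard Cauchy quartiles (-1, 0, 1).
```

Quartiles are reported instead of sample means because for $\alpha=1$ the summands have no mean, so averaging the replications would itself be unstable.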