Probability Theory – Almost Sure Convergence and Bounded Expectation

Another homework problem that is supposed to be "easy", but I can't see any way into it. Any direction at all would be appreciated.

Let $X_{n}$ be iid with $\mathbb{E}[X] = 0$. Define $S_n = \sum\limits_{k=1}^{n} X_k$. Show that $\frac{S_n}{n^{1/p}} \rightarrow 0$ almost surely implies $\mathbb{E}[|X|^{p}] < \infty.$

My initial thoughts were around Jensen's Inequality and convexity, but that only applies to $f(x) = x^p$ for $p \geq 1$. I really have no idea where to go from here.

Best Answer

Here are some indications.

Fact 1: If $S_n/n^{1/p}$ converges to a finite limit, then $X_n/n^{1/p}$ converges to zero.

Fact 2: $\mathrm E(|X|^p)$ is finite if and only if $\sum\limits_n\mathrm P(|X|\ge n^{1/p})=\sum\limits_n\mathrm P(|X_n|\ge n^{1/p})$ converges.

Fact 3: For any sequence $(Y_n)$ of independent random variables, $Y_n\to0$ almost surely if and only if $\sum\limits_n\mathrm P(|Y_n|\ge y)$ converges, for every positive $y$.

For the proof of the problem, use fact 1, then the direct implication in fact 3, and finally the reverse implication in fact 2. This gives a stronger version of the result, where one assumes neither that $\mathrm E(X)=0$ nor that $S_n/n^{1/p}$ converges almost surely to $0$, but only that it converges almost surely to a finite limit.
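
In case it helps to see the target explicitly: with $Y_n=X_n/n^{1/p}$ and $y=1$ in fact 3, the chain of implications should end up looking like

$$\frac{S_n}{n^{1/p}}\to L\ \text{a.s.}\ \overset{\text{fact 1}}{\Longrightarrow}\ \frac{X_n}{n^{1/p}}\to0\ \text{a.s.}\ \overset{\text{fact 3}}{\Longrightarrow}\ \sum_n\mathrm P\big(|X_n|\ge n^{1/p}\big)<\infty\ \overset{\text{fact 2}}{\Longrightarrow}\ \mathrm E(|X|^p)<\infty.$$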

Hint for the proof of fact 1: $X_n=S_n-S_{n-1}$.
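
To spell the hint out a little (a sketch, writing $L$ for the finite limit of $S_n/n^{1/p}$): on the event where $S_n/n^{1/p}\to L$,

$$\frac{X_n}{n^{1/p}}=\frac{S_n}{n^{1/p}}-\frac{S_{n-1}}{(n-1)^{1/p}}\cdot\Big(\frac{n-1}{n}\Big)^{1/p}\ \longrightarrow\ L-L\cdot1=0.$$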

Hint for the proof of fact 2: Start by recalling or reproving the classical equivalence that $\mathrm E(|X|)$ is finite if and only if $\sum\limits_n\mathrm P(|X|\ge n)$ converges.
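
One way to organize that computation (a sketch): compare the tail sum with the layer-cake integral,

$$\sum_{n\ge1}\mathrm P(|X|\ge n)\ \le\ \int_0^\infty\mathrm P(|X|\ge t)\,\mathrm dt\ =\ \mathrm E(|X|)\ \le\ 1+\sum_{n\ge1}\mathrm P(|X|\ge n),$$

then apply the resulting equivalence to $|X|^p$ and use $\{|X|^p\ge n\}=\{|X|\ge n^{1/p}\}$; identical distribution gives $\mathrm P(|X|\ge n^{1/p})=\mathrm P(|X_n|\ge n^{1/p})$.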

Hint for the proof of fact 3: Borel-Cantelli lemma with independence for the direct implication, Borel-Cantelli lemma without independence for the reverse implication.
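
A little more explicitly (a sketch of both directions): for the direct implication, if $\sum_n\mathrm P(|Y_n|\ge y)$ diverged for some $y>0$, independence and the second Borel-Cantelli lemma would give

$$\mathrm P(|Y_n|\ge y\ \text{infinitely often})=1,$$

contradicting $Y_n\to0$ almost surely. For the reverse implication, if $\sum_n\mathrm P(|Y_n|\ge 1/k)$ converges for every $k\ge1$, the first Borel-Cantelli lemma gives $\mathrm P(|Y_n|\ge 1/k\ \text{infinitely often})=0$ for each $k$, and taking the union over $k$ yields $Y_n\to0$ almost surely.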
