[Math] Rate of convergence in the Law of Large Numbers

limit-theorems, pr.probability, reference-request

I'm working on a problem where I need information on the size of $E_n=|S_n-n\mu|$, where $S_n=X_1+\ldots+X_n$ is a sum of i.i.d. random variables and $\mu=\mathbb EX_1$. For this to make sense, the $(X_i)$ have to be integrable. In that case, the weak law of large numbers says $E_n/n$ converges to 0 in probability, while the strong law says $E_n=o(n)$ almost surely.

If $X_1$ is square integrable, then we get the (stronger) result that $E_n/n^{1/2+\epsilon}$ converges to 0 in probability. What I am looking for is a result for the case $\mathbb E|X_1|^\alpha<\infty$ for some $1<\alpha<2$. Back-of-the-envelope calculations suggest a result of the form $E_n=O(n^{1/\alpha+\epsilon})$ should hold. I suspect a relationship with $\alpha$-stable laws and have tracked down Gnedenko and Kolmogorov's book, but find that (1) their notation is quite hard to read; (2) they seem to mainly care about something that's not so important for me, namely results of Central Limit Theorem type: they impose extra conditions under which suitably scaled and translated versions of $S_n$ converge to a non-trivial distribution. I don't want to assume anything beyond finiteness of an $\alpha$th moment, but am looking for a correspondingly weaker conclusion giving upper bounds on $E_n$.
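
To get a feeling for the conjecture, here is a rough numerical sanity check (purely illustrative; the Lomax/Pareto distribution and the particular values of the tail index and of $\alpha$ below are my own assumptions, chosen only because the Lomax law has a finite $\alpha$th moment exactly for $\alpha$ below its tail index):

```python
import numpy as np

# Rough numerical sanity check of the conjectured rate E_n = O(n^{1/alpha + eps}).
# All choices below (Lomax distribution, tail index a = 1.5, alpha = 1.4) are
# illustrative assumptions, not part of the question itself.
rng = np.random.default_rng(0)
a = 1.5                  # tail index: E|X_1|^p < infinity exactly for p < a
mu = 1.0 / (a - 1.0)     # mean of the Lomax(a) distribution with unit scale
alpha = 1.4              # moment exponent we rely on (must satisfy 1 < alpha < a)
reps = 100               # independent replications per sample size

for n in (10**3, 10**4, 10**5, 10**6):
    E_n = np.array([abs(rng.pareto(a, n).sum() - n * mu) for _ in range(reps)])
    # If E_n = O(n^{1/alpha + eps}), this ratio should not blow up as n grows.
    print(f"n = {n:>7}:  median E_n / n^(1/alpha) = {np.median(E_n) / n**(1/alpha):.3f}")
```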

Can anyone point me to some results about deviation from the mean for sums of iid random variables with $\alpha$th moments ($1<\alpha<2$)?

In a similar vein, what if the random variables fail to have an $\alpha$th moment for some $\alpha<1$? Here I'd expect some kind of result telling me that if $S_n/a_n$ converges in probability to a constant for some sequence $(a_n)$, then that constant must be 0.

Edit:
Let me give a precise Chebyshev-like statement I would really like to be true.

Let $(X_n)$ be i.i.d. with $\mathbb EX_1=0$ and $\mathbb E|X_1|^\alpha<\infty$ for some $1<\alpha<2$.
Then $\mathbb P(|S_n|>t)<C_\alpha\, n\,\mathbb E|X_1|^\alpha/t^\alpha$, where $C_\alpha$ is a constant depending only on $\alpha$. (I guessed this bound by "symbol-bashing" with the $\alpha$-stable law and believe it is at least "dimensionally correct".)
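
Note that such a bound would give exactly the rate conjectured above: taking $t=\delta n^{1/\alpha+\epsilon}$ for fixed $\delta>0$ yields
$$\mathbb P\left(|S_n|>\delta n^{1/\alpha+\epsilon}\right)\leq C_\alpha\,\mathbb E|X_1|^\alpha\,\delta^{-\alpha}\,n^{1-\alpha(1/\alpha+\epsilon)}=C_\alpha\,\mathbb E|X_1|^\alpha\,\delta^{-\alpha}\,n^{-\alpha\epsilon}\to 0,$$
so $E_n/n^{1/\alpha+\epsilon}\to 0$ in probability.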

Best Answer

An early occurrence of such bounds is the theorem of von Bahr and Esseen:

von Bahr, B., Esseen, C.-G.: Inequalities for the $r$th absolute moment of a sum of random variables, $1\leq r \leq 2$. Ann. Math. Statist. 36, No. 1, 299-303 (1965).

Theorem: Let $X_i$ be independent (not necessarily identically distributed) zero-mean random variables. Then, for $\alpha\in [1,2]$, $$E\Big|\sum_{i=1}^n X_i\Big|^\alpha\leq C_\alpha \sum_{i=1}^n E|X_i|^\alpha\,.$$ ($C_\alpha$ is explicit.)

Applying Markov's inequality to $|S_n|^\alpha$ in your setup gives $$P(|S_n|>t)\leq \frac{E|S_n|^\alpha}{t^\alpha}\leq C_\alpha\, n\, E|X_1|^\alpha\, t^{-\alpha}, $$ as you wanted.

Note: the moment condition alone certainly does not imply convergence to an $\alpha$-stable law - one would need a regular variation condition on the tails for that.

Note 2: A good summary (up to the late 70s) of estimates of this kind (mostly from the Russian school) is in Nagaev's Annals of Probability paper (1979). Petrov's 1975 book is also a good resource - in particular, the von Bahr-Esseen bound is mentioned there (on page 60).
