[Math] Expectation of sequences of random variables that converge to 0 in probability

asymptotics, calculus, probability, probability-theory, real-analysis

Let $X_n, n \geq 1$ be a sequence of random variables that converges to zero in probability, that is, $\forall \varepsilon >0$,
$$\lim_{n \to \infty} P(|X_n| < \varepsilon) = 1$$
Moreover, let $X_n=o_p(n^{-1})$, that is, $\forall \varepsilon >0$,
$$\lim_{n \to \infty} P\left(\left|\frac{X_n}{n^{-1}}\right| < \varepsilon\right) = 1,$$
or equivalently,
$\forall \varepsilon, \eta >0$, there exists $n_0$ such that for $n\geq n_0$,
$$P\left(\left|\frac{X_n}{n^{-1}}\right| < \varepsilon\right) \geq 1-\eta.$$

My question is: what can we say about $E(X_n)$ as $n \to \infty$? For instance, is it true that $E(X_n)=o(n^{-1})$? Or, more generally, is it true that $E(o_p(n^{-1}))=o(n^{-1})$? How can I prove this?

Best Answer

No, you cannot say anything like that. Consider $(X_n)_n$ defined as $$X_n=\begin{cases}0 &\text{ w.p. } 1- \frac{1}{n}\\ n^2 &\text{ w.p. }\frac{1}{n}\end{cases}$$

Then $\mathbb{E}[X_n] = n^2\cdot\frac{1}{n} = n\xrightarrow[n\to\infty]{} \infty$ (and by tweaking the example you can replace the growth rate $n$ by anything you'd like), yet, for any $\varepsilon > 0$, $$ \mathbb{P}\{ n\lvert X_n\rvert < \varepsilon \} \geq \mathbb{P}\{ X_n = 0 \} = 1-\frac{1}{n} \xrightarrow[n\to\infty]{} 1, $$ so $X_n = o_p(n^{-1})$ even though $\mathbb{E}[X_n]$ is not even bounded.
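If it helps to see the counterexample numerically, here is a minimal Monte Carlo sketch (the tolerance `eps`, the sample size `m`, and the grid of values of `n` are arbitrary choices for illustration): the empirical mean grows like $n$, while the empirical probability of $\{n\lvert X_n\rvert < \varepsilon\}$ approaches $1$.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_X(n, m):
    """Draw m i.i.d. copies of X_n: equal to n**2 with probability 1/n, else 0."""
    return np.where(rng.random(m) < 1.0 / n, float(n) ** 2, 0.0)

eps = 0.1    # arbitrary tolerance for the o_p(1/n) event
m = 100_000  # Monte Carlo samples per value of n

for n in [10, 100, 1_000, 10_000]:
    x = sample_X(n, m)
    mean_estimate = x.mean()                      # close to E[X_n] = n
    prob_estimate = np.mean(n * np.abs(x) < eps)  # close to 1 - 1/n
    print(f"n={n:6d}  E[X_n] ~ {mean_estimate:10.1f}  P(n|X_n|<eps) ~ {prob_estimate:.4f}")
```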