Convergence in probability implies boundedness of means

convergence-divergence, measure-theory, probability-theory

Say we have a sequence of random variables $(X_n)$ with means $\mu_n$ and variances $\sigma_n^2$ that converges in probability to a random variable $X$. Can we conclude that the sequences of means and variances are bounded? Do we need additional assumptions, for example $E(|X|)<\infty$, or perhaps that each $X_n$ is normal? Thank you in advance.

Best Answer

Let $X$ be a non-negative random variable with infinite expectation and set $X_n = X I_{\{X \leq n\}}$. Then $X_n \to X$ a.s., hence also in probability, but $EX_n \to \infty$ by monotone convergence, so the means are not bounded.
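
For concreteness, one possible choice (purely as an illustration, not part of the argument above) is $X$ with density $f(x)=1/x^{2}$ on $[1,\infty)$:
$$
EX=\int_1^\infty x\cdot\frac{1}{x^{2}}\,dx=\infty,
\qquad
EX_n=E\bigl[XI_{\{X\leq n\}}\bigr]=\int_1^n x\cdot\frac{1}{x^{2}}\,dx=\log n\;\longrightarrow\;\infty .
$$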
If the $X_n$'s are normally distributed, then $(\mu_n)$ and $(\sigma_n)$ are convergent, hence bounded. [An easy way to prove this is via characteristic functions: $e^{i\mu_n t}e^{-\sigma_n^{2}t^{2}/2} \to Ee^{itX}$ for every $t$, and $Ee^{itX}\neq 0$ when $t$ is sufficiently close to $0$, so for a fixed small $t\neq 0$ we can take absolute values and then logarithms to finish.]
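
To sketch the bracketed step for the variances: fixing a small $t\neq 0$ with $Ee^{itX}\neq 0$ and taking absolute values gives
$$
\bigl|e^{i\mu_n t}e^{-\sigma_n^{2}t^{2}/2}\bigr| = e^{-\sigma_n^{2}t^{2}/2} \;\longrightarrow\; \bigl|Ee^{itX}\bigr| > 0,
\qquad\text{so}\qquad
\sigma_n^{2} \;\longrightarrow\; -\frac{2}{t^{2}}\log\bigl|Ee^{itX}\bigr| ,
$$
and once $(\sigma_n)$ converges, $e^{i\mu_n t}$ converges for all small $t$, which in turn forces $(\mu_n)$ to converge by a standard argument.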