What effect does the variance of a sequence of random variables $\{X_n\}$ have on their uniform integrability?

expected-value, measure-theory, probability, probability-theory

I am wondering how the variance of a sequence of random variables affects their uniform integrability.

Let $\{X_n\}$ be a sequence of normally distributed random variables with mean $0$ and a variance $\sigma^2(n)$ that depends on $n$:
$$
X_n \sim \mathcal{N}(0,\sigma^2(n)).
$$

For what type of function $\sigma^2(n)$ will $\{X_n\}$ fail to be uniformly integrable? Does a fast decay or a fast growth of $\sigma^2(n)$ affect the uniform integrability, or is $\{X_n\}$ uniformly integrable no matter how the variance changes with $n$?

Best Answer

If I did not make a computation mistake, we have, for all $a\geq 0$ and all $n\in\mathbb{N}$, \begin{align*}\mathbb{E}\left[|X_n|\mathbf{1}_{\left\{|X_n|>a\right\}}\right] &=\int_a^\infty x\,P_{|X_n|}(dx) \\ &=\frac{2}{\sqrt{2\pi\sigma^2(n)}}\int_a^\infty x\, e^{-\frac{x^2}{2\sigma^2(n)}}\,dx \\ &= \sqrt{\frac{2\sigma^2(n)}{\pi}}\, e^{-\frac{a^2}{2\sigma^2(n)}},\end{align*} using the density $x\mapsto \frac{2}{\sqrt{2\pi\sigma^2(n)}}e^{-x^2/(2\sigma^2(n))}$ of $|X_n|$ on $[0,\infty)$ and the antiderivative $-\sigma^2(n)\,e^{-x^2/(2\sigma^2(n))}$ of the integrand. It follows that $\left(X_n\right)_{n\geq 0}$ is uniformly integrable if and only if $\left(\sigma^2(n)\right)_{n\geq 0}$ is bounded above: the right-hand side is increasing in $\sigma^2(n)$, so if $\sigma^2(n)\leq M$ for all $n$, then $\sup_n \mathbb{E}\left[|X_n|\mathbf{1}_{\{|X_n|>a\}}\right] \leq \sqrt{2M/\pi}\,e^{-a^2/(2M)} \to 0$ as $a\to\infty$; whereas if $\sigma^2(n)$ is unbounded, then for any fixed $a$ the tail expectation tends to $\infty$ along a subsequence, so the supremum over $n$ is infinite for every $a$.
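A quick numerical sanity check of the closed form (a minimal Python sketch; the function names are mine, purely for illustration): it compares the formula $\sqrt{2\sigma^2/\pi}\,e^{-a^2/(2\sigma^2)}$ against a direct trapezoidal integration of $x$ times the density of $|X|$, and illustrates that at a fixed truncation level $a$ the tail expectation grows without bound as $\sigma^2$ does.

```python
import math

def tail_mean_closed_form(sigma2, a):
    # E[|X| 1{|X|>a}] for X ~ N(0, sigma2), via the closed form in the answer
    return math.sqrt(2 * sigma2 / math.pi) * math.exp(-a**2 / (2 * sigma2))

def tail_mean_numeric(sigma2, a, steps=200_000):
    # Trapezoidal integration of x * f_{|X|}(x) over [a, a + 12*sigma],
    # where f_{|X|}(x) = 2/sqrt(2*pi*sigma2) * exp(-x^2/(2*sigma2));
    # the upper limit is effectively infinity for this tail.
    upper = a + 12 * math.sqrt(sigma2)
    c = 2.0 / math.sqrt(2 * math.pi * sigma2)
    h = (upper - a) / steps
    total = 0.0
    for i in range(steps + 1):
        x = a + i * h
        w = 0.5 if i in (0, steps) else 1.0
        total += w * c * x * math.exp(-x**2 / (2 * sigma2))
    return total * h

# 1) The closed form matches direct integration.
for sigma2, a in [(1.0, 0.0), (1.0, 2.0), (4.0, 1.5)]:
    cf = tail_mean_closed_form(sigma2, a)
    num = tail_mean_numeric(sigma2, a)
    print(f"sigma^2={sigma2}, a={a}: closed={cf:.6f}, numeric={num:.6f}")

# 2) At fixed a, the tail expectation is increasing in sigma^2 and unbounded,
#    which is why an unbounded variance sequence cannot be uniformly integrable.
for sigma2 in [1.0, 10.0, 100.0, 10_000.0]:
    print(f"a=3, sigma^2={sigma2}: {tail_mean_closed_form(sigma2, 3.0):.4f}")
```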