[Math] convergence of sequence of random variables

measure-theory, probability, probability-theory

What does this expression mean: $\lim_{n\rightarrow\infty} E|X_n-X|=0$? Here $X_n$ is a sequence of random variables and $X$ is a random variable. What does this expression imply? Can I say that the sequence $X_n$ converges to $X$ in probability and almost surely?

Best Answer

This is called convergence in mean, or convergence in the $L^{1}$-norm. In general, if \begin{eqnarray} \lim_{n \to \infty} \mathbb{E}(| X_{n} - X|^{p}) = 0, \end{eqnarray} then $X_{n}$ is said to converge to $X$ in the $L^{p}$-norm (provided that $\mathbb{E}(|X_{n}|^{p})$ is finite for all $n \geq 1$). Such convergence has nice analytic consequences. For example, on a probability space, convergence in the $L^{p}$-norm implies convergence in the $L^{q}$-norm whenever $p \geq q$. (See http://en.wikipedia.org/wiki/Convergence_of_random_variables).
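To see why the $L^{p} \Rightarrow L^{q}$ implication holds for $q \leq p$, note that $t \mapsto t^{p/q}$ is convex, so Jensen's inequality gives \begin{eqnarray} \left( \mathbb{E}(|X_{n} - X|^{q}) \right)^{p/q} \leq \mathbb{E}(|X_{n} - X|^{p}). \end{eqnarray} If the right-hand side tends to $0$, then so does $\mathbb{E}(|X_{n} - X|^{q})$.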

Moreover, Markov's inequality applied to $|X_{n} - X|^{p}$ gives \begin{eqnarray} \mathbb{P}(|X_{n} - X| > \epsilon) \leq \epsilon^{-p} \, \mathbb{E}(|X_{n} - X|^{p}). \end{eqnarray} Thus, $L^{p}$-norm convergence implies convergence in probability. It does not, however, imply almost sure convergence: a standard counterexample is the "typewriter" sequence of indicators of shrinking subintervals of $[0,1]$, which converges to $0$ in $L^{1}$ but converges pointwise nowhere. What does hold is that some subsequence of $X_{n}$ converges to $X$ almost surely.
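As a quick numerical sanity check (a sketch with a made-up example: take $X_n = X + U_n/n$ with $U_n$ uniform on $[0,1]$, so $\mathbb{E}|X_n - X| = 1/(2n) \to 0$), one can estimate both sides of Markov's inequality for $p = 1$ by Monte Carlo:

```python
import random

random.seed(0)

def estimate(n, eps=0.1, trials=100_000):
    """Monte Carlo estimates of E|X_n - X| and P(|X_n - X| > eps)
    for the toy model X_n - X = U_n / n, U_n uniform on [0, 1]."""
    diffs = [random.random() / n for _ in range(trials)]   # samples of |X_n - X|
    mean_abs = sum(diffs) / trials                         # estimate of E|X_n - X|
    tail = sum(d > eps for d in diffs) / trials            # estimate of P(|X_n - X| > eps)
    return mean_abs, tail

for n in (1, 10, 100):
    mean_abs, tail = estimate(n)
    markov_bound = mean_abs / 0.1   # Markov with p = 1: P(... > eps) <= E|X_n - X| / eps
    # The empirical version of Markov's inequality holds exactly on the same samples.
    assert tail <= markov_bound
    print(f"n={n:>3}  E|X_n-X| ~ {mean_abs:.4f}  P(|X_n-X|>0.1) ~ {tail:.4f}  bound {markov_bound:.4f}")
```

Both the mean and the tail probability shrink as $n$ grows, with the tail always dominated by the Markov bound, as the inequality guarantees.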