Probability Theory – Does Convergence in Distribution Imply Convergence of Expectation?

Tags: convergence-divergence, probability, probability-theory

If we have a sequence of random variables $X_1, X_2, \ldots$ that converges in distribution to $X$, i.e. $X_n \rightarrow_d X$, then is
$$
\lim_{n \to \infty} E(X_n) = E(X)
$$
correct?

I know that convergence in distribution implies $E(g(X_n)) \to E(g(X))$ when $g$ is a bounded continuous function. Can we apply this property here?

Best Answer

With your assumptions, the best you can get is via Fatou's Lemma: $$\mathbb{E}[|X|]\leq \liminf_{n\to\infty}\mathbb{E}[|X_n|]$$ (where we used the continuous mapping theorem to get that $|X_n|\Rightarrow |X|$). In general the inequality can be strict, so the expectations need not converge.
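To see that Fatou's inequality can be strict, here is a classic counterexample (my own illustrative sketch, not part of the answer above): let $X_n = n$ with probability $1/n$ and $X_n = 0$ otherwise. Then $X_n \Rightarrow X \equiv 0$, yet $\mathbb{E}[X_n] = 1$ for every $n$, so $\mathbb{E}[|X|] = 0 < 1 = \liminf_n \mathbb{E}[|X_n|]$.

```python
# Counterexample: X_n = n with probability 1/n, else 0.
# P(X_n = 0) = 1 - 1/n -> 1, so X_n converges in distribution
# to the point mass at 0, but E[X_n] = n * (1/n) = 1 for all n.

def expectation_Xn(n):
    # E[X_n] = n * P(X_n = n) + 0 * P(X_n = 0) = n * (1/n)
    return n * (1.0 / n)

def prob_Xn_is_zero(n):
    # P(X_n = 0); tends to 1, witnessing convergence in distribution to X = 0
    return 1.0 - 1.0 / n

for n in (10, 100, 1000):
    print(n, prob_Xn_is_zero(n), expectation_Xn(n))
```

The mass escaping to infinity (value $n$ on an event of probability $1/n$) is exactly what uniform integrability, discussed next, rules out.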

For a "positive" answer to your question, you need the sequence $(X_n)$ to be uniformly integrable: $$\lim_{\alpha\to\infty} \sup_n \int_{|X_n|>\alpha}|X_n|\,d\mathbb{P}= \lim_{\alpha\to\infty} \sup_n \mathbb{E} [|X_n|1_{|X_n|>\alpha}]=0.$$ Then one gets that $X$ is integrable and $\lim_{n\to\infty}\mathbb{E}[X_n]=\mathbb{E}[X]$.
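As a concrete instance (again a sketch of my own, with an example family not taken from the answer): if $X_n$ is uniform on $[0, 1 + 1/n]$, the family is bounded by $2$, hence uniformly integrable, and $X_n \Rightarrow X$ uniform on $[0,1]$ with $\mathbb{E}[X_n] = (1 + 1/n)/2 \to 1/2 = \mathbb{E}[X]$.

```python
# X_n ~ Uniform(0, 1 + 1/n): uniformly bounded, hence uniformly integrable,
# and the expectations converge to E[X] = 1/2 for X ~ Uniform(0, 1).

def expectation_uniform(n):
    # E[X_n] for X_n ~ Uniform(0, 1 + 1/n)
    return (1.0 + 1.0 / n) / 2.0

def ui_tail(n, alpha):
    # E[|X_n| 1_{|X_n| > alpha}] for X_n ~ Uniform(0, b) with b = 1 + 1/n:
    # integral of x / b over (alpha, b), which is 0 once alpha >= b.
    b = 1.0 + 1.0 / n
    if alpha >= b:
        return 0.0
    return (b * b - alpha * alpha) / (2.0 * b)

# For alpha >= 2 the tail term is 0 for every n, so
# sup_n E[|X_n| 1_{|X_n| > alpha}] vanishes: uniform integrability holds.
```

Contrast this with the counterexample $X_n = n \cdot 1_{\text{prob } 1/n}$, whose tail term $\mathbb{E}[|X_n|1_{|X_n|>\alpha}]$ equals $1$ for every $n > \alpha$, so the supremum over $n$ never decays.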

As a remark, to get uniform integrability of $(X_n)_n$ it suffices to have for example: $$\sup_n \mathbb{E}[|X_n|^{1+\varepsilon}]<\infty,\quad \text{for some }\varepsilon>0.$$
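The reason the moment bound suffices is the pointwise inequality $|X_n| \leq |X_n|^{1+\varepsilon}/\alpha^{\varepsilon}$ on $\{|X_n| > \alpha\}$, which gives $\mathbb{E}[|X_n|1_{|X_n|>\alpha}] \leq \sup_n \mathbb{E}[|X_n|^{1+\varepsilon}]/\alpha^{\varepsilon} \to 0$. A small Monte Carlo check of the bound (my own sketch, using $|Z|$ for a standard normal $Z$ and $\varepsilon = 1$):

```python
# Check E[|X| 1_{|X|>alpha}] <= E[|X|^(1+eps)] / alpha^eps by sampling.
# Here X = |Z| with Z standard normal, and eps = 1, so the right-hand
# side is E[Z^2] / alpha = 1 / alpha (approximately, by Monte Carlo).

import random

def tail_and_bound(alpha, eps=1.0, trials=200_000, seed=0):
    rng = random.Random(seed)
    xs = [abs(rng.gauss(0.0, 1.0)) for _ in range(trials)]
    # Empirical E[|X| 1_{|X| > alpha}]
    tail = sum(x for x in xs if x > alpha) / trials
    # Empirical E[|X|^(1+eps)] / alpha^eps
    bound = sum(x ** (1.0 + eps) for x in xs) / trials / alpha ** eps
    return tail, bound

tail, bound = tail_and_bound(alpha=3.0)
print(tail, bound)  # the tail term is dominated by the moment bound
```

Since the inequality $x \leq x^{1+\varepsilon}/\alpha^{\varepsilon}$ holds sample by sample whenever $x > \alpha$, the empirical tail is always dominated by the empirical bound, not just in expectation.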