Convergence in distribution implies pointwise convergence of MGFs

moment-generating-functions · probability-theory · weak-convergence

Let $\mu_n, \mu$ be probability measures on $\mathbb{R}$ such that $\mu_n$ converges in distribution to $\mu$. Let $M_n(s) = \mathbb{E}(e^{sX_n})$ and $M(s) = \mathbb{E}(e^{sX})$ be the respective moment generating functions, where $X_n \sim \mu_n$ and $X \sim \mu$. Assume that each $M_n(s)$ is finite on a common interval $[-s_0, s_0]$, $s_0 > 0$. Does it follow that $M_n(s) \rightarrow M(s)$ on this interval?

I want to know whether this holds in order to prove a version of the continuity theorem for MGFs. The difficulty is that $x \mapsto e^{sx}$ is continuous but unbounded, unlike the integrand $e^{itx}$ in the characteristic-function case.

Best Answer

If the family $\{\exp\left(sX_n\right),\, n\geqslant 1\}$ is uniformly integrable, then the convergence $M_n(s)\to M(s)$ follows from the fact that uniform integrability together with convergence in distribution implies convergence of the expectations.
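One way to see this step (a sketch, via Skorokhod's representation theorem): since $\mu_n \to \mu$ in distribution, there exist random variables $\hat X_n \overset{d}{=} X_n$ and $\hat X \overset{d}{=} X$ on a common probability space with $\hat X_n \to \hat X$ almost surely. By continuity of $x \mapsto e^{sx}$,
$$e^{s\hat X_n}\xrightarrow{\text{a.s.}} e^{s\hat X},$$
and since uniform integrability depends only on the distributions, $\{e^{s\hat X_n}\}$ is uniformly integrable. Vitali's convergence theorem then upgrades almost-sure convergence to convergence in $L^1$, so
$$M_n(s)=\mathbb E\left[e^{s\hat X_n}\right]\longrightarrow\mathbb E\left[e^{s\hat X}\right]=M(s).$$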

This uniform integrability holds if we assume that for all $s\in (-s_0,s_0)$, $\sup_{n\geqslant 1}\mathbb E\left[\exp\left(sX_n\right)\right]$ is finite.
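Here is the standard argument for this step (a sketch; it uses the fact that boundedness in $L^p$ for some $p>1$ implies uniform integrability): fix $s\in(-s_0,s_0)$ with $s\neq 0$ and pick $p>1$ small enough that $ps\in(-s_0,s_0)$. Then
$$\sup_{n\geqslant 1}\mathbb E\left[\left(e^{sX_n}\right)^p\right]=\sup_{n\geqslant 1}\mathbb E\left[e^{psX_n}\right]=\sup_{n\geqslant 1}M_n(ps)<\infty,$$
so the family $\{e^{sX_n},\, n\geqslant 1\}$ is bounded in $L^p$ with $p>1$ and hence uniformly integrable; the case $s=0$ is trivial.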

However, the convergence may fail at the endpoint $s_0$. For example, take a sequence of positive random variables $(Y_n)$ which converges in distribution to $1$ but whose expectations do not converge to $1$: say $Y_n=n^2$ and $1$ with respective probabilities $1/n$ and $1-1/n$, so that $\mathbb EY_n=n+1-1/n\to\infty$, and let $X_n=\ln(Y_n)$. Then $M_n(s)=\mathbb E\left[Y_n^s\right]=n^{2s-1}+1-1/n$ is finite for all $s$ and all $n$, while the distributional limit $X=0$ has $M(s)=1$. Hence $M_n(s)\to M(s)$ exactly for $s<1/2$. In particular, with $s_0=1/2$ we have $\sup_{n\geqslant 1}M_n(s)\leqslant 2$ on $[-s_0,s_0]$, so the convergence holds on the open interval, yet $M_n(1/2)=2-1/n\to 2\neq 1=M(1/2)$.
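As a quick numerical sanity check of this counterexample (a sketch in Python; the closed form for $M_n(s)$ comes from the computation above, and the sample size in the Monte Carlo check is an arbitrary choice):

```python
import numpy as np

# Counterexample: Y_n = n^2 with prob. 1/n, else 1; X_n = ln(Y_n).
# Closed form: M_n(s) = E[Y_n^s] = n^(2s-1) + 1 - 1/n, while the limit X = 0 has M(s) = 1.
def M_n(n, s):
    return n ** (2 * s - 1) + 1 - 1 / n

for s in (0.25, 0.4, 0.5, 0.75):
    print(f"s = {s}:", [round(M_n(n, s), 4) for n in (10, 10**3, 10**6)])
# s < 1/2: values approach M(s) = 1;  s = 1/2: values approach 2;  s > 1/2: values blow up.

# Monte Carlo check of the closed form at one point (n = 1000, s = 0.25).
rng = np.random.default_rng(0)
n, s = 1000, 0.25
samples = np.where(rng.random(500_000) < 1 / n, float(n) ** 2, 1.0)  # draws of Y_n
print("Monte Carlo:", round((samples ** s).mean(), 4), "vs exact:", round(M_n(n, s), 4))
```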