Is the moment generating function smooth by definition?

derivatives, expected value, moment-generating-functions

So I'm going through Casella's Statistical Inference, and in Definition 2.3.6 he defines the moment generating function of a random variable $X$ with cdf $F_X$, denoted by $M_X(t)$, as $$M_X(t) = Ee^{tX},$$
provided that the expectation exists for $t$ in some neighborhood of $0$.
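
As a concrete illustration (my own example, not from the book), one can compute $M_X(t)$ symbolically for a standard normal $X$, where the defining integral converges for every $t$:

```python
# Compute M_X(t) = E[e^{tX}] for X ~ N(0,1) by direct integration.
import sympy as sp

t, x = sp.symbols('t x', real=True)
pdf = sp.exp(-x**2 / 2) / sp.sqrt(2 * sp.pi)  # standard normal density

# E[e^{tX}] = integral of e^{tx} * pdf(x) over the real line
M = sp.integrate(sp.exp(t * x) * pdf, (x, -sp.oo, sp.oo))
print(sp.simplify(M))  # exp(t**2/2), finite for all t
```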

In Theorem 2.3.7, he then states that if $X$ has an mgf $M_X(t)$, then the $n$th moment of $X$ equals the $n$th derivative of $M_X(t)$ evaluated at $0$.
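
A quick sanity check of that theorem (again my own example): for a rate-$1$ exponential, $M_X(t) = 1/(1-t)$ for $t < 1$, and the $n$th moment is $n!$:

```python
# Verify that the n-th derivative of M_X at 0 equals the n-th moment
# for X ~ Exponential(1), whose mgf is 1/(1 - t) for t < 1.
import sympy as sp

t = sp.symbols('t')
M = 1 / (1 - t)

for n in range(1, 5):
    deriv_at_0 = sp.diff(M, t, n).subs(t, 0)  # n-th derivative at 0
    print(n, deriv_at_0, sp.factorial(n))     # matches n! every time
```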

But why do all of these derivatives have to exist? Is it implicitly assumed without being stated, or does it somehow just follow from the definition above? If so, how?

Best Answer

These derivatives exist as soon as the moment generating function is finite on some open interval containing $0$ (i.e. $M_X(t)$ exists for all $t \in (-\varepsilon, \varepsilon)$ where $\varepsilon > 0$). Such details are often glossed over in textbooks because they can require more advanced machinery to prove.

In this case, a straightforward proof is possible. For $t_0 \in (-\varepsilon, \varepsilon)$, Taylor's theorem gives $$\frac1t \mathbb{E}[e^{(t_0 + t)X} - e^{t_0X}] = \mathbb{E}[e^{t_0X}X] + O(t)\,\mathbb{E}[X^2 e^{t_0X}],$$ so the difference quotient converges to $\mathbb{E}[e^{t_0X}X]$ as $t \to 0$, provided that $\mathbb{E}[X^2 e^{t_0X}]$ is finite. By Hölder's inequality, $\mathbb{E}[X^2 e^{t_0 X}] \leq \mathbb{E}[X^{2q}]^{1/q}\, \mathbb{E}[e^{p t_0 X}]^{1/p}$, where $p > 1$ is chosen so that $|t_0| p < \varepsilon$ and $1/q + 1/p = 1$.
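
The Hölder bound is easy to check numerically; here is a Monte Carlo sketch with $X \sim N(0,1)$, $t_0 = 0.3$, and $p = q = 2$ (all choices mine, for illustration only):

```python
# Check E[X^2 e^{t0 X}] <= E[X^{2q}]^{1/q} * E[e^{p t0 X}]^{1/p} by simulation.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal(1_000_000)
t0, p, q = 0.3, 2, 2  # p > 1 with |t0| * p < epsilon, 1/p + 1/q = 1

lhs = np.mean(X**2 * np.exp(t0 * X))
rhs = np.mean(X**(2 * q)) ** (1 / q) * np.mean(np.exp(p * t0 * X)) ** (1 / p)
print(lhs, rhs, lhs <= rhs)  # lhs ~ 1.14, rhs ~ 1.9: the bound holds
```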

Since the existence of the moment generating function on an open interval containing $0$ implies the existence of moments of all orders (because $|x|^n \leq n!\,\delta^{-n}(e^{\delta x} + e^{-\delta x})$ for any $\delta \in (0, \varepsilon)$), both factors on the right-hand side are finite, and it follows that $\frac{d}{dt}M_X \big|_{t_0} = \mathbb{E}[e^{t_0X} X]$. Repeating this argument gives $$\frac{d^n}{dt^n}M_X \big|_{t_0} = \mathbb{E}[e^{t_0X} X^n],$$ since the extra factor of $X^n$ does not change the argument. Evaluating at $t_0 = 0$ gives the result.
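
One can also see the identity $\frac{d}{dt}M_X \big|_{t_0} = \mathbb{E}[e^{t_0X} X]$ at work numerically. For a standard normal (my example), $M_X(t) = e^{t^2/2}$ and $\mathbb{E}[Xe^{t_0X}] = t_0 e^{t_0^2/2}$:

```python
# Compare a central difference quotient of M_X with E[X e^{t0 X}]
# for X ~ N(0,1), where both sides have closed forms.
import numpy as np

M = lambda t: np.exp(t**2 / 2)  # standard normal mgf
t0, h = 0.3, 1e-6

diff_quotient = (M(t0 + h) - M(t0 - h)) / (2 * h)
exact = t0 * np.exp(t0**2 / 2)  # E[X e^{t0 X}]
print(diff_quotient, exact)     # agree to ~1e-9
```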
