Showing that the $r^{th}$ derivative of the moment generating function is the $r^{th}$ raw moment

expected-value, moment-generating-functions, probability

Is this a sound way of defining the $r^{th}$ derivative of the moment generating function $M_{X}(t)$, in order to show that evaluating it at $0$ gives the $r^{th}$ raw moment $E[X^{r}]$?

Possible Derivative Expression

$M_{X}^{(r)}(t)=\sum_{j<r} E\left(X^{j}\right) \frac{t^{j-r}}{(j-r) !}+E\left(X^{r}\right)+\sum_{j>r} E\left(X^{j}\right) \frac{t^{j-r}}{(j-r) !}$

So that $M_{X}^{(r)}(0)= E\left(X^{r}\right)$.

My question, then, is whether this is the right way of expressing the $r^{th}$ derivative of the Maclaurin series for the MGF.

Definitions:

Moment generating function

$M_{X}(t)=E\left(e^{t X}\right)$

Maclaurin Polynomial expression for MGF

$$
M_{X}(t)=1+t E(X)+\frac{t^{2}}{2 !} E\left(X^{2}\right)+\ldots+\frac{t^{r}}{r !} E\left(X^{r}\right)+\ldots=\sum_{j=0}^{\infty} \frac{E\left(X^{j}\right)}{j !} t^{j}
$$
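As a concrete sanity check on this definition, here is a small sketch using `sympy`. The Exponential(1) distribution is an assumed example (not from the question): its MGF is $M_X(t)=1/(1-t)$ for $t<1$, and its raw moments are $E[X^{r}]=r!$, so the $r^{th}$ derivative at $0$ should return $r!$.

```python
import sympy as sp

t = sp.symbols('t')

# Assumed example: MGF of an Exponential(1) random variable, valid for t < 1
M = 1 / (1 - t)

# M^(r)(0) should equal the r-th raw moment E[X^r] = r! for Exponential(1)
for r in range(5):
    deriv_at_0 = sp.diff(M, t, r).subs(t, 0)
    assert deriv_at_0 == sp.factorial(r)
    print(r, deriv_at_0)
```

The same check works for any distribution whose MGF exists in a neighborhood of $0$; only the closed-form moments used in the assertion change.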

Best Answer

Thanks @Alex and @BruceET.

To correct the original question: the $j<r$ terms disappear after differentiating $r$ times, so they should not appear in the expression at all.

This becomes clear when computing it by hand for, say, $r=3$. Each term $\frac{t^{j}}{j!}E\left(X^{j}\right)$ with $j < r$ is reduced to the constant $E\left(X^{j}\right)$ after $j$ differentiations, and the next derivative eliminates it, since the derivative of a constant is zero. In particular, the last such survivor is the constant $E\left(X^{r-1}\right)$, and $\frac{d}{dt}E\left(X^{r-1}\right)=0$.

So the correct formulation is then

$$M_{X}^{(r)}(t)=E\left(X^{r}\right)+\sum_{j=r+1}^{\infty} \frac{E\left(X^{j}\right) t^{j-r}}{(j-r) !}$$

or, equivalently,

$$ \sum_{j=r}^{\infty} \frac{E\left(X^{j}\right) t^{j-r}}{(j-r) !}, $$

in which only the $j=r$ term survives at $t=0$, giving $M_{X}^{(r)}(0)=E\left(X^{r}\right)$.
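The corrected formula can also be verified symbolically by differentiating the Maclaurin series term by term. This sketch uses placeholder symbols `m0, m1, ...` for the moments $E\left(X^{j}\right)$ and an arbitrary truncation point $N$ (both are illustrative choices, not part of the original derivation):

```python
import sympy as sp

t = sp.symbols('t')
r, N = 3, 8  # differentiate 3 times; truncate the series at j = N (arbitrary choices)
m = sp.symbols('m0:%d' % (N + 1))  # placeholder symbols for the moments E[X^j]

# Truncated Maclaurin series for the MGF: sum_j E[X^j] t^j / j!
series = sum(m[j] * t**j / sp.factorial(j) for j in range(N + 1))

# Differentiating r times kills every j < r term and shifts the rest:
# sum_{j>=r} E[X^j] t^(j-r) / (j-r)!
deriv = sp.diff(series, t, r)
shifted = sum(m[j] * t**(j - r) / sp.factorial(j - r) for j in range(r, N + 1))
assert sp.simplify(deriv - shifted) == 0

# At t = 0 only the j = r term survives
print(deriv.subs(t, 0))  # prints m3, i.e. E[X^3]
```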