A question about moment generating function

moment-generating-functions, probability, reference-request

I am reading lecture notes in which Proposition 6.3.1 says:

Suppose that the moment-generating function $m_Y(t)$ of a random variable $Y$ admits an expansion into a power series. Then the coefficients are related to the moments of $Y$ in the following way:
$$ m_Y(t)=\sum_{k=0}^\infty\frac{\mu_k}{k!}t^k, $$ where $\mu_k=\mathbb E[Y^k]$ is the $k$-th moment of $Y$.

And it is said: "A fully rigorous argument of this proposition is beyond the scope of these
notes".

But it is immediately followed by the line:

$$ m_Y(t)=\mathbb E[e^{tY}]=\mathbb E[\sum_{k=0}^\infty\frac{(tY)^k}{k!}]=\sum_{k=0}^\infty t^k\frac 1{k!}\mathbb E[Y^k]=\sum_{k=0}^\infty \frac {\mu_k}{k!}t^k. $$

I think this argument is already rigorous, by the Fubini–Tonelli theorem, if, say, $Y$ is nonnegative. But why does this also hold when $Y$ is an arbitrary random variable?

Best Answer

You are right about the positive case. In general, $Y$ need not have finite moments, and even when the moments exist you need conditions on them for this to hold.

Suppose $\mathbb E[e^{c|Y|}] <\infty$ for some $c >0$ (equivalently, $\sum_k \frac {\nu_k} {k!} c^{k} <\infty$, where $\nu_k=\mathbb E[|Y|^k]$ is the $k$-th absolute moment of $Y$). Then the dominated convergence theorem can be used to justify the equation $m_Y(t)=\sum_k \frac {\mu_k t^{k}} {k!}$ for $|t| <c$.
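As a sanity check of the expansion (not part of the original answer), here is a small numerical sketch for $Y\sim N(0,1)$, where $\mathbb E[e^{c|Y|}]<\infty$ for every $c>0$, the moments are $\mu_{2k}=(2k-1)!!$ and $\mu_{2k+1}=0$, and the exact MGF is $e^{t^2/2}$. The helper names `moment` and `mgf_series` are mine, chosen for illustration:

```python
import math

def double_factorial(n):
    # n!! = n * (n-2) * (n-4) * ... ; by convention (-1)!! = 1
    result = 1
    while n > 1:
        result *= n
        n -= 2
    return result

def moment(k):
    # k-th moment of the standard normal: 0 for odd k, (k-1)!! for even k
    return 0 if k % 2 else double_factorial(k - 1)

def mgf_series(t, terms=40):
    # Partial sum of sum_k mu_k t^k / k!
    return sum(moment(k) * t**k / math.factorial(k) for k in range(terms))

t = 1.3
exact = math.exp(t**2 / 2)   # MGF of N(0,1)
approx = mgf_series(t)
print(exact, approx)
```

For moderate $t$ the partial sums converge rapidly to $e^{t^2/2}$, consistent with the proposition.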
