[Math] Moment generating function of sum of independent random variables

density-function, moment-generating-functions

Let $X_1,\ldots,X_n$ be independent and identically distributed exponential random variables with density $f(x)=e^{-x}$.

a. What is the moment generating function of $X_1$?

b. What is the moment generating function of $Y=\sum_{i=1}^n X_i$?

c. What is the density function of $Y$?

For a., I get $M_{X_1}(t) = E(e^{X_1 t})=\int_{-\infty}^\infty e^{xt}e^{-x}\,dx$, but this doesn't make sense, since it would be infinite. Is there some way to tighten the bounds so that it becomes finite?

Then for b., if I compute $E(e^{Yt})$ the same way, I get a product of moment generating functions like the one above, but again, integrating from negative infinity to infinity is not finite.

For c. I know you're somehow supposed to use the result from b. to derive the density function, but I am not exactly sure how.

Best Answer

a) Your integration bounds are wrong: the density $e^{-x}$ is supported only on $[0,\infty)$, so $E[e^{tX}] = \int_0^\infty e^{tx} e^{-x}\, dx$.
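In case it's useful to see it carried through: the integral converges only for $t<1$, and evaluates to

$$M_{X_1}(t) = \int_0^\infty e^{tx} e^{-x}\,dx = \int_0^\infty e^{-(1-t)x}\,dx = \frac{1}{1-t}, \qquad t<1.$$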

b) When you add independent random variables, their moment generating functions multiply. This is just noting that $E[e^{t \sum_{i=1}^n X_i}] = E[e^{t X_1} \cdots e^{t X_n}] = E[e^{t X_1}] \cdots E[e^{t X_n}]$, where the first equality is a property of the exponential function and the last follows from the independence of $X_1,\ldots,X_n$ (note that they do not need to be identically distributed, just independent).
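Combined with part (a), this gives

$$M_Y(t) = \prod_{i=1}^n M_{X_i}(t) = \left(\frac{1}{1-t}\right)^n = (1-t)^{-n}, \qquad t<1.$$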

c) Look up the moment generating function of an Erlang distribution (or, more generally, a Gamma distribution) and compare it to the moment generating function you got in part (b). Alternatively, you can take an inverse Laplace transform of the moment generating function to recover the density, or substitute $t = i\omega$ to get the characteristic function and invert the Fourier transform.
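For comparison, the Gamma$(n,1)$ (Erlang) density is

$$f_Y(y) = \frac{y^{n-1} e^{-y}}{(n-1)!}, \qquad y>0,$$

and its moment generating function is

$$\int_0^\infty e^{ty}\,\frac{y^{n-1} e^{-y}}{(n-1)!}\,dy = (1-t)^{-n}, \qquad t<1,$$

which matches part (b); since a moment generating function that exists on an open interval around $0$ determines the distribution, this is the density of $Y$.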
