Moment generating function of a random variable dependent on a different one

moment-generating-functions, random-variables

Assume I have a random variable $Y \sim \mathrm{U}(0,1)$, and a random variable $X$ whose conditional distribution depends on $Y$:
$X\mid _{Y=p} \; \sim \mathrm{Bin}(n,p)$ (binomial with $n$ trials and success probability $p$).
How can I find the moment generating function of $X$?

My idea was to use the law of total expectation, $\mathrm{E}(X)=\mathrm{E}(\mathrm{E}(X\mid Y))$, but I get stuck in the calculations (maybe my calculus is lacking; if so, I would love to know the trick for solving it).

How would you approach and solve this type of question?

Any help is greatly appreciated.

Best Answer

Recall the definition of MGF:

$$M_X(t) = \operatorname{E}[e^{tX}].$$ So by the law of total expectation,

$$\operatorname{E}[e^{tX}] = \operatorname{E}[\operatorname{E}[e^{tX} \mid Y]].$$

Now think about what the inner expectation means: $\operatorname{E}[e^{tX} \mid Y]$ is the MGF of the conditional distribution of $X$ given $Y$, because given $Y = p$, $X$ is binomial with parameters $n$ and $p$. So $\operatorname{E}[e^{tX} \mid Y]$ is simply the MGF of a binomial distribution where $p$ has been replaced by $Y$:

$$\operatorname{E}[e^{tX} \mid Y] = (1 + (e^t - 1)Y)^n.$$
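As a quick sanity check on this step, the closed form $(1 + (e^t - 1)p)^n$ can be compared against the MGF computed directly from the binomial pmf. The following sketch does this for a few arbitrary values of $t$, $n$, and $p$ (the function names and the specific test values are mine, not from the problem):

```python
from math import comb, exp, isclose

def binom_mgf_direct(t, n, p):
    # E[e^{tX}] for X ~ Bin(n, p), summed directly over the pmf:
    # sum_k C(n, k) p^k (1-p)^(n-k) e^{tk}
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) * exp(t * k)
               for k in range(n + 1))

def binom_mgf_closed(t, n, p):
    # Closed form (1 + (e^t - 1) p)^n, from the binomial theorem
    return (1 + (exp(t) - 1) * p) ** n

# Spot-check agreement for a few arbitrary (t, n, p) triples
for t, n, p in [(0.5, 10, 0.3), (-1.0, 5, 0.7), (0.1, 20, 0.5)]:
    assert isclose(binom_mgf_direct(t, n, p), binom_mgf_closed(t, n, p))
```

Replacing the fixed $p$ by the random variable $Y$ in the closed form is exactly what the displayed formula above does.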

Then we take the expectation of this with respect to $Y$, which is continuous uniform on $[0,1]$:

$$M_X(t) = \operatorname{E}[(1 + (e^t - 1)Y)^n].$$ I leave this calculation as an exercise for the reader.
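If you want to check your answer to the exercise: since $Y$ has density $1$ on $[0,1]$, the expectation is an ordinary integral, and substituting $u = 1 + (e^t - 1)y$ (or just integrating the polynomial in $y$ directly) gives, for $t \neq 0$,

$$M_X(t) = \int_0^1 \bigl(1 + (e^t - 1)y\bigr)^n \, dy = \left[\frac{\bigl(1 + (e^t - 1)y\bigr)^{n+1}}{(n+1)(e^t - 1)}\right]_0^1 = \frac{e^{(n+1)t} - 1}{(n+1)(e^t - 1)},$$

with $M_X(0) = 1$. You may recognize this as the MGF of the discrete uniform distribution on $\{0, 1, \dots, n\}$: mixing a binomial over a uniform success probability makes $X$ uniform on $\{0, \dots, n\}$.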