Solved – Properties of moment-generating functions

conditional-probability, moment-generating-function, probability

I am new to statistics and happened to come across this property of MGFs:

Let $X$ and $Y$ be independent random variables, and let $Z$ be equal to $X$ with probability $p$ and equal to $Y$ with probability $1 - p$ (the choice being made independently of $X$ and $Y$). Then,
$$M_Z(s)= p M_X(s) + (1 − p) M_Y(s).$$

The given proof is simply

$$M_Z(s)= E[e^{s Z}]= p E[e^{s X}] + (1 − p)E[e^{s Y}]= p M_X (s) + (1 − p)M_Y (s)$$

But I do not understand this step. Can someone show me a full proof that makes the conditioning on the random choice between $X$ and $Y$ explicit, i.e., why the following holds:

$$M_Z(s)= E[e^{s Z}]= p E[e^{s X}] + (1 − p)E[e^{s Y}]$$

Thanks very much.

Best Answer

You may also see it this way: introduce another Bernoulli RV $\Theta$, independent of $X$ and $Y$, which is $0$ with probability $p$ and $1$ with probability $1-p$ (so $P(\Theta = 0) = p$ and $P(\Theta = 1) = 1-p$). This variable selects $X$ (when $\Theta = 0$) or $Y$ (when $\Theta = 1$).

Then, assuming $Z$ is discrete (the continuous case is analogous, with densities in place of PMFs), the PMF of $Z$ is
$$P(Z = k) = P(X = k, \Theta = 0) + P(Y = k, \Theta = 1) = P(X=k)P(\Theta = 0) + P(Y = k)P(\Theta = 1),$$
where the second equality uses the independence of $\Theta$ from $X$ and $Y$, so
$$P(Z = k) = p\, P(X = k) + (1-p)\, P(Y = k).$$

The MGF is then
$$\mathbf{E}[e^{s Z}] = \sum_{k \in \chi} e^{s k} P(Z = k) = \sum_{k \in \chi} \left[ p\, e^{s k} P(X = k) + (1-p)\, e^{s k} P(Y = k) \right],$$
where $\chi$ is the set of possible values of $Z$. Splitting the sum gives
$$\mathbf{E}[e^{s Z}] = p\, \mathbf{E}[e^{s X}] + (1-p)\, \mathbf{E}[e^{s Y}] = p\, M_X(s) + (1-p)\, M_Y(s).$$
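To make the conditioning the question asks for explicit, the same identity also follows in one line from the law of total expectation, with no discreteness assumption needed. Conditioning on $\Theta$,
$$M_Z(s) = \mathbf{E}[e^{sZ}] = \mathbf{E}[e^{sZ} \mid \Theta = 0]\,P(\Theta = 0) + \mathbf{E}[e^{sZ} \mid \Theta = 1]\,P(\Theta = 1).$$
On the event $\{\Theta = 0\}$ we have $Z = X$, and since $\Theta$ is independent of $X$, $\mathbf{E}[e^{sZ} \mid \Theta = 0] = \mathbf{E}[e^{sX}] = M_X(s)$; likewise $\mathbf{E}[e^{sZ} \mid \Theta = 1] = M_Y(s)$. Substituting gives $M_Z(s) = p\,M_X(s) + (1-p)\,M_Y(s)$.

If a numerical sanity check helps, here is a minimal Monte Carlo sketch. The particular distributions ($X$ standard normal with $M_X(s) = e^{s^2/2}$, $Y$ unit-rate exponential with $M_Y(s) = 1/(1-s)$ for $s < 1$), $p = 0.3$, and $s = 0.25$ are arbitrary choices for illustration, not part of the question:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choices (not from the question): X ~ Normal(0, 1), Y ~ Exp(1).
p, n, s = 0.3, 1_000_000, 0.25

x = rng.normal(0.0, 1.0, n)      # samples of X
y = rng.exponential(1.0, n)      # samples of Y
theta = rng.random(n) < p        # True with probability p -> pick X, else Y
z = np.where(theta, x, y)        # the mixture Z

empirical = np.exp(s * z).mean()                          # Monte Carlo estimate of M_Z(s)
closed_form = p * np.exp(s**2 / 2) + (1 - p) / (1 - s)    # p*M_X(s) + (1-p)*M_Y(s)

print(empirical, closed_form)    # should agree to roughly three decimal places
```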