Probability – Finding Probability Using Moment-Generating Functions

I'm working through Schaum's Outline of Probability, Random Variables, and Random Processes, and am stuck on a question about moment-generating functions. If anyone has the 2nd edition, it is question 4.60, part (b).

The question gives the following initial information:
$$E[X^k] = 0.8 \quad \text{for } k = 1, 2, \ldots$$

The moment-generating function for this is the following:
$$0.2 + 0.8\sum_{k=0}^\infty\frac{t^k}{k!} = 0.2 + 0.8e^t$$
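For reference, this comes from plugging the given moments into the series expansion of the mgf (the $k=0$ term is $E[X^0]=1$):
$$M_X(t) = E[e^{tX}] = \sum_{k=0}^\infty \frac{t^k}{k!}E[X^k] = 1 + 0.8\sum_{k=1}^\infty\frac{t^k}{k!} = 1 + 0.8(e^t - 1) = 0.2 + 0.8e^t.$$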

The question asks to find $P(X=0)$ and $P(X=1)$. The answers are given as $P(X=0)=0.2$ and $P(X=1)=0.8$, but I'm not seeing how the book arrived at them.

Using the definition of the moment-generating function for a discrete random variable, I see that the following identity must hold: $$\sum_{i}e^{tx_i}\,p_X(x_i) = 0.2 + 0.8\sum_{k=0}^\infty\frac{t^k}{k!} = 0.2 + 0.8e^t $$

But I'm not seeing how the probabilities $p_X(x_i)$ are extracted from that equation.

Any help is greatly appreciated. Thanks.

Best Answer

Note that the random variable $Y$ which is $0$ with probability $0.2$ and $1$ with probability $0.8$ has the same mgf: to check, just compute $E(e^{tY})$ directly. Now apply the uniqueness theorem for mgfs.

As with the Laplace transform, we often recognize an mgf as a familiar one and thereby identify the distribution.

Detail: Let $Y$ be a Bernoulli random variable which is $0$ with probability $a$ and $1$ with probability $b$, where $a+b=1$. Then $$E(e^{tY})=a e^{0\cdot t}+be^{1\cdot t}=a+be^t.$$ Conversely, by the uniqueness theorem, a random variable whose distribution has mgf $a+be^t$ is a Bernoulli random variable, taking the value $0$ with probability $a$ and $1$ with probability $b$.
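If you want to double-check this, here is a minimal sympy sketch (not from the book, just a sanity check; the names `mgf_from_moments` and `mgf_bernoulli` are my own) confirming that the two expressions coincide:

```python
import sympy as sp

t = sp.symbols('t')

# mgf obtained from the moment series: 0.2 + 0.8 * e^t
mgf_from_moments = sp.Rational(1, 5) + sp.Rational(4, 5) * sp.exp(t)

# mgf of a Bernoulli variable with P(Y=0) = 0.2, P(Y=1) = 0.8,
# computed directly from the definition E[e^{tY}] = sum_y e^{ty} P(Y=y)
mgf_bernoulli = sp.Rational(1, 5) * sp.exp(0 * t) + sp.Rational(4, 5) * sp.exp(1 * t)

# the difference simplifies to 0, so the two mgfs are identical
print(sp.simplify(mgf_from_moments - mgf_bernoulli))  # -> 0
```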

We have found that our random variable $X$ has mgf $\,0.2+0.8e^t$. Therefore $X$ must be a Bernoulli random variable, with $\Pr(X=0)=0.2$ and $\Pr(X=1)=0.8$.
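As a quick consistency check, this Bernoulli distribution does reproduce the given moments:
$$E[X^k] = 0^k\cdot 0.2 + 1^k\cdot 0.8 = 0.8 \quad \text{for } k = 1, 2, \ldots$$
which is exactly the data we started with.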