[Math] Finding the probability generating function and the sum of two independent random variables

generating-functions, probability, random-variables

Let $X$ be a discrete random variable with probability mass function

$$P_X(x) = p(1-p)^x,\qquad x=0,1,2,3,\ldots$$

(a) Find the probability generating function for $X$ and hence find its variance.

(b) $X_1$ and $X_2$ are independent random variables with probability generating functions $e^{λ_1(t-1)}$ and $e^{λ_2(t-1)}$ respectively. Show that the probability generating function of $X_1-X_2$ is $e^{(λ_1t+λ_2t^{-1})-(λ_1+λ_2)}$ and hence find its expected value.

For part (a) I think you let $1-p=q$, so the p.g.f. is the sum of $pq^xt^x$ over $x$, which gives $p/(1-(1-p)t)$. I am unsure how to find the mean/variance from this. Do you just substitute $t=1$ and $t=2$?
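
Writing that sum out explicitly (with $q = 1-p$ and assuming $|qt|<1$ so the geometric series converges):

$$G_X(t) = \sum_{x=0}^{\infty} p\,q^x t^x = p\sum_{x=0}^{\infty}(qt)^x = \frac{p}{1-qt} = \frac{p}{1-(1-p)t}.$$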

For part (b) I have found good proofs for $X_1+X_2$ and have tried to adapt one, but I get down to $e^{(λ_1(t-1))/(λ_2(t-1))}$ and am unsure whether I have gone wrong or whether this can be simplified to the given answer, because it looks close.

Best Answer

If your generating function is $G_X(t)$, you need to take derivatives with respect to $t$ and evaluate them at $t=1$. If there is a problem with the radius of convergence being exactly $1$ (not the case here), you may need to take the limit as $t$ approaches $1$ from below.

So if $G(t)= \sum_x p_x t^x $

then $G'(t)= \sum_x x p_x t^{x-1} $

and $G''(t)= \sum_x x(x-1) p_x t^{x-2} $

so $E[X] = G'(1)$ and $E[X(X-1)] = G''(1)$ meaning $Var(X)= G''(1)+ G'(1) - \left( G'(1)\right)^2$. Just apply these to your probability generating functions.
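
As a worked sketch of the method applied to part (a), with $G_X(t) = \dfrac{p}{1-qt}$ and $q=1-p$ as in the question:

$$G'(t) = \frac{pq}{(1-qt)^2},\qquad G''(t) = \frac{2pq^2}{(1-qt)^3},$$

so $G'(1) = \dfrac{q}{p}$ and $G''(1) = \dfrac{2q^2}{p^2}$, giving

$$E[X] = \frac{q}{p} = \frac{1-p}{p},\qquad Var(X) = \frac{2q^2}{p^2} + \frac{q}{p} - \frac{q^2}{p^2} = \frac{q}{p^2} = \frac{1-p}{p^2}.$$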
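
The same recipe handles part (b) (a sketch using the p.g.f. stated in the question): by independence, $E[t^{X_1-X_2}] = E[t^{X_1}]\,E[t^{-X_2}] = e^{λ_1(t-1)}e^{λ_2(t^{-1}-1)} = e^{(λ_1t+λ_2t^{-1})-(λ_1+λ_2)}$, which is the stated generating function. Differentiating it,

$$G'(t) = \left(λ_1 - λ_2 t^{-2}\right)e^{(λ_1t+λ_2t^{-1})-(λ_1+λ_2)},\qquad E[X_1-X_2] = G'(1) = λ_1 - λ_2.$$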