[Math] Probability generating function of negative binomial distribution proof

expectation, normal distribution, probability, probability distributions, proof-verification

So the textbook says:

Let $X_r \sim \mathrm{NB}(r,p)$. We can use probability generating functions to prove that

$$G_{X_r}(s) = \left(\frac{ps}{1-(1-p)s}\right)^r.$$

Let $X$ have the Geometric distribution with success probability $0 < p < 1$.

Then $p_k := (1-p)^{k-1}p$ for $k \ge 1$, and

$$G_X(s) = \sum_{k=1}^\infty (1-p)^{k-1}p\,s^k = ps\sum_{k=0}^\infty \big((1-p)s\big)^k = \frac{ps}{1-(1-p)s}.$$
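A quick sanity check, left implicit in the excerpt: the geometric series converges for $|(1-p)s| < 1$, and substituting $s = 1$ gives

$$G_X(1) = \frac{p}{1-(1-p)} = 1,$$

as any pgf must satisfy.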

Furthermore, for a positive integer $r$, let $X_r$ have the Negative Binomial distribution with parameters $(p,r)$.

Then $X_r$ has the same distribution as $X_1 + \cdots + X_r$, where $X_1, \ldots, X_r$ are i.i.d. Geometric$(p)$ random variables, so

$$G_{X_r}(s) = G_{X_1 + \cdots + X_r}(s) = \left(\frac{ps}{1-(1-p)s}\right)^r.$$
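The middle equality uses the standard fact (skipped in the excerpt) that the pgf of a sum of independent random variables is the product of their pgfs:

$$G_{X_1+\cdots+X_r}(s) = E\left(s^{X_1+\cdots+X_r}\right) = E\left(\prod_{i=1}^r s^{X_i}\right) = \prod_{i=1}^r E\left(s^{X_i}\right) = \left(\frac{ps}{1-(1-p)s}\right)^r.$$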

But it also says: we could use conditional expectation to find a recurrence relation between $G_{X_r}(s)$ and $G_{X_{r-1}}(s)$ to prove this.

But I don't see how to use conditional expectation to prove this. How does that argument work?

Best Answer

We can use conditional expectation as follows:

\begin{align}
G_{X_r}(s) &= E\left(s^{X_r}\right) \\
&= \sum_{n=1}^{\infty} P(\text{$n^{th}$ trial is first success})\, E\left(s^{X_r} \mid \text{$n^{th}$ trial is first success}\right) \\
&= \sum_{n=1}^{\infty} q^{n-1}p\, E\left(s^{n+X_{r-1}}\right) \qquad\qquad\qquad\text{where $q=1-p$} \\
&= E\left(s^{X_{r-1}}\right) \dfrac{p}{q} \sum_{n=1}^{\infty} (qs)^n \\
&= G_{X_{r-1}}(s)\, \dfrac{p}{q} \cdot \dfrac{qs}{1-qs} \\
&= \dfrac{ps}{1-qs}\, G_{X_{r-1}}(s) \qquad\qquad\qquad\qquad\text{which is the recurrence relation} \\
&= \left(\dfrac{ps}{1-qs}\right)^{r-1} G_{X_{1}}(s) \\
&= \left(\dfrac{ps}{1-qs}\right)^{r} \qquad\qquad\qquad\text{since $X_1\sim$ Geom$(p)$ has pgf $\dfrac{ps}{1-qs}$}.
\end{align}

The third line uses the fact that, given the first success occurs at trial $n$, the number of additional trials needed for the remaining $r-1$ successes is an independent copy of $X_{r-1}$, so conditionally $s^{X_r}$ has the same distribution as $s^{n+X_{r-1}}$. The last two lines iterate the recurrence $r-1$ times.
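As a sanity check (not part of the original answer): expanding the pgf with the generalized binomial series $(1-x)^{-r} = \sum_{j \ge 0} \binom{r+j-1}{j} x^j$ recovers the familiar negative binomial pmf under the "number of trials until the $r^{th}$ success" convention used above:

$$\left(\dfrac{ps}{1-qs}\right)^{r} = p^r s^r \sum_{j=0}^{\infty} \binom{r+j-1}{j} (qs)^j = \sum_{n=r}^{\infty} \binom{n-1}{r-1} p^r q^{n-r}\, s^n,$$

so the coefficient of $s^n$ is $P(X_r = n) = \binom{n-1}{r-1} p^r q^{n-r}$, as expected.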
