I am using the definition of the negative binomial distribution from here. This is the same definition that Matlab uses. For convenience,
$$P(k) = {r + k -1 \choose k}p^r(1-p)^k ,$$
where $p$ is the probability of success. $P(k)$ is the probability of $k$ failures before $r$ successes. The probability generating function is supposed to be,
$$ g(x) = \left(\frac{p}{1-(1-p)x}\right)^r.$$
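Before proving it, the claimed closed form can be sanity-checked numerically. Below is a minimal sketch in Python (function names are mine) comparing a partial sum of $\sum_k P(k)x^k$ against the closed form, for parameters where the series converges ($|x(1-p)| < 1$):

```python
from math import comb

def pgf_partial(p, r, x, terms=200):
    """Partial sum of sum_k C(r+k-1, k) p^r ((1-p) x)^k."""
    return sum(comb(r + k - 1, k) * p**r * ((1 - p) * x)**k
               for k in range(terms))

def pgf_closed(p, r, x):
    """The claimed closed form (p / (1 - (1-p) x))^r."""
    return (p / (1 - (1 - p) * x))**r

# Need |(1-p) x| < 1 so the partial sum converges.
p, r, x = 0.4, 3, 0.9
print(pgf_partial(p, r, x), pgf_closed(p, r, x))  # the two should agree closely
```

As a further check, any PGF satisfies $g(1) = 1$, and indeed `pgf_closed(p, r, 1.0)` reduces to $(p/p)^r = 1$.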
I am trying to prove this myself. Steps so far:
$$g(x) = \sum^{\infty}_{k=0} P(k)\, x^k$$
$$= \sum^{\infty}_{k=0} {r + k -1 \choose k}p^r(1-p)^k \, x^k $$
$$ = p^r \sum^{\infty}_{k=0} {r + k -1 \choose k}(x(1-p))^k. $$
I suppose the next step would be to show that
$$ \sum^{\infty}_{k=0} {r + k -1 \choose k}(x(1-p))^k = \frac{1}{(1-x(1-p))^r}.$$
Is there a formula (or theorem) for infinite sums involving binomial coefficients that I can apply to get this?
Best Answer
Actually, what you want to show is
$$(1-y)^{-r} = \sum_{k=0}^{\infty} \binom{k+r-1}{k} y^k$$
You can see this using the generalized binomial theorem (the binomial series for a negative exponent):
$$(1-y)^{-r} = 1 + (-r) (-y) + \frac{1}{2!} (-r)(-r-1) (-y)^2 + \frac{1}{3!} (-r)(-r-1)(-r-2) (-y)^3+\ldots$$
This is really just the Maclaurin expansion of $(1-y)^{-r}$. Take a look at the coefficient of $y^k$:
$$\frac{1}{k!} (-1)^k (-r)(-r-1)\cdots(-r-k+1) = \frac{(r+k-1)(r+k-2)\cdots(r+1)(r)}{k!}$$
Now compare that to
$$\binom{k+r-1}{k} = \frac{(k+r-1)!}{k! (r-1)!} = \frac{(r+k-1)(r+k-2)\ldots(r+1)(r)}{k!} $$
Set $y = x(1-p)$, noting that $|y| < 1$ whenever $|x| \le 1$ and $0 < p \le 1$ so the series converges, and you are done.
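The key identity $(1-y)^{-r} = \sum_{k=0}^{\infty} \binom{k+r-1}{k} y^k$ can itself be verified numerically; here is a short sketch (the helper name is mine):

```python
from math import comb

def nb_series(y, r, terms=200):
    """Partial sum of sum_k C(k+r-1, k) y^k, valid for |y| < 1."""
    return sum(comb(k + r - 1, k) * y**k for k in range(terms))

r, y = 4, 0.3
print(nb_series(y, r), (1 - y)**(-r))  # partial sum vs. closed form
```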