[Math] Variance of Negative Binomial Distribution (without Moment Generating Function)

Tags: negative-binomial, probability, probability-distributions, variance

Given the negative binomial distribution (the number of trials $X$ needed to obtain the $r$-th success) with probability mass function

$$P(X = n) = {n-1\choose r-1} (1-p)^{n-r}p^r, \qquad n \geq r$$
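As a quick numerical sanity check (an illustrative sketch, not part of the derivation; the parameter values and the helper name `nbinom_pmf` are my own choices), this PMF should sum to $1$ over $n \geq r$:

```python
from math import comb

def nbinom_pmf(n, r, p):
    """P(X = n) = C(n-1, r-1) (1-p)^(n-r) p^r, the probability that
    the r-th success occurs on trial n (n >= r)."""
    return comb(n - 1, r - 1) * (1 - p) ** (n - r) * p ** r

r, p = 3, 0.4  # arbitrary example parameters
total = sum(nbinom_pmf(n, r, p) for n in range(r, 500))
print(total)  # very close to 1 (the tail beyond n = 500 is negligible)
```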

It appears there are no derivations anywhere on the web of the variance formula $V(X) = \frac{r(1-p)}{p^2}$ that do not use the moment generating function.

I have managed to compute the mean without it, as follows:

\begin{align*}
\mu = \sum_{n\geq r} n{n-1\choose r-1} (1-p)^{n-r}p^r
&= \sum_{n\geq r} \frac{n(n-1)!}{(r-1)!(n-r)!}(1-p)^{n-r}p^r \\
&= \frac{r}{p} \sum_{n\geq r} \frac{n!}{r!(n-r)!}(1-p)^{n-r} p^{r+1}
\end{align*}
Having already factored out the claimed mean of $r/p$, it remains to show that $\sum_{n\geq r} \frac{n!}{r!(n-r)!}(1-p)^{n-r} p^{r+1} = 1$, which is done by reindexing (both $r$ and $n$) and recognizing the result as the sum of the probability mass function of another negative binomial distribution. Indeed, letting $k = r+1$ followed by $m = n+1$, we find

\begin{align*}
\sum_{n\geq r} \frac{n!}{r!(n-r)!}(1-p)^{n-r} p^{r+1} &= \sum_{n\geq k-1}\frac{n!}{(k-1)!(n-k+1)!}(1-p)^{n-k+1}p^k\\
&= \sum_{m\geq k}\frac{(m-1)!}{(k-1)!(m-k)!}(1-p)^{m-k}p^k\\
&= \sum_{m\geq k}{m-1\choose k-1}(1-p)^{m-k}p^k = 1
\end{align*}
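The mean just derived can likewise be sanity-checked numerically; a minimal sketch, assuming example parameters $r=3$, $p=0.4$ (the helper `nbinom_pmf` is my own name for the PMF above):

```python
from math import comb

def nbinom_pmf(n, r, p):
    # P(X = n) = C(n-1, r-1) (1-p)^(n-r) p^r for n >= r
    return comb(n - 1, r - 1) * (1 - p) ** (n - r) * p ** r

r, p = 3, 0.4  # arbitrary example parameters
mean = sum(n * nbinom_pmf(n, r, p) for n in range(r, 1000))
print(mean, r / p)  # both approximately 7.5
```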

Does anyone know of a way to demonstrate that $\sigma^2 = V(X) = \frac{r(1-p)}{p^2}$ in this fashion?

Best Answer

No need to reindex twice. The process is quite similar to the way you obtained $E[X]$.

The problem here is how to compute $E[X^2]$.

Please refer to Markus Scheuer's answer:

In his answer he derived
$$E(X)=rp^r\sum_{k=0}^{\infty}\binom{k+r}{k}(1-p)^k.$$
Following the same approach for $E[X^2]$,
$$E(X^2)=rp^r\sum_{k=0}^{\infty}(k+r)\binom{k+r}{k}(1-p)^k.$$
We need a way to change the factor $(k+r)$ into $(k+r-1)$, so that the outer $(k+r)$ can be combined with $(k+r-1)!$ to give $(k+r)!$. For this, use Pascal's rule:
$$\binom{n}{k}=\binom{n-1}{k-1}+\binom{n-1}{k}$$

$$ \begin{align*} (k+r)\binom{k+r}{k}&=(k+r)\binom{k+r-1}{k-1}+(k+r)\binom{k+r-1}{k}\\ &=(r+1)\binom{k+r}{k-1}+r\binom{k+r}{k} \end{align*} $$
So:
$$ \begin{align*} E(X^2)&=rp^r\sum_{k=0}^{\infty}(k+r)\binom{k+r}{k}(1-p)^k\\ &=rp^r\sum_{k=0}^{\infty}\left[(r+1)\binom{k+r}{k-1}+r\binom{k+r}{k}\right](1-p)^k\\ &=rp^r\sum_{k=0}^{\infty}(r+1)\binom{k+r}{k-1}(1-p)^k+rp^r\sum_{k=0}^{\infty}r\binom{k+r}{k}(1-p)^k \end{align*} $$
Then, as in Markus Scheuer's answer, evaluate each remaining sum with the binomial series expansion $\sum_{k\ge 0}\binom{k+s}{k}x^k=(1-x)^{-(s+1)}$.
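Both the coefficient identity and the series expansion used above can be spot-checked numerically (a throwaway sketch; the helper `C` and the example values are mine, and `C` exists only because `math.comb` rejects negative arguments):

```python
from math import comb

def C(n, k):
    # binomial coefficient with the convention C(n, k) = 0 for k < 0
    # (math.comb raises ValueError on negative k)
    return comb(n, k) if k >= 0 else 0

# Spot-check (k+r) C(k+r, k) = (r+1) C(k+r, k-1) + r C(k+r, k)
for r in range(1, 6):
    for k in range(0, 20):
        lhs = (k + r) * C(k + r, k)
        rhs = (r + 1) * C(k + r, k - 1) + r * C(k + r, k)
        assert lhs == rhs

# Spot-check the series sum_{k>=0} C(k+s, k) x^k = (1-x)^(-(s+1))
s, x = 4, 0.3  # arbitrary example values
series = sum(C(k + s, k) * x**k for k in range(300))
print(series, (1 - x) ** (-(s + 1)))  # the two agree to high precision
```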

Finally, you will get $$ \begin{align*} E(X^2)&=\frac{r(r+1-p)}{p^2} \end{align*} $$

$$ \begin{align*} V(X)&=E(X^2)-[E(X)]^2\\ &=\frac{r(r+1-p)}{p^2}-\frac{r^2}{p^2}\\ &=\frac{r(1-p)}{p^2} \end{align*} $$
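As a final sanity check (an illustrative sketch with arbitrarily chosen parameters, not part of the proof), the variance formula $\frac{r(1-p)}{p^2}$ can be confirmed by truncated summation of the PMF:

```python
from math import comb

def nbinom_pmf(n, r, p):
    # P(X = n) = C(n-1, r-1) (1-p)^(n-r) p^r for n >= r
    return comb(n - 1, r - 1) * (1 - p) ** (n - r) * p ** r

r, p = 3, 0.4  # arbitrary example parameters
ex = sum(n * nbinom_pmf(n, r, p) for n in range(r, 2000))
ex2 = sum(n**2 * nbinom_pmf(n, r, p) for n in range(r, 2000))
var = ex2 - ex**2

print(var, r * (1 - p) / p**2)  # both approximately 11.25
```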
