[Math] MLE of Negative Binomial Distribution

maxima-minima, maximum-likelihood, parameter-estimation, probability

I want to find an estimator of the success probability $p$ of an independently repeated Bernoulli experiment, given that we observe exactly $k$ failures before the $r$-th success.

The probability for $k$ failures before the $r$-th success is given by the negative binomial distribution:

$$P_p[\{k\}] = {k + r - 1 \choose k}(1-p)^kp^r$$
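For reference, the formula can be cross-checked numerically; here is a minimal sketch (it assumes SciPy's `nbinom`, which uses this same "failures before the $r$-th success" parameterization):

```python
# Cross-check the pmf formula against scipy.stats.nbinom (assumed parameterization:
# k failures before the r-th success, success probability p).
from math import comb
from scipy.stats import nbinom

r, p = 5, 0.3                      # arbitrary example values
for k in range(10):
    manual = comb(k + r - 1, k) * (1 - p) ** k * p ** r
    assert abs(manual - nbinom.pmf(k, r, p)) < 1e-12
```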

This yields the $\log$-Likelihood function for the observed number of failures $k$:

$$l_k(p) = \log\left({k + r - 1 \choose k}\right) + k\log(1-p) + r\log(p)$$

With derivative

$$l_k'(p) = \frac{r}{p} - \frac{k}{1-p}$$

The derivative is zero at $\hat p = \frac{r}{r+k}$. To show that $\hat p$ is really an MLE for $p$, we need to show that it is a maximum of $l_k$. But evaluating the second derivative at this point is pretty messy. Is there an easier way to show that this is in fact an MLE for $p$?
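(For what it's worth, a quick numerical check, sketched below with arbitrary $r$ and $k$, does suggest that $\hat p$ is the maximizer; I am looking for an analytical argument.)

```python
# Numerical check (sketch): the log-likelihood l_k(p) should peak at r / (r + k).
import numpy as np
from scipy.optimize import minimize_scalar

r, k = 4, 10                       # arbitrary example values

def neg_loglik(p):
    # binomial-coefficient term dropped: it does not depend on p
    return -(k * np.log(1 - p) + r * np.log(p))

res = minimize_scalar(neg_loglik, bounds=(1e-9, 1 - 1e-9), method="bounded")
print(res.x, r / (r + k))          # both approximately 0.2857
```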

Best Answer

In general, the method of MLE is to maximize the likelihood $L(\theta;x_i)=\prod_{i=1}^n f(x_i;\theta)$. See here for instance. In the case of the negative binomial distribution we have

$$L(p;x_i) = \prod_{i=1}^{n}{x_i + r - 1 \choose x_i}p^{r}(1-p)^{x_i}$$

$$ \ell(p;x_i) = \sum_{i=1}^{n}\left[\log{x_i + r - 1 \choose x_i}+r\log(p)+x_i\log(1-p)\right]$$ $$\frac{d\ell(p;x_i)}{dp} = \sum_{i=1}^{n}\left[\dfrac{r}{p}-\frac{x_i}{1-p}\right]=\sum_{i=1}^{n} \dfrac{r}{p}-\sum_{i=1}^{n}\frac{x_i}{1-p}$$

Set the derivative to zero and add $\sum_{i=1}^{n}\frac{x_i}{1-p}$ to both sides:

$$\sum_{i=1}^{n} \dfrac{r}{p}=\sum_{i=1}^{n}\frac{x_i}{1-p}$$

$$\frac{nr}{p}=\frac{\sum\limits_{i=1}^nx_i}{1-p}\Rightarrow nr(1-p)=p\sum_{i=1}^n x_i\Rightarrow \hat p=\frac{nr}{nr+\sum\limits_{i=1}^n x_i}=\frac{r}{\overline x+r}$$
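As a quick plausibility check, here is a small simulation sketch (sample size and parameters are arbitrary; NumPy's `negative_binomial` draws the number of failures before the $r$-th success), comparing the closed form with a direct numerical maximization of $\ell$:

```python
# Sketch: compare the closed-form MLE p_hat = r / (x_bar + r) with a direct
# numerical maximization of the log-likelihood on simulated data.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
r, p_true, n = 5, 0.4, 2000                    # arbitrary example values
x = rng.negative_binomial(r, p_true, size=n)   # failures before the r-th success

def neg_loglik(p):
    # the binomial-coefficient term is omitted: it does not depend on p
    return -(n * r * np.log(p) + x.sum() * np.log(1 - p))

p_closed = r / (x.mean() + r)
p_numeric = minimize_scalar(neg_loglik, bounds=(1e-9, 1 - 1e-9), method="bounded").x
print(p_closed, p_numeric)                     # both close to p_true = 0.4
```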

Now we have to check that this stationary point is indeed a maximum. For this purpose we calculate the second derivative of $\ell(p;x_i)$.

$$\frac{d^2\ell(p;x_i)}{dp^2}=\underbrace{-\frac{rn}{p^2}}_{<0}\underbrace{-\frac{\sum\limits_{i=1}^n x_i}{(1-p)^2}}_{<0}<0\Rightarrow \hat p\textrm{ is a maximum}$$
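Since this bound holds for every $p\in(0,1)$, $\ell$ is strictly concave, so $\hat p$ is the unique global maximizer. A short grid evaluation (a sketch with arbitrary example values) illustrates the sign claim:

```python
# Sketch: evaluate the second derivative of ell on a grid of p values and
# confirm it is negative everywhere, so ell is strictly concave on (0, 1).
import numpy as np

n, r, sum_x = 50, 5, 310           # arbitrary example values
p = np.linspace(0.01, 0.99, 99)
d2 = -n * r / p**2 - sum_x / (1 - p)**2
print((d2 < 0).all())              # True
```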
