[Math] Unbiased estimator for geometric distribution parameter p

probability, probability-distributions

I believe that the MLE of parameter $p$ in the geometric distribution, $\hat p = 1/(\bar x +1)$, is an unbiased estimator for $p$ and would like to prove it. So far, I have:

$E[\bar x + 1] = E[\bar x] + 1 = \frac{1}{p} - 1 + 1 = \frac{1}{p}$

I can see that the relationship is likely there, but I don't know how to work with the $(\bar x + 1)$ being in the denominator.

Best Answer

$1/(\bar{X}+1)$ is not an unbiased estimator of $p$ for the geometric$(p)$ distribution (number of failures before the first success), but $1/\left(\frac{n}{n-1}\bar{X}+1\right) = \frac{n-1}{n\bar{X}+n-1}$ is, for $n \ge 2$. Note that $Y := n\bar{X} = \sum_{i=1}^n X_i$ has a negative-binomial$(p,n)$ distribution (number of failures before the $n$-th success). Then

$$E\left[\frac{n-1}{Y+n-1}\right] = \sum_{y=0}^{\infty} \frac{n-1}{y+n-1} \binom{y+n-1}{n-1} p^n(1-p)^y = p\sum_{y=0}^{\infty} \binom{y+n-2}{n-2}p^{n-1}(1-p)^y = p,$$

as desired, since the last sum runs over the mass function of a negative-binomial$(p,n-1)$ distribution and therefore equals $1$. (The step uses the identity $\frac{n-1}{y+n-1}\binom{y+n-1}{n-1} = \binom{y+n-2}{n-2}$.)
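A quick Monte Carlo check makes the bias visible. The sketch below (assuming the failures-before-first-success parameterization used above; note NumPy's `geometric` counts trials, so we subtract 1) compares the MLE $1/(\bar X + 1)$ with the corrected estimator $(n-1)/(n\bar X + n - 1)$:

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, trials = 0.3, 5, 200_000

# NumPy's geometric counts trials until the first success (support 1, 2, ...);
# subtract 1 to get failures before the first success, as in the question.
x = rng.geometric(p, size=(trials, n)) - 1
xbar = x.mean(axis=1)

mle = 1 / (xbar + 1)                      # the MLE from the question
corrected = (n - 1) / (n * xbar + n - 1)  # (n-1)/(Y + n - 1), with Y = n*xbar

print(f"true p                 : {p}")
print(f"mean of MLE (biased)   : {mle.mean():.4f}")
print(f"mean of corrected est. : {corrected.mean():.4f}")
```

The MLE's sample mean comes out noticeably above $p$ (as Jensen's inequality predicts, since $x \mapsto 1/(x+1)$ is convex), while the corrected estimator's sample mean lands on $p$ up to Monte Carlo error.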