Bias Correction for MLE – Mean of Geometric Random Variable

Tags: bias, geometric-distribution, maximum-likelihood

Parameter estimation
For both variants of the geometric distribution, the parameter $p$ can be estimated by equating the expected value with the sample mean. This is the method of moments, which in this case happens to yield maximum likelihood estimates of $p$.[7][8]
Specifically, for the first variant let $k=k_{1}, \ldots, k_{n}$ be a sample where $k_{i} \geq 1$ for $i=1, \ldots, n$. Then $p$ can be estimated as
$$
\hat{p}=\left(\frac{1}{n} \sum_{i=1}^{n} k_{i}\right)^{-1}=\frac{n}{\sum_{i=1}^{n} k_{i}} .
$$
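As a quick numeric sketch of this formula (the helper name and sample values below are illustrative, not from the source):

```python
# Minimal sketch: MLE of p for the first variant (trial counts k_i >= 1).
def geometric_mle(sample):
    """p_hat = n / sum(k_i), the reciprocal of the sample mean."""
    return len(sample) / sum(sample)

sample = [3, 1, 2, 5, 1, 4]   # hypothetical observed trial counts
print(geometric_mle(sample))  # 6 / 16 = 0.375
```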

For either variant, the maximum likelihood estimator $\hat{p}_{\mathrm{mle}}$ has bias equal to
$$
b \equiv \mathrm{E}\left[\left(\hat{p}_{\mathrm{mle}}-p\right)\right]=\frac{p(1-p)}{n}
$$

which yields the bias-corrected maximum likelihood estimator
$$
\hat{p}_{\mathrm{mle}}^{*}=\hat{p}_{\mathrm{mle}}-\hat{b}
$$
where $\hat{b}=\hat{p}_{\mathrm{mle}}\left(1-\hat{p}_{\mathrm{mle}}\right)/n$ is the plug-in estimate of the bias.
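A minimal sketch of the bias-corrected estimator, assuming the plug-in bias estimate $\hat{b}=\hat{p}(1-\hat{p})/n$; the sample values are made up for illustration:

```python
# Hedged sketch: bias-corrected MLE p* = p_hat - b_hat.
def bias_corrected_mle(sample):
    n = len(sample)
    p_hat = n / sum(sample)          # maximum likelihood estimate
    b_hat = p_hat * (1 - p_hat) / n  # plug-in estimate of the bias p(1-p)/n
    return p_hat - b_hat

print(bias_corrected_mle([3, 1, 2, 5, 1, 4]))  # 0.375 - 0.0390625 = 0.3359375
```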

I think their bias correction is wrong.

Best Answer

I'll use a second-order Taylor series to investigate the claim. Let me know if I have made any mistakes.

$$
\frac{n}{\sum k_i}=g\Big(\sum k_i\Big)\approx g(n/p) + \Big(\sum k_i-n/p\Big)\frac{dg(\sum k_i)}{d\sum k_i}\Bigg|_{\sum k_i=n/p} + \frac{1}{2!}\Big(\sum k_i-n/p\Big)^2\frac{d^2g(\sum k_i)}{d(\sum k_i)^2}\Bigg|_{\sum k_i=n/p}
$$

Taking expectations, the first-order term vanishes because $E\big[\sum k_i\big]=n/p$, and since $g(s)=n/s$ gives $g''(s)=2n/s^3$, the second-order coefficient is $\frac{1}{2}g''(n/p)=\frac{n}{(n/p)^3}$, so

$E\Big[n/\sum k_i\Big]\approx p + \text{Var}\Big[\sum k_i\Big]\frac{n}{(n/p)^3}$

$\hspace{25mm}=p + \frac{n(1-p)}{p^2}\frac{n}{(n/p)^3} $

$\hspace{25mm}=p + \frac{p(1-p)}{n} $

$E\Big[n/\sum k_i-p\Big]\approx\frac{p(1-p)}{n}$
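The derivation can be checked numerically. The sketch below (function and parameter names are my own) draws many samples of size $n$ from a Geometric($p$) on $\{1,2,\dots\}$ and averages $\hat{p}-p$; the result should land near $p(1-p)/n$:

```python
import random

def mean_bias(p, n, trials=100_000, seed=1):
    """Monte Carlo estimate of E[p_hat - p] for p_hat = n / sum(k_i),
    with each k_i the number of Bernoulli(p) trials up to the first success."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        s = 0
        for _ in range(n):
            k = 1
            while rng.random() >= p:  # count failures until the first success
                k += 1
            s += k
        total += n / s - p
    return total / trials

p, n = 0.3, 10
print(mean_bias(p, n))  # close to p * (1 - p) / n = 0.021
```

For $n=10$ the second-order approximation is not exact, so the simulated bias differs from $p(1-p)/n$ by higher-order $O(1/n^2)$ terms, but it should agree to within a few thousandths.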
