[Math] Find UMVUE for $p$ when the pmf is $f(x; p) = p(1-p)^{x-1}$

estimation, estimation-theory, parameter-estimation, probability, statistics

Let $x \in \mathbb{N}$, and let the pmf of the random variable $X$ be $f(x; p) = p(1-p)^{x-1}$, i.e., $X$ is geometric with success probability $p$. Given a random sample $X_1, \dots, X_n$, I want to find the UMVUE for $p$.

It can be shown that $f$ belongs to the regular exponential class, and thus $S = \sum_{i=1}^n X_i$ is a complete sufficient statistic for $p$. Thus, if I can find an unbiased estimator for $p$ that is a function of $S$, that statistic will be the UMVUE for $p$.
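For concreteness, writing the pmf in exponential-family form,

$$ f(x; p) = \exp\left\{ x \ln(1-p) + \ln\frac{p}{1-p} \right\}, \qquad x = 1, 2, \dots, $$

shows that the natural sufficient statistic for a sample of size $n$ is $S = \sum_{i=1}^n X_i$, and the natural parameter $\ln(1-p)$ ranges over an open interval as $p$ varies in $(0,1)$, which is what gives completeness.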

Unfortunately, $E[S] = \frac{n}{p}$. Thus $E\left[\bar X\right] = \frac{1}{p}$, so $\bar X$ is the UMVUE for $\frac{1}{p}$, but it does not follow that $\frac{1}{\bar X}$ is the UMVUE for $p$.
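Indeed, $\frac{1}{\bar X}$ is not even unbiased: since $x \mapsto \frac{1}{x}$ is strictly convex on $(0, \infty)$ and $\bar X$ is non-degenerate, Jensen's inequality gives

$$ E\left[\frac{1}{\bar X}\right] > \frac{1}{E[\bar X]} = p, $$

so $\frac{1}{\bar X}$ overestimates $p$ on average.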

So then how can I get a UMVUE for $p$ from $S$?

Best Answer

Here's a common technique. First find any unbiased estimator for $p$. In this case, it's easy to see that,

$$ T(\mathbf{X}) = \begin{cases} 1 & X_1 = 1 \\ 0 & \text{otherwise} \end{cases} $$

is unbiased for $p$. By conditioning on the complete sufficient statistic, $\mathbb{E}[T(\mathbf{X}) \mid S]$ is unbiased, is a function of $S$, and must therefore be the UMVUE by the Lehmann-Scheffé theorem. Try to find a closed form for $\mathbb{E}[T(\mathbf{X}) \mid S]$. To get you started,

$$ \mathbb{E}[T(\mathbf{X}) \mid S = s] = \mathbb{P}\left(X_1 = 1 \,\middle|\, \sum_{i=1}^n X_i = s\right) $$
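For reference, carrying this computation through (a sum of $n$ independent Geometric($p$) variables is negative binomial, so $\mathbb{P}(S = s) = \binom{s-1}{n-1} p^n (1-p)^{s-n}$ for $s \ge n$) gives $\mathbb{E}[T(\mathbf{X}) \mid S = s] = \frac{n-1}{s-1}$, so the UMVUE is $\frac{n-1}{S-1}$. Below is a minimal Monte Carlo sketch, assuming NumPy (the parameter values, sample size, and seed are arbitrary), that checks unbiasedness of this estimator against the plug-in $\frac{1}{\bar X}$:

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, reps = 0.3, 10, 200_000

# reps independent samples of size n from a Geometric(p) on {1, 2, ...}
# (NumPy's geometric counts trials up to and including the first success).
x = rng.geometric(p, size=(reps, n))
s = x.sum(axis=1)

umvue = (n - 1) / (s - 1)  # Rao-Blackwellized estimator E[T | S]
naive = n / s              # plug-in 1 / x-bar, biased upward by Jensen

print("true p             :", p)
print("mean of (n-1)/(S-1):", umvue.mean())  # close to 0.3
print("mean of 1/x-bar    :", naive.mean())  # noticeably above 0.3
```

Note also that $\frac{n-1}{s-1}$ does not depend on $p$, as it must: sufficiency of $S$ guarantees the conditional distribution is free of the parameter.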