[Math] UMVUE Geometric Distribution

self-learning, statistical-inference

I am trying to find the UMVUE of the parameter $p$ based on $n$ i.i.d. observations from the geometric distribution with pmf

$P(X=x)=(1-p)^{x-1}p$ for $x=1,2,\ldots$ and $0<p<1$,

and found that:

Since $E[I(X_1=1)]=P(X_1=1)=p$, the indicator $w=I[X_1=1]$ is an unbiased estimator of $p$, and since $T=\sum_i X_i$ is a complete and sufficient statistic for the geometric distribution, I can improve my unbiased estimator as follows:

$E[w\mid\sum_i X_i=t] = P(X_1=1\mid\sum_{i=1}^n X_i=t) = \frac{P(X_1=1)\,P(\sum_{i=2}^{n} X_i=t-1)}{P(\sum_{i=1}^{n} X_i=t)}$,

where the numerator factors because $X_1$ and $\sum_{i=2}^{n}X_i$ are independent.
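(As a sanity check on this conditioning step: by sufficiency of $T=\sum_i X_i$, the conditional probability should not depend on $p$. A minimal simulation sketch, assuming NumPy; the values of $n$, $t$, and the two choices of $p$ are arbitrary illustrative picks.)

```python
import numpy as np

rng = np.random.default_rng(0)
n, t, reps = 5, 12, 400_000

# By sufficiency of T = sum_i X_i, the empirical P(X_1 = 1 | T = t) should
# come out (roughly) the same for different values of p.
for p in (0.3, 0.6):
    # NumPy's geometric has support {1, 2, ...} with pmf (1-p)^{x-1} p,
    # matching the parameterization above.
    x = rng.geometric(p, size=(reps, n))
    first = x[x.sum(axis=1) == t, 0]   # X_1 restricted to samples with T = t
    print(p, (first == 1).mean())      # empirical P(X_1 = 1 | T = t)
```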

So I have two questions now:

1. What is the pmf of $\sum_{i=2}^{n} X_i$? I know it is negative binomial, but I can't write it down correctly.
2. What is the variance of this improved unbiased estimator, and does it achieve the Cramér-Rao lower bound?

Best Answer

For the first question only:

$P(X_1=1)=p$

$P(\sum_{i=2}^{n}X_i=t-1)={t-2\choose n-2}p^{n-1}(1-p)^{t-n}$, $t=n,n+1,\ldots$ (a sum of $n-1$ i.i.d. geometric variables is negative binomial with parameters $n-1$ and $p$)

$P(\sum_{i=1}^{n}X_i=t)={t-1\choose n-1}p^{n}(1-p)^{t-n}$, $t=n,n+1,\ldots$

So

$E[w\mid\sum_i X_i=t]=\frac{p\,{t-2\choose n-2}p^{n-1}(1-p)^{t-n}}{{t-1\choose n-1}p^{n}(1-p)^{t-n}}=\frac{{t-2\choose n-2}}{{t-1\choose n-1}}=\frac{n-1}{t-1},$

and the UMVUE is $\hat p=\frac{n-1}{\sum_{i=1}^{n} X_i-1}$.
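(A quick Monte Carlo check of unbiasedness, again assuming NumPy; $n$ and $p$ are arbitrary illustrative values.)

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, reps = 5, 0.3, 200_000

# T = sum of n i.i.d. geometric(p) variables, one total per replication.
totals = rng.geometric(p, size=(reps, n)).sum(axis=1)
p_hat = (n - 1) / (totals - 1)   # the UMVUE derived above (totals >= n, so no division by zero)
print(p_hat.mean())              # should be close to p = 0.3
```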

For the CRLB you may look here.

But for the variance of the UMVUE:

$Var(\hat p)=\sum_{t=n}^\infty \left(\frac{n-1}{t-1}-p\right)^2 {t-1\choose n-1}p^n(1-p)^{t-n}$

I'm afraid I was not able to get a closed form; computing $E(\hat p^2)$ directly did not work either.
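(The series can at least be evaluated numerically. A sketch using only the standard library; the cutoff at $t=5000$ is an arbitrary choice, justified by the geometric decay of $(1-p)^{t-n}$. It uses the standard Fisher information $1/(p^2(1-p))$ per observation for this parameterization, which gives the CRLB $p^2(1-p)/n$.)

```python
from math import comb

# Truncated numerical evaluation of the variance series above, compared with
# the Cramer-Rao lower bound p^2 (1-p) / n.
n, p = 5, 0.3
var = sum(((n - 1) / (t - 1) - p) ** 2
          * comb(t - 1, n - 1) * p**n * (1 - p) ** (t - n)
          for t in range(n, 5000))
print(var, p**2 * (1 - p) / n)  # expect var > CRLB
```

One would expect the sum to come out strictly above the bound: the CRLB is attained only when the estimator is an affine function of the score, and $\frac{n-1}{t-1}$ is not affine in $t$.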

Maybe somebody else can step in.