Solved – Unbiased estimator with minimum variance for $1/\theta$

estimation, exponential-family, probability, self-study, unbiased-estimator

Let $X_1, \dots, X_n$ be a random sample from a $\mathrm{Geometric}(\theta)$ distribution, $0<\theta<1$. That is,

$$p_{\theta}(x)=\theta(1-\theta)^{x-1} I_{\{1,2,\dots\}}(x)$$

Find the unbiased estimator with minimum variance for $g(\theta)=\frac{1}{\theta}$.

My attempt:

Since the geometric distribution belongs to the exponential family, the statistic $\sum X_i$ is complete and sufficient for $\theta$. Also, $T(X)=X_1$ is an unbiased estimator of $g(\theta)$. Therefore, by the Rao–Blackwell and Lehmann–Scheffé theorems,
$$W(X) = E[X_1|\sum X_i]$$
is the estimator we are looking for.
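
To check that $X_1$ is indeed unbiased, recall the geometric series identity $\sum_{x\ge 1} x q^{x-1}=(1-q)^{-2}$ for $|q|<1$, so that
$$E[X_1]=\sum_{x=1}^{\infty} x\,\theta(1-\theta)^{x-1}=\frac{\theta}{\big(1-(1-\theta)\big)^2}=\frac{1}{\theta}=g(\theta).$$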

We have the following:

$$W(X) = \sum_{i=1}^t i\, P\Big(X_1=i \,\Big|\, \sum\nolimits_j X_j =t\Big) = \sum_{i=1}^t i\, \frac{P\big(\sum_{j \geq 2} X_j =t-i\big)\,P(X_1=i)}{P\big(\sum_{j \geq 1} X_j =t\big)}$$

Since the variables are iid geometric, both sums have negative binomial distributions. But I am having trouble simplifying the binomial coefficients to get a final answer in a nicer form, if that is possible. I would be glad to get some help.
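
For reference, the two negative binomial pmfs involved here are, for $t\ge n$ and $m\ge n-1$ respectively,
$$P\Big(\sum_{j=1}^{n} X_j=t\Big)=\binom{t-1}{n-1}\theta^{n}(1-\theta)^{t-n},\qquad P\Big(\sum_{j=2}^{n} X_j=m\Big)=\binom{m-1}{n-2}\theta^{n-1}(1-\theta)^{m-n+1}.$$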

Thanks!

Edit: I don't think my question was clear: I believe all my steps are correct; maybe I only forgot an indicator function somewhere. Here is what I did:

$$\cdots=\sum_{i=1}^{t} i\,\frac{\binom{t-i-1}{n-2}\theta^{n-1}(1-\theta)^{t-i-n+1}\,\theta(1-\theta)^{i-1}}{\binom{t-1}{n-1}\theta^{n}(1-\theta)^{t-n}}=\sum_{i=1}^{t} i\,\frac{\binom{t-i-1}{n-2}}{\binom{t-1}{n-1}}$$

As I said, I am having trouble simplifying this and pinning down the summation index.

Best Answer

Indeed, for a Geometric ${\cal G}(\theta)$ variate $X$,
$$\mathbb{E}_\theta[X]=1/\theta=g(\theta),$$
and the Rao–Blackwell and Lehmann–Scheffé theorems imply that
$$\hat{g}(T)=\mathbb{E}\left[X_1\,\Bigg|\,\sum_{i=1}^n X_i=T\right]$$
is the unique minimum variance unbiased estimator. But rather than trying to compute this conditional expectation directly, one can remark that, by exchangeability,
$$\mathbb{E}\left[X_1\,\Bigg|\,\sum_{i=1}^n X_i=T\right]=\cdots=\mathbb{E}\left[X_n\,\Bigg|\,\sum_{i=1}^n X_i=T\right],$$
hence that
$$\mathbb{E}\left[X_1\,\Bigg|\,\sum_{i=1}^n X_i=T\right]=\frac{1}{n}\sum_{i=1}^n \mathbb{E}\left[X_i\,\Bigg|\,\sum_{i=1}^n X_i=T\right]=\frac{T}{n}.$$

Note, incidentally, that since $\sum_{j\ge 2} X_j$ is Negative Binomial ${\cal N}eg(n-1,\theta)$,
$$\mathbb{P}\left(\sum_{j\ge 2} X_j=m\right)={m-1\choose n-2}\theta^{n-1}(1-\theta)^{m-n+1}\,\mathbb{I}_{m\ge n-1},$$
hence the final sum should be
$$\sum_{i=1}^{t-n+1} i\,{\binom{t-i-1}{n-2}}\bigg/{\binom{t-1}{n-1}}.$$
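
As a sanity check, the symmetry argument forces the combinatorial identity
$$\sum_{i=1}^{t-n+1} i\,\binom{t-i-1}{n-2}=\frac{t}{n}\binom{t-1}{n-1},$$
so the explicit sum indeed reduces to $t/n$; for instance, with $n=3$ and $t=4$, $1\cdot\binom{2}{1}+2\cdot\binom{1}{1}=4=\frac{4}{3}\binom{3}{2}$.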
