UMVUE of $\frac{p}{1-p}$ when $X\sim bin(n,p)$

binomial distribution, expected value, parameter estimation, statistical-inference, statistics

Uniform Minimum Variance Unbiased Estimate of $\frac{p}{1-p}$ when $X\sim bin(n,p)$

Note: $Bin(n,p)$ is a one-parameter exponential family with complete sufficient statistic $X$. So if we can find $T(X)$ with $E[T(X)]=\frac{p}{1-p}$, then by the Lehmann–Scheffé theorem $T(X)$ is the UMVUE; it will not attain the Cramér–Rao bound, however, since no linear function of $X$ gives an unbiased estimator of $\frac{p}{1-p}$.
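
(For reference, a worked form of the exponential-family claim, which is just a rearrangement of the pmf:)

$$P_p(X=x)={n\choose x}p^x(1-p)^{n-x}={n\choose x}(1-p)^{n}\exp\!\left(x\log\tfrac{p}{1-p}\right),\qquad x=0,1,\dots,n,$$

so the natural parameter is $\log\frac{p}{1-p}$ and $X$ itself is the sufficient statistic.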

Then:

$$E[T(X)]=\sum_{t=0}^{n}T(t){n\choose t}p^t(1-p)^{n-t}=\frac{p}{1-p}$$
$\Longrightarrow$ (dividing both sides by $\frac{p}{1-p}$)
$$\sum_{t=0}^{n}T(t){n\choose t}p^{t-1}(1-p)^{n-(t-1)}=1$$

If we let $$T(t)=\frac{{n\choose t-1}}{{n\choose t}}$$

the desired equality follows, so

$$T(X)=\frac{{n\choose X-1}}{{n\choose X}}$$

is the UMVUE. Is my logic correct? Is there another way to find the UMVUE?

Best Answer

I think the UMVUE does not exist for $\frac{p}{1-p}$.

Suppose $T(X)$ were unbiased, i.e.

$$E[T(X)]=\sum_{t=0}^{n}T(t){n\choose t}p^t(1-p)^{n-t}=\frac{p}{1-p}$$

$$\sum_{t=0}^{n}T(t){n\choose t}\left(\frac{p}{1-p}\right)^t(1-p)^{n}=\frac{p}{1-p}$$

$$\sum_{t=0}^{n}T(t){n\choose t}\left(\frac{p}{1-p}\right)^t=\frac{p}{1-p}\cdot\frac{1}{(1-p)^{n}}$$

Substituting $\lambda=\frac{p}{1-p}$, so that $\frac{1}{1-p}=1+\lambda$, this becomes $$\sum_{t=0}^{n}T(t){n\choose t}\lambda^t =\lambda(1+\lambda)^n$$

Since $p$ ranges over $(0,1)$, $\lambda$ ranges over $(0,\infty)$, so we would need $$\sum_{t=0}^{n}T(t){n\choose t}\lambda^t =\lambda(1+\lambda)^n \qquad \forall\,\lambda>0,$$ but this cannot happen: two polynomials that agree on an interval are identical, and the maximum power of $\lambda$ on the two sides is not equal (at most $n$ on the left, $n+1$ on the right).
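
As a small sanity check (an illustrative sympy sketch, not part of the argument; $n$ is fixed at $3$, but the same mismatch occurs for every $n$), matching coefficients of $\lambda$ shows the system for $T(0),\dots,T(n)$ is inconsistent:

```python
import sympy as sp

n = 3
lam = sp.symbols('lambda', positive=True)
T = sp.symbols(f'T0:{n + 1}')          # unknown values T(0), ..., T(n)

lhs = sum(T[t] * sp.binomial(n, t) * lam**t for t in range(n + 1))
rhs = lam * (1 + lam)**n

# Equate coefficients of each power of lambda in lhs - rhs = 0.
# The coefficient of lambda**(n+1) is -1, so the system has no solution.
coeff_eqs = sp.Poly(lhs - rhs, lam).all_coeffs()
print(sp.solve(coeff_eqs, list(T)))    # -> []  (no unbiased T exists)
```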

Another way: it is easy to see that $\frac{p}{1-p}=-1+\frac{1}{1-p}$.

Neither $\frac{1}{p}$ nor $\frac{1}{q}=\frac{1}{1-p}$ is U-estimable: the expectation of any estimator, $\sum_{t=0}^{n}T(t){n\choose t}p^t(1-p)^{n-t}$, is a polynomial in $p$, and no polynomial can equal $\frac{1}{1-p}$ on all of $(0,1)$ (the latter is unbounded as $p\to 1$). Hence $\frac{p}{1-p}=-1+\frac{1}{1-p}$ is not U-estimable either.
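
For completeness, a small sympy sketch (illustrative only, with $n=5$ fixed) checking the estimator proposed in the question, $T(X)=\frac{{n\choose X-1}}{{n\choose X}}$: its expectation comes out to $\frac{p}{1-p}\left(1-p^{n}\right)$ rather than $\frac{p}{1-p}$, since the reindexed sum stops at $n-1$ and misses the $p^n$ term, so that estimator is biased (consistent with the argument above).

```python
import sympy as sp

n = 5
p = sp.symbols('p', positive=True)

# Proposed estimator T(t) = C(n, t-1) / C(n, t); sympy's binomial(n, -1) is 0, so T(0) = 0.
ET = sum(sp.binomial(n, t - 1) / sp.binomial(n, t)      # T(t)
         * sp.binomial(n, t) * p**t * (1 - p)**(n - t)  # P(X = t)
         for t in range(n + 1))

# The difference from (p/(1-p)) * (1 - p**n) simplifies to zero, so E[T(X)] != p/(1-p).
print(sp.simplify(ET - p / (1 - p) * (1 - p**n)))        # -> 0
```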