Solved – Rao-Blackwell unbiased estimator binomial distribution

mathematical-statistics, probability, sampling, unbiased-estimator

I have i.i.d. random variables $X_1, X_2, \ldots, X_n$ with pmf $P(X_i = x_i) = \binom{m}{x_i}\theta^{x_i}(1-\theta)^{m-x_i}$, $0 \leq x_i \leq m$.

I have the unbiased estimator $X_1/m$ and the sufficient statistic $T = \sum_i X_i$, and I want to use Rao-Blackwell to find a better unbiased estimator for $\theta$.

So I compute $\theta' = E(X_1/m \mid T)$:

$$\theta' = \frac{1}{m}\sum_k k\,P(X_1 = k \mid T = t) = \frac{1}{m}\sum_k k\,\frac{\binom{m}{k}\binom{mn-m}{t-k}}{\binom{mn}{t}}$$

But then I can't evaluate this sum. Have I gone wrong somewhere?

Best Answer

Update based on whuber's comment.

First, some notation. Let $T_{-1} = \sum_{i=2}^n X_i$, and note that $T \sim \text{Binom}(nm, \theta)$ and $T_{-1} \sim \text{Binom}((n-1)m, \theta)$. Moreover, $X_1$ and $T_{-1}$ are independent, since the $X_i$ are independent.

\begin{align*}
\phi(T) &= E(X_1/m \mid T = t) \\
&= \frac{1}{m}E(X_1 \mid T = t) \\
&= \frac{1}{m}\sum_{x=0}^m x\,P(X_1 = x \mid T = t) \\
&= \frac{1}{m}\sum_{x=0}^m x\,\frac{P(X_1 = x \cap T = t)}{P(T = t)} \\
&= \frac{1}{m}\sum_{x=0}^m x\,\frac{P(X_1 = x \cap T_{-1} = t - x)}{P(T = t)} \\
&= \frac{1}{m}\sum_{x=0}^m x\,\frac{P(X_1 = x)P(T_{-1} = t - x)}{P(T = t)} \\
&= \frac{1}{m}\sum_{x=0}^m x\,\frac{\binom{m}{x}\theta^x(1-\theta)^{m-x}\binom{(n-1)m}{t-x}\theta^{t-x}(1-\theta)^{(n-1)m-t+x}}{\binom{nm}{t}\theta^t(1-\theta)^{nm-t}} \\
&= \frac{1}{m}\sum_{x=0}^m x\,\frac{\binom{m}{x}\binom{nm-m}{t-x}}{\binom{nm}{t}} \\
&= \frac{1}{m}\sum_{x=0}^m x\,f(x; nm, m, t) \quad\text{where $f$ is the pmf of a hypergeometric random variable} \\
&= \frac{1}{m}E(X) \quad\text{where $X$ is a hypergeometric rv, so $E(X) = \frac{tm}{nm}$} \\
&= \frac{1}{m}\cdot\frac{tm}{nm} = \frac{t}{nm}
\end{align*}
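The key step, that the sum collapses via the hypergeometric mean, can be checked numerically. Here is a quick sketch using only Python's standard library; the parameter values `n = 4`, `m = 5` are arbitrary choices for illustration, not from the question:

```python
from math import comb

def rao_blackwell(n, m, t):
    """Directly evaluate (1/m) * sum_x x * C(m,x) * C(nm-m, t-x) / C(nm, t)."""
    total = sum(x * comb(m, x) * comb(n * m - m, t - x)
                for x in range(0, min(m, t) + 1))
    return total / (m * comb(n * m, t))

# Arbitrary small example: n = 4 i.i.d. draws from Binomial(m = 5, theta).
n, m = 4, 5
for t in range(n * m + 1):
    # The conditional expectation should equal t / (nm) for every t.
    assert abs(rao_blackwell(n, m, t) - t / (n * m)) < 1e-12
print("phi(t) = t/(nm) for every t")
```

(`math.comb` returns 0 when the lower index exceeds the upper, so terms with $t - x > nm - m$ vanish automatically, matching the binomial-coefficient convention used in the derivation.)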

Recalling that $t$ is the observed value of $T$, we get $\hat\theta_{\text{UMVUE}} = \frac{T}{nm}$, as expected.
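As a sanity check that $T/(nm)$ is unbiased, here is a small Monte Carlo sketch; the values $n = 5$, $m = 10$, $\theta = 0.3$ are arbitrary. It uses the fact that the sum of $n$ i.i.d. $\text{Binom}(m, \theta)$ draws is $\text{Binom}(nm, \theta)$, so $T$ can be simulated as $nm$ Bernoulli trials:

```python
import random

random.seed(0)
n, m, theta, reps = 5, 10, 0.3, 20000

def draw_T():
    # T = sum of n iid Binomial(m, theta) draws = nm Bernoulli(theta) trials.
    return sum(random.random() < theta for _ in range(n * m))

# Average the Rao-Blackwellized estimator T / (nm) over many replications.
mean_est = sum(draw_T() / (n * m) for _ in range(reps)) / reps
print(mean_est)  # should land close to theta = 0.3
```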
