[Math] UMVUE for $\theta^2$

conditional-expectation, probability, statistical-inference, statistics

Let $X_1,\dots,X_n$ be a random sample from a $\text{Normal}(\theta,1)$ distribution. Find the UMVUE for $\theta^2$.

What I've done so far:

I have already shown that $T=\sum_{i=1}^nX_i$ is a complete sufficient statistic for $\theta$. Let $\widehat{\theta^2}=\bar{X}^2-\frac{1}{n}$ be an estimator for $\theta^2$. Then

$$E[\widehat{\theta^2}]=E\left[\bar{X}^2-\frac{1}{n}\right]=E[\bar{X}^2]-E^2[\bar X]+E^2[\bar X]-\frac{1}{n}=\operatorname{Var}(\bar X)+E^2[\bar X]-\frac{1}{n}$$

I know that $\bar X$ has distribution $\text{Normal}(\theta, \frac{1}{n})$, so $\operatorname{Var}(\bar X)=\frac{1}{n}$ and $E[\bar X]=\theta$. It follows that $$E[\widehat{\theta^2}]=\frac{1}{n}+\theta^2-\frac{1}{n}=\theta^2.$$
Hence $\widehat{\theta^2}=\bar{X}^2-\frac{1}{n}$ is an unbiased estimator of $\theta^2$.
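
As a quick numerical sanity check of this unbiasedness claim, here is a minimal Monte Carlo sketch (assuming NumPy is available; the values of $\theta$, $n$ and the number of replications below are arbitrary choices for the check):

```python
import numpy as np

# Monte Carlo check that X_bar^2 - 1/n is (approximately) unbiased for theta^2.
# theta, n and reps are arbitrary values chosen only for this check.
rng = np.random.default_rng(0)
theta, n, reps = 2.0, 10, 200_000

samples = rng.normal(loc=theta, scale=1.0, size=(reps, n))  # reps samples of size n
xbar = samples.mean(axis=1)
estimates = xbar**2 - 1.0 / n

print("average estimate:", estimates.mean())  # should be close to theta^2 = 4
print("theta^2         :", theta**2)
```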

Now I have to compute $g(T)=E[\widehat{\theta^2}\mid T]$, and by Lehmann–Scheffé this will be the UMVUE for $\theta^2$. The problem is how to compute $$E[\widehat{\theta^2}\mid T]=E\!\left[\bar{X}^2-\frac{1}{n}\;\Big|\;\sum_{i=1}^nX_i\right].$$ I know I could find the joint density of $\bar{X}^2-\frac{1}{n}$ and $\sum_{i=1}^nX_i$, but is there an easier way to do it?

Or is there another way to find the UMVUE? I would really appreciate it if you could help me with this problem.

Best Answer

Generally, finding UMVUEs directly can be really tedious. Read a little about the concepts of ancillary statistics and Basu's theorem, which greatly simplify the computations; you will then handle the problem comfortably.

To give you an idea

If $T$ is an unbiased estimator of a parameter $\theta$, $V$ is a complete sufficient statistic, and the ratio $S=\dfrac{T}{V}$ is ancillary (its distribution does not depend on $\theta$), then by Basu's theorem $S$ is independent of $V$, and the UMVUE of $\theta$ is given by

$$E(T\mid V)=E\!\left(\frac{T}{V}\,V\;\Big|\;V\right)=E(SV\mid V)=V\,E(S\mid V)=V\,E(S),$$
where the last equality uses the independence of $S$ and $V$. Moreover, by the same independence, $E(T)=E(S)E(V)$, so $E(S)=\dfrac{E(T)}{E(V)}$.
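
To see how this plays out in your specific problem, note that since $\bar X = T/n$, the unbiased estimator $\bar{X}^2-\frac{1}{n}$ is already a function of the complete sufficient statistic $T=\sum_{i=1}^n X_i$, so the conditional expectation leaves it unchanged:

$$E\!\left[\bar{X}^2-\frac{1}{n}\;\Big|\;T\right]=\left(\frac{T}{n}\right)^2-\frac{1}{n}=\bar{X}^2-\frac{1}{n},$$

which by Lehmann–Scheffé is therefore the UMVUE of $\theta^2$, with no joint density computation required.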

For more information, please refer to the following source:

Boos, D. and Hughes-Oliver, J. (1998), "Applications of Basu's Theorem", The American Statistician.