[Math] Find UMVUE in a uniform distribution setting

parameter-estimation, statistical-inference, statistics, uniform-distribution

Let $X_1, …, X_n$ be independent and uniformly distributed on $(\theta_1-\theta_2,\theta_1+\theta_2)$ for $\theta_1 \in \mathbb{R}$, $\theta_2>0$.

Find UMVUEs for

a) $\theta_1$ and $\theta_2$,

b) $\theta_1/\theta_2$.

Naturally, I would like to use the Lehmann-Scheffé theorem that says:

If $V$ is a complete, sufficient statistic for $\theta$ and $\mathbb{E}_\theta[g(V)]=h(\theta)$ holds for all $\theta$, then $g(V)$ is a UMVUE for $h(\theta)$.

So first, I have to find such a statistic for any of my $\theta$s.

In the lecture we learned that $(X_{(1)}, X_{(n)})$ is sufficient for $(a,b)$ if the $X_i$ are uniform on $(a,b)$ (where $X_{(1)}\le\dots\le X_{(n)}$ are the order statistics of $X_1, \dots, X_n$).

So it follows that in my setting $(X_{(1)}, X_{(n)})$ is sufficient for $(\theta_1-\theta_2,\theta_1+\theta_2)$. But looking at the proof, $(X_{(1)}, X_{(n)})$ is also a sufficient statistic for $(\theta_1, \theta_2)$ itself, right?
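Indeed, sufficiency for $(\theta_1, \theta_2)$ can be read off directly from the factorization theorem; a sketch (my own derivation, not from the lecture):

```latex
% Joint density of X_1, ..., X_n i.i.d. Uniform(theta_1 - theta_2, theta_1 + theta_2):
f_{(\theta_1,\theta_2)}(x_1,\dots,x_n)
  = \frac{1}{(2\theta_2)^n}
    \,\mathbf{1}\{\theta_1 - \theta_2 < x_{(1)}\}
    \,\mathbf{1}\{x_{(n)} < \theta_1 + \theta_2\}
% The density depends on the sample only through (x_{(1)}, x_{(n)}),
% so by the factorization theorem this pair is sufficient for (theta_1, theta_2).
```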

So I have to check for completeness, i.e. show that for every function $g$ mapping the range of $V=(X_{(1)}, X_{(n)})$ to $\mathbb{R}$, $\mathbb{E}_\theta[g(X_{(1)}, X_{(n)})]=0$ for all $\theta=(\theta_1, \theta_2)$ implies $\mathbb{P}_\theta(g(X_{(1)}, X_{(n)})=0)=1$ for all $\theta$.

Here I'm stuck: How can I show that?

Best Answer

I don't think you can. At least if the $X_i$ are uniformly distributed on $(\theta,\theta+1)$, then $(X_{(1)}, X_{(n)})$ is sufficient, but not complete.

To see this, let's construct a function of the statistic whose expectation vanishes identically. First,

$\mathbb{E}_{\theta}[X_{(n)} - X_{(1)}] = \mathbb{E}_{\theta}[X_{(n)}] - \mathbb{E}_{\theta}[ X_{(1)}] = (\theta + n/(n+1)) - (\theta + 1/(n+1)) = (n-1)/(n+1)$
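For reference, the two expectations used here follow from the standard fact that uniform order statistics are Beta-distributed; a short derivation:

```latex
% For U_1, ..., U_n i.i.d. Uniform(0,1): U_{(k)} ~ Beta(k, n - k + 1),
% hence E[U_{(k)}] = k/(n+1).  Writing X_i = \theta + U_i with
% U_i i.i.d. Uniform(0,1) gives
\mathbb{E}_\theta[X_{(n)}] = \theta + \frac{n}{n+1},
\qquad
\mathbb{E}_\theta[X_{(1)}] = \theta + \frac{1}{n+1}
```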

Then $g$ defined as

$g(X_{(1)}, X_{(n)}) := X_{(n)} - X_{(1)} - (n-1)/(n+1)$

has $\mathbb{E}_\theta[g(X_{(1)}, X_{(n)})]=0$ for all $\theta$ without being almost surely $0$, so $(X_{(1)}, X_{(n)})$ is not complete in this setting.
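This can also be checked numerically; a minimal Monte Carlo sketch, where the sample size $n$, the $\theta$ values, and the replication count are arbitrary choices for the demonstration:

```python
import numpy as np

# Non-completeness illustration for X_i ~ Uniform(theta, theta + 1):
# g(X_(1), X_(n)) = X_(n) - X_(1) - (n-1)/(n+1) has expectation 0
# for every theta, yet is not almost surely 0 (positive variance).
rng = np.random.default_rng(1)
n, reps = 5, 200_000
c = (n - 1) / (n + 1)  # exact value of E[X_(n) - X_(1)]

means, variances = [], []
for theta in (-3.0, 0.0, 7.5):
    x = rng.uniform(theta, theta + 1.0, size=(reps, n))
    g = x.max(axis=1) - x.min(axis=1) - c
    means.append(g.mean())     # close to 0 for every theta
    variances.append(g.var())  # clearly positive, so g is not 0 a.s.

print(means, variances)
```

The positive variance is what rules out completeness: a complete statistic would force any such zero-mean $g$ to be degenerate at $0$.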

As I said, your setting is slightly different, though. Let me know if you find a definitive solution; I'm looking at the same problem.