Solved – UMVUE estimates of uniform distribution mean and width

estimation, umvue, uniform distribution

Given are the uniformly distributed samples
$$x_n \overset{\text{iid}}{\sim} \mathcal{U}\left(\mu-\frac{w}{2}, \mu+\frac{w}{2}\right)$$
for $n = 1, \ldots, N$. Then the UMVUE estimates of $\mu$ and $w$ are
\begin{align}
\hat\mu = \frac{1}{2} \Big( \max \{ x_n \} + \min \{ x_n \} \Big)
\end{align}
and
\begin{align}
\hat{w} = \frac{N+1}{N-1} \Big( \max \{ x_n \} - \min \{ x_n \} \Big).
\end{align}
I know that this is correct, but even with an extensive search I could not find a specific source for this exact statement or a short proof of it. I have found or derived a lot of related information, e.g., that $(\min \{ x_n \}, \max \{ x_n \})$ constitutes a complete sufficient statistic, the ML estimates, and the relation to the German tank problem, but not this specific result.

If you could provide me with a source or a short proof, that'd be amazing. Thank you.
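For reference, here is a quick Monte Carlo sketch (assuming NumPy; the true parameter values, sample size, and replication count are arbitrary illustrations) that supports the claim empirically:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Illustrative values (not from the question): true parameters and sample size
mu_true, w_true, N = 2.0, 3.0, 10
reps = 200_000

# Draw `reps` independent samples of size N from U(mu - w/2, mu + w/2)
x = rng.uniform(mu_true - w_true / 2, mu_true + w_true / 2, size=(reps, N))
x_min, x_max = x.min(axis=1), x.max(axis=1)

mu_hat = 0.5 * (x_max + x_min)                # midrange estimator of mu
w_hat = (N + 1) / (N - 1) * (x_max - x_min)   # rescaled range estimator of w

print(mu_hat.mean())  # close to 2.0, consistent with unbiasedness
print(w_hat.mean())   # close to 3.0, consistent with unbiasedness
```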

Best Answer

By the Lehmann–Scheffé theorem, an unbiased estimator that is a function of a complete sufficient statistic is the UMVUE. So it suffices to check that $\hat{\mu}$ and $\hat{w}$ are unbiased. This can be done by writing $X_n = w (U_n - 1/2) + \mu$ with $U_n \sim \mathrm{Unif}(0,1)$ and noting that the order statistics satisfy $U_{(k)} \sim \mathrm{Beta}(k, N-k+1)$.
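Spelling out the expectations sketched above (using the Beta mean $\mathbb{E}\big[U_{(k)}\big] = \frac{k}{N+1}$):
\begin{align}
\mathbb{E}[\hat\mu] &= \mu + \frac{w}{2}\Big( \mathbb{E}\big[U_{(N)}\big] + \mathbb{E}\big[U_{(1)}\big] - 1 \Big) = \mu + \frac{w}{2}\left( \frac{N}{N+1} + \frac{1}{N+1} - 1 \right) = \mu \\
\mathbb{E}[\hat w] &= \frac{N+1}{N-1}\, w \,\Big( \mathbb{E}\big[U_{(N)}\big] - \mathbb{E}\big[U_{(1)}\big] \Big) = \frac{N+1}{N-1}\, w \cdot \frac{N-1}{N+1} = w,
\end{align}
so both estimators are unbiased and, being functions of the complete sufficient statistic $(\min \{ x_n \}, \max \{ x_n \})$, they are the UMVUEs.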
