[Math] Finding UMVUE for uniform distribution $U(\alpha, \beta)$

estimation, statistics

Let $X = (X_1, X_2, \ldots, X_n)$ be a sample from the uniform distribution $U(\alpha, \beta)$, where $\alpha, \beta \in \mathbb{R}$ and $\alpha < \beta$. I am to find the UMVUEs of the parameters $\alpha$ and $\beta$.

Using the factorization theorem I showed that $T(X) = (\min\{ X_1, X_2, \ldots, X_n\}, \max\{ X_1, X_2, \ldots, X_n \})$ is a sufficient statistic.

I think that I should use the Lehmann–Scheffé theorem. Should I solve this problem by fixing $\beta$ and computing the UMVUE for $\alpha$, and then vice versa?

I tried to find an unbiased estimator for $\alpha$ first. Thinking of $\beta$ as fixed, and using $\mathbb{E}(X_1) = \frac{\alpha + \beta}{2}$, I calculated that
$$\mathbb{E}(2X_1 - \beta) = \alpha.$$
Thus, by the Lehmann–Scheffé theorem, the UMVUE would be
$$\mathbb{E}(2X_1 - \beta \mid T(X)).$$
How can I find the conditional expected value when $T(X)$ is a vector?

On the other hand, I fixed $\beta$ in order to find my unbiased estimator, so should I instead calculate $\mathbb{E}(2X_1 - \beta \mid X_{(1)})$?

I am a bit confused. I would appreciate any hints or tips.

Best Answer

$T=(X_{(1)},X_{(n)})$ is not only sufficient but a complete sufficient statistic, which is what is needed here; $X_{(k)}$ denotes the $k$th order statistic, $1\le k\le n$.

Since the $X_i$'s are i.i.d. $U(a,b)$ variables (writing $a=\alpha$ and $b=\beta$), the variables $Y_i=\frac{X_i-a}{b-a}$, $1\le i\le n$, are i.i.d. $U(0,1)$.

Now it is well known that $Y_{(1)}\sim \text{Beta}(1,n)$ and $Y_{(n)}\sim\text{Beta}(n,1)$ (for instance, $P(Y_{(n)}\le y)=y^n$ on $[0,1]$), implying $E(Y_{(1)})=\frac{1}{n+1}$ and $E(Y_{(n)})=\frac{n}{n+1}$. Since $X_{(k)}=a+(b-a)Y_{(k)}$, all you have to do is solve for $a$ and $b$ from the equations

$$E(X_{(1)})=\frac{b-a}{n+1}+a\\ E(X_{(n)})=\frac{(b-a)n}{n+1}+a$$

Solving gives $a$ and $b$ as functions of $E(X_{(1)})$ and $E(X_{(n)})$; replacing those expectations by $X_{(1)}$ and $X_{(n)}$ yields unbiased estimators that are functions of $T$, and these are the corresponding UMVUEs by the Lehmann–Scheffé theorem.
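To spell out the algebra (a sketch, not part of the original answer): subtracting the first equation from the second gives $E(X_{(n)})-E(X_{(1)})=\frac{(n-1)(b-a)}{n+1}$, and back-substituting yields

$$a = \frac{n\,E(X_{(1)}) - E(X_{(n)})}{n-1}, \qquad b = \frac{n\,E(X_{(n)}) - E(X_{(1)})}{n-1},$$

so the unbiased estimators, and hence the UMVUEs, are

$$\hat{\alpha} = \frac{n X_{(1)} - X_{(n)}}{n-1}, \qquad \hat{\beta} = \frac{n X_{(n)} - X_{(1)}}{n-1}.$$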
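As a quick numerical sanity check (purely illustrative; the endpoints, sample size, and variable names below are my own choices), a short NumPy simulation confirms that these estimators are unbiased:

```python
import numpy as np

rng = np.random.default_rng(0)
a_true, b_true = 2.0, 5.0   # true endpoints (illustrative)
n, reps = 10, 200_000       # sample size and Monte Carlo replications

# reps independent samples of size n from U(a_true, b_true)
X = rng.uniform(a_true, b_true, size=(reps, n))
x_min, x_max = X.min(axis=1), X.max(axis=1)

# The estimators derived above
alpha_hat = (n * x_min - x_max) / (n - 1)
beta_hat = (n * x_max - x_min) / (n - 1)

# Sample means should be close to a_true and b_true
print(alpha_hat.mean(), beta_hat.mean())
```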
