[Math] Sufficient Statistics, MLE and Unbiased Estimators of Uniform Type Distribution


Let $X_1, \dots, X_n$ denote a random sample of size $n$ from the probability distribution with pdf:

$$ f_X(x\mid\theta_1, \theta_2) = \frac{1}{\theta_2 - \theta_1} \, I_{[\theta_1,\theta_2]}(x) \, I_{(-\infty,\theta_2)}(\theta_1) \, I_{(\theta_1,\infty)}(\theta_2)\;.$$

(1) Find a pair of sufficient statistics for $(\theta_1, \theta_2)$.

${\bf\text{My thoughts:}}$ This wasn't too bad. I got $(X_{(1)}, X_{(n)})$ for this part.

(2) Find the maximum likelihood estimator $(\hat{\theta}_1, \hat{\theta}_2)$ for $(\theta_1, \theta_2)$.

${\bf\text{My thoughts:}}$ I think I need to use monotone functions, since there are two parameters and the observations must lie inside the interval. I believe that $\frac{X_{(1)} + X_{(n)}}{2}$ will become one of my estimators.

(3) Show that $\frac{X_{(1)} + X_{(n)}}{2}$ is an unbiased estimator for $\frac{\theta_1 + \theta_2}{2}$.

${\bf\text{My thoughts:}}$ I think I will need to use the Cramer-Rao lower bound in some form, but I'm not quite sure if that is right.

(4) Construct an unbiased estimator for $\theta_2 - \theta_1$.

${\bf\text{My thoughts:}}$ Very stuck on this part, but I think I can use some information from previous parts to help me.

Any help is greatly appreciated.

Best Answer

$(1)$ Correct.

$(2)$ Since the density is $1/(\theta_2-\theta_1)$, the farther apart $\theta_1$ and $\theta_2$ are, the lower the probability of obtaining the given data. Thus the maximum likelihood estimate is given by the values of $\theta_1$ and $\theta_2$ that are closest together while still compatible with the data, $(\hat\theta_1,\hat\theta_2)=(\def\xa{X_{(1)}}\xa,\def\xb{X_{(n)}}\xb)$.
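In symbols, a sketch of this argument: the likelihood factors as

$$ L(\theta_1,\theta_2) = \prod_{i=1}^{n} \frac{I_{[\theta_1,\theta_2]}(X_i)}{\theta_2-\theta_1} = \frac{1}{(\theta_2-\theta_1)^n}\, I_{(-\infty,\,X_{(1)}]}(\theta_1)\, I_{[X_{(n)},\,\infty)}(\theta_2)\,, $$

which, wherever it is nonzero, increases as $\theta_2-\theta_1$ shrinks, so it is maximized by pushing $\theta_1$ up to $X_{(1)}$ and $\theta_2$ down to $X_{(n)}$.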

$(3)$ This is true by symmetry: the expectation is clearly finite, and it cannot be anything other than the midpoint, since otherwise the value reflected about the midpoint would be an equally valid candidate.
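To spell the symmetry out: if $Y_i = \theta_1+\theta_2-X_i$, then the $Y_i$ are again i.i.d. with the same uniform density, and $Y_{(1)}+Y_{(n)} = 2(\theta_1+\theta_2)-(X_{(1)}+X_{(n)})$. Taking expectations,

$$ E\!\left[\frac{X_{(1)}+X_{(n)}}{2}\right] = E\!\left[\frac{Y_{(1)}+Y_{(n)}}{2}\right] = (\theta_1+\theta_2) - E\!\left[\frac{X_{(1)}+X_{(n)}}{2}\right], $$

and solving gives $E\bigl[\frac{X_{(1)}+X_{(n)}}{2}\bigr] = \frac{\theta_1+\theta_2}{2}$.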

$(4)$ Compute the expectation of $\xb-\xa$. This must be linear in $\theta_2-\theta_1$, with the coefficient depending only on $n$, so you can scale $\xb-\xa$ up accordingly to get an unbiased estimator for $\theta_2-\theta_1$.
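As a numerical sanity check (a sketch, not part of the original answer): the standard order-statistic computation gives $E[\xb-\xa] = \frac{n-1}{n+1}(\theta_2-\theta_1)$, so scaling the range by $\frac{n+1}{n-1}$ should yield an unbiased estimator. A quick Monte Carlo simulation, with arbitrarily chosen $\theta_1=2$, $\theta_2=5$:

```python
import numpy as np

rng = np.random.default_rng(0)

theta1, theta2 = 2.0, 5.0   # true parameters (arbitrary choice)
n, reps = 10, 200_000       # sample size and Monte Carlo replications

samples = rng.uniform(theta1, theta2, size=(reps, n))
sample_range = samples.max(axis=1) - samples.min(axis=1)  # X_(n) - X_(1)

# E[X_(n) - X_(1)] = (n-1)/(n+1) * (theta2 - theta1), so rescale:
estimator = (n + 1) / (n - 1) * sample_range

print(sample_range.mean())  # close to (9/11)*3, i.e. about 2.45
print(estimator.mean())     # close to theta2 - theta1 = 3
```

The averages match the claimed linear relationship, confirming that the rescaled range is (approximately, up to Monte Carlo error) unbiased for $\theta_2-\theta_1$.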
