Completeness, UMVUE, MLE in uniform $(-\theta,2\theta)$ distribution

maximum-likelihood, parameter-estimation, probability-distributions, statistical-inference, statistics

Let $\theta >0$ be a parameter and let $X_1,X_2,\ldots,X_n$ be a random sample with pdf $f(x\mid\theta)=\frac{1}{3\theta}$ if $-\theta \leq x\leq 2\theta$ and $0$ otherwise.

a) Find the MLE of $\theta$.

b) Is the MLE a sufficient statistic for $\theta$?

c) Is the MLE a complete statistic for $\theta$?

d) Is $\frac{n+1}{n}\cdot \text{MLE}$ the UMVUE of $\theta$?

I've been able to solve a): the MLE of $\theta$ is $\max\left(-X_{(1)},\frac{X_{(n)}}{2}\right)$. This also answers b), since you can show the MLE is sufficient using the Factorization Theorem.
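A sketch of the argument: the likelihood is $$L(\theta)=\prod_{i=1}^n\frac{1}{3\theta}\,\mathbf 1_{\{-\theta\le X_i\le 2\theta\}}=(3\theta)^{-n}\,\mathbf 1_{\left\{\max\left(-X_{(1)},\,\frac{X_{(n)}}{2}\right)\le\theta\right\}},$$ which is decreasing in $\theta$ wherever it is positive, so it is maximized at the smallest admissible value $\hat\theta=\max\left(-X_{(1)},\frac{X_{(n)}}{2}\right)$. The same display is a factorization $g(\hat\theta,\theta)\cdot h(x)$ with $h\equiv 1$, which gives sufficiency.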

However, I cannot solve the remaining parts, I think because of the $\max$ in the MLE. Is there another way to express $\max\left(-X_{(1)},\frac{X_{(n)}}{2}\right)$? Can I express the MLE as $\frac{|X|_{(n)}}{2}$?

Best Answer

Find the distribution of $T=\max\left(-X_{(1)},\frac{X_{(n)}}{2}\right)$.

For $0<t<\theta$, we have \begin{align} P_{\theta}(T\le t)&=P_{\theta}\left(-t\le X_{(1)},\; X_{(n)}\le 2t\right) \\&=P_{\theta}\left(-t\le X_1,X_2,\ldots,X_n\le 2t\right) \\&=\left\{P_{\theta}\left(-t\le X_1\le 2t\right)\right\}^n \\&=\left(\frac{3t}{3\theta}\right)^n=\left(\frac{t}{\theta}\right)^n. \end{align}

So $T$ has density

$$f_T(t)=\frac{nt^{n-1}}{\theta^n}\mathbf1_{0<t<\theta}$$

In other words, $T$ is distributed exactly as $Y_{(n)}$, where $Y_1,\ldots,Y_n$ are i.i.d. $U(0,\theta)$ variables.

So studying the properties of $T$ as an estimator of $\theta$ reduces to studying the properties of $Y_{(n)}$.
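As a quick sanity check (a minimal simulation sketch, assuming NumPy; the parameter values are arbitrary), the empirical CDF of $T$ agrees with $(t/\theta)^n$:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 5, 100_000

# Draw reps samples of size n from Uniform(-theta, 2*theta)
x = rng.uniform(-theta, 2 * theta, size=(reps, n))

# T = max(-X_(1), X_(n)/2) for each sample
t = np.maximum(-x.min(axis=1), x.max(axis=1) / 2)

# Compare the empirical CDF of T with (t/theta)^n, the CDF of the
# maximum of n i.i.d. Uniform(0, theta) variables
grid = np.linspace(0.01, theta, 50)
empirical = (t[:, None] <= grid).mean(axis=0)
print(np.abs(empirical - (grid / theta) ** n).max())  # should be close to 0
```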

That $T$ is a complete (and minimal sufficient) statistic is proved in detail here. Since $\frac{n+1}{n}T$ is unbiased for $\theta$ and is a function of the complete sufficient statistic $T$, the Lehmann–Scheffé theorem shows that $\left(\frac{n+1}{n}\right)T$ is indeed the UMVUE of $\theta$.
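Explicitly, from the density above, $$E_{\theta}[T]=\int_0^{\theta} t\cdot\frac{nt^{n-1}}{\theta^n}\,dt=\frac{n}{n+1}\,\theta,$$ so $E_{\theta}\left[\frac{n+1}{n}\,T\right]=\theta$ for every $\theta>0$; that is, $\frac{n+1}{n}T$ is unbiased.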