[Math] Maximum likelihood estimator for uniform distribution $U(-\theta, 0)$

Tags: maximum likelihood, statistics

Consider $X_1, X_2, \ldots, X_n$ i.i.d. $U(-\theta, 0)$.

I want to find the maximum likelihood estimator of $\theta$.

I know that $f(x,\theta)=\frac{1}{\theta}$ for $-\theta < x < 0$ and that $L_n(\theta, x)= \frac{1}{\theta^n}$.

If we were looking at $U(0,\theta)$, the MLE of $\theta$ would be $x_{(n)}$, because $L_n(\theta, x)= \frac{1}{\theta^n}$ is decreasing in $\theta$ and requires $\theta \ge x_i$ for every $i$, so it is maximized at the smallest admissible value of $\theta$, namely the max $x_i$, which is $x_{(n)}$.
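A quick numerical sanity check of this familiar case (a minimal sketch; the sample size, true parameter, seed, and search grid are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
theta_true = 2.0                         # arbitrary "true" parameter for the check
x = rng.uniform(0, theta_true, size=50)  # i.i.d. sample from U(0, theta)

# L_n(theta; x) = theta^(-n) if theta >= x_(n), and 0 otherwise
grid = np.linspace(0.1, 4.0, 10_000)
L = np.where(grid >= x.max(), grid ** (-x.size), 0.0)

# The grid maximizer agrees with x_(n) up to the grid spacing.
print(grid[np.argmax(L)], x.max())
```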

For my case, since $L_n(\theta, x)= \frac{1}{\theta^n}$ is an increasing function for $-\theta < x < 0$, it will be maximized at the max $x_i$, and thus the MLE of $\theta$ will be $x_{(n)}$ as well.

I think this is correct, but it seems very silly to me that for both cases you can just say that it will be maximized at the max $x_i$. Could someone better explain this to me?

Best Answer

Note that the likelihood function is a function of $\theta$. In particular, $$L_n(\theta;\vec X) = \begin{cases}\dfrac{1}{\theta^n} & \text{if } \theta \ge -X_i \text{ for } i=1,2,\ldots,n, \\ 0 & \text{otherwise.}\end{cases}$$ Here $\theta \ge -X_i$ comes from $-\theta \le X_i$, i.e., from each observation lying in the support $(-\theta, 0)$. Now, $L_n(\theta;\vec X)$ is a decreasing function of $\theta$ on the region where it is positive, so it attains its maximum at the smallest admissible value of $\theta$, namely $\theta = \max_i\{-X_i\}=-\min_i\{X_i\}=-X_{(1)}$.
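To make this concrete, here is a minimal numerical sketch (assuming an arbitrary sample size, true $\theta$, seed, and grid, chosen only for illustration): it evaluates $L_n(\theta;\vec x)$ on a grid of candidate values and confirms that the maximizer coincides with $-X_{(1)} = -\min_i X_i$, not with $X_{(n)}$.

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 3.0                             # assumed true parameter (illustration only)
x = rng.uniform(-theta_true, 0, size=50)     # i.i.d. sample from U(-theta, 0)

def likelihood(theta, x):
    """L_n(theta; x) = theta^(-n) if theta >= -x_i for all i, else 0."""
    theta = np.asarray(theta, dtype=float)
    ok = theta >= -x.min()                   # support condition: theta >= max(-x_i) = -x_(1)
    return np.where(ok, theta ** (-x.size), 0.0)

# Evaluate L_n on a grid of candidate theta values.
grid = np.linspace(0.1, 6.0, 10_000)
L = likelihood(grid, x)

mle_grid = grid[np.argmax(L)]                # numerical maximizer on the grid
mle_closed = -x.min()                        # closed-form MLE: -X_(1)
print(mle_grid, mle_closed)                  # agree up to the grid spacing
```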
