Statistics – Showing That the MLE Doesn't Exist for $e^{\theta-x}$

exponential-distribution, maximum-likelihood, statistical-inference, statistics

There is a classic problem:

Suppose that $X_1,\ldots,X_n$ form an i.i.d. sample from a distribution with the following pdf:

$$f(x\mid\theta) =
\begin{cases}
e^{\theta-x}\quad&\text{for }\, x> \theta \\
0 &\text{otherwise}.
\end{cases}$$

I would like to show that the MLE of $\theta$ does not exist.

The argument I have is that the likelihood function is maximized by making $\theta$ as large as possible subject to the strict inequality $\theta < \min\{X_1, \ldots, X_n\}$. The value $\theta = \min\{X_1,\ldots , X_n\}$ itself cannot be used, so the supremum is approached but never attained and there is no MLE.
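Spelling the argument out, the likelihood of the sample, including the support constraint, is

$$L(\theta) = \prod_{i=1}^{n} e^{\theta - X_i}\,\mathbf{1}\{X_i > \theta\} = e^{\,n\theta - \sum_{i=1}^{n} X_i}\;\mathbf{1}\{\theta < \min\{X_1,\ldots,X_n\}\},$$

which is strictly increasing in $\theta$ on $(-\infty, \min\{X_1,\ldots,X_n\})$ and drops to $0$ at $\theta = \min\{X_1,\ldots,X_n\}$.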

However, I do not understand WHY we want $\theta$ to be as large as possible, i.e., pushed up toward $\min\{X_1,\ldots,X_n\}$.

Also, is there a way to show mathematically why this MLE doesn't exist?

I get that the log-likelihood function is:

$$\log L(\theta) = n\theta - (X_1+\ldots+X_n) \qquad \text{for } \theta < \min\{X_1,\ldots,X_n\},$$

but when I differentiate with respect to $\theta$ and set the derivative to $0$, I get $n = 0$. How does this fit with the fact that the MLE of $\theta$ doesn't exist? Thanks!
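To make the differentiation step explicit:

$$\frac{d}{d\theta}\log L(\theta) = \frac{d}{d\theta}\Big(n\theta - \sum_{i=1}^{n} X_i\Big) = n \qquad \text{for } \theta < \min\{X_1,\ldots,X_n\},$$

so the stationarity condition $\frac{d}{d\theta}\log L(\theta) = 0$ reads $n = 0$ and has no solution in $\theta$; the slope is the constant $n > 0$ over the whole allowed range.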

Best Answer

You have a continuous random variable, so $P(X_i = \theta) = 0$ and the support conditions $\theta < \min_i x_i$ and $\theta \le \min_i x_i$ describe the same model up to an event of probability zero. Treating them as equivalent, the likelihood is maximized at $\theta = \min\{x_1,\ldots,x_n\}$.
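As a quick numerical illustration of both readings (a minimal sketch in Python; the sample values and the helper function are my own additions, not part of the answer):

```python
import numpy as np

# Toy i.i.d. sample (made-up values, just for illustration)
x = np.array([2.3, 1.7, 3.1, 2.9, 1.9])
x_min = x.min()

def likelihood(theta, strict=True):
    """Likelihood of the sample under f(x|theta) = exp(theta - x),
    with support x > theta (strict=True) or x >= theta (strict=False)."""
    inside = np.all(x > theta) if strict else np.all(x >= theta)
    return float(np.exp(theta * len(x) - x.sum())) if inside else 0.0

# The likelihood increases as theta approaches min(x) from below ...
for theta in [x_min - 1.0, x_min - 0.1, x_min - 0.01, x_min - 0.001]:
    print(f"theta = {theta:.3f}  L = {likelihood(theta):.6g}")

# ... but with the strict inequality it drops to 0 at theta = min(x),
# so the supremum is never attained (no MLE).
print("strict support,  theta = min(x):", likelihood(x_min, strict=True))

# With x >= theta instead, theta = min(x) attains the maximum,
# which is the point of the answer above.
print("closed support,  theta = min(x):", likelihood(x_min, strict=False))
```

Running it shows the likelihood growing as $\theta$ approaches $\min_i x_i$ from below, hitting $0$ exactly at $\theta = \min_i x_i$ under the strict support, and attaining its maximum there under the closed support.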