[Math] Is the maximum likelihood estimator an unbiased estimator

maximum-likelihood, statistics

Given is a random sample $X_1, .., X_n$ drawn from a distribution with the pdf $$ f(x; \theta) = \left\{
\begin{array}{ll}
\dfrac{1}{4} e^{-\dfrac{1}{4}(x-\theta)} & \theta < x \\
0 & \text{otherwise} \\
\end{array}
\right. $$

with an unknown parameter $\theta$ $(0 < \theta)$. Find the maximum likelihood estimator (MLE) and determine whether or not it is unbiased.

For the MLE I found $X_{1:n} = \min_i X_i$: the derivative of the log-likelihood with respect to $\theta$ is $\dfrac{n}{4}$, which is positive for $n > 0$, so the likelihood is increasing in $\theta$ and we should choose the largest admissible value of $\theta$. The restriction is $\theta < x_i$ for all $i$, so the largest we can choose is $X_{1:n}$.
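Spelling out the argument above (this display just expands the likelihood from the given pdf, writing $x_{1:n} = \min_i x_i$):

$$ L(\theta) = \prod_{i=1}^n \frac{1}{4} e^{-\frac{1}{4}(x_i - \theta)} = \left(\frac{1}{4}\right)^{n} e^{-\frac{1}{4}\sum_{i=1}^n (x_i - \theta)}, \qquad \theta < x_{1:n}, $$

so $\dfrac{d}{d\theta}\log L(\theta) = \dfrac{n}{4} > 0$ on the admissible region, and $L$ is maximized at the boundary $\hat\theta = X_{1:n}$.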

However, I'm not sure how to determine its unbiasedness. I suppose we want to check whether $E[X_{1:n}] = \theta$, but I don't know how to evaluate $E[X_{1:n}]$ so that it can be written in a form involving $\theta$. How should this be done?

Best Answer

It should be intuitively obvious that such an estimator is necessarily biased, because it can never be less than or equal to the true value of $\theta$. If it were, you would have observed $$\hat\theta_{\text{MLE}} = X_{1:n} = \min_i X_i \le \theta,$$ which is impossible since the density is zero for $x \le \theta$. So $\Pr[\hat\theta_{\text{MLE}} \le \theta] = 0$, and since the MLE exceeds $\theta$ with probability $1$, its expectation is strictly greater than $\theta$: the estimator is biased upward.
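To answer the expectation question directly, here is a standard derivation via the survival function (a supplement to the answer above, not part of it). For $t > \theta$, $\Pr[X_i > t] = e^{-\frac{1}{4}(t - \theta)}$, so by independence

$$ \Pr[X_{1:n} > t] = \prod_{i=1}^n \Pr[X_i > t] = e^{-\frac{n}{4}(t - \theta)}, \qquad t > \theta, $$

which means $X_{1:n} - \theta$ is exponential with rate $n/4$, and therefore

$$ \operatorname{E}[X_{1:n}] = \theta + \frac{4}{n} \ne \theta. $$

The bias $4/n$ vanishes as $n \to \infty$, so the MLE is asymptotically unbiased, and $X_{1:n} - 4/n$ is an exactly unbiased estimator. A quick Monte Carlo sketch confirms this; the values of $\theta$ and $n$ below are arbitrary choices for illustration:

```python
import random

# Monte Carlo check of the bias of the MLE theta_hat = min(X_i) for the
# shifted exponential f(x; theta) = (1/4) exp(-(x - theta)/4), x > theta.
# Theory: E[min(X_i)] = theta + 4/n, i.e. the bias is 4/n.

random.seed(0)
theta = 2.0    # true parameter (arbitrary choice for the simulation)
n = 5          # sample size (arbitrary choice)
reps = 200_000

total = 0.0
for _ in range(reps):
    # expovariate takes the rate, so rate 1/4 gives mean 4
    sample = [theta + random.expovariate(1 / 4) for _ in range(n)]
    total += min(sample)

mle_mean = total / reps
print(mle_mean)  # should be close to theta + 4/n = 2.8, not theta = 2.0
```

The simulated mean of the MLE sits near $\theta + 4/n$ rather than $\theta$, matching the derivation.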
