Maximum Likelihood Estimation – N-th Order Statistic Significance

estimation, maximum-likelihood, self-study

Let $X_1, \dots, X_n$ be a sample of independent, identically distributed random variables with density

$$ f_{\theta}(x) = e^{\theta - x}, \qquad x \ge \theta, $$

and $f_\theta(x) = 0$ otherwise.

The question is: Determine the maximum likelihood estimator $\hat{\theta}_n$ of $\theta$.

I don't understand this question. What exactly does $\hat{\theta}_n$ mean? Wikipedia says something about the nth order statistic:

In statistics, the kth order statistic of a statistical sample is
equal to its kth-smallest value. Together with rank statistics, order
statistics are among the most fundamental tools in non-parametric
statistics and inference.
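For a concrete illustration of that definition (the numbers below are made up, not part of the problem), sorting a small sample makes the order statistics explicit:

```python
# Toy illustration of order statistics (arbitrary made-up sample).
sample = [4.2, 1.7, 3.1, 2.9]
ordered = sorted(sample)          # [1.7, 2.9, 3.1, 4.2]

# The k-th order statistic is the k-th smallest value (1-indexed).
first_order_stat = ordered[0]     # 1.7 == min(sample)
last_order_stat = ordered[-1]     # 4.2 == max(sample)
print(first_order_stat, last_order_stat)
```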

I tried:
$$ L(\theta) = \prod_{i=1}^n e^{\theta - x_i} = e^{\,n\theta - \sum_{i=1}^n x_i} $$

What's next?

Best Answer

Hints:

  • You have the constraint that the density is positive only for $x \ge \theta$, so the likelihood is nonzero only when $\theta \le x_i$ for every $i$, i.e. $\theta \le \min_i X_i$. Any maximizer must therefore satisfy $\hat{\theta}_n \le \min\{X_1, \dots, X_n\}$.

  • If you take the derivative of the likelihood (or, more simply, of the log-likelihood) with respect to $\theta$, you should find that it is always positive on the region where the likelihood is nonzero, so the likelihood is strictly increasing in $\theta$ there (a quick numerical check appears after these hints).

  • So?
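As a sanity check on the hints above, here is a minimal numerical sketch. The true $\theta$, the sample size, and the grid are arbitrary choices for the demo; it draws a sample from the shifted-exponential density given in the question, evaluates the log-likelihood on a grid, and shows that the maximizer sits at the sample minimum.

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true, n = 2.0, 50            # arbitrary choices for this demo

# f_theta(x) = exp(theta - x) for x >= theta is a standard
# exponential shifted to start at theta.
x = theta_true + rng.exponential(scale=1.0, size=n)

def log_likelihood(theta, x):
    # log L(theta) = n*theta - sum(x_i) when theta <= min(x_i);
    # otherwise the likelihood is 0, i.e. log-likelihood is -inf.
    if theta > x.min():
        return -np.inf
    return len(x) * theta - x.sum()

grid = np.linspace(theta_true - 1.0, x.min() + 0.5, 2001)
values = np.array([log_likelihood(t, x) for t in grid])

print("sample minimum:", x.min())
print("grid maximizer:", grid[values.argmax()])   # ~ the sample minimum
```

Because the log-likelihood is increasing in $\theta$ up to the cutoff and zero beyond it, the grid maximizer lands at (numerically, just below) $\min_i X_i$, which is exactly what the hints point toward.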
