Solved – Is unbiased maximum likelihood estimator always the best unbiased estimator

mathematical-statistics · maximum-likelihood · unbiased-estimator

I know that for regular problems, if a best regular unbiased estimator exists, it must be the maximum likelihood estimator (MLE). But generally, if we have an unbiased MLE, is it also the best unbiased estimator (or should I call it the UMVUE, i.e., the unbiased estimator with the smallest variance)?

Best Answer

But generally, if we have an unbiased MLE, would it also be the best unbiased estimator?

If a complete sufficient statistic exists, yes.

Proof:

  • Lehmann–Scheffé theorem: any unbiased estimator that is a function of a complete sufficient statistic is the best (the UMVUE).
  • The MLE is a function of every sufficient statistic. See Section 4.2.3 here.

Thus an unbiased MLE is necessarily the best as long as a complete sufficient statistic exists.

In practice, however, this result almost never applies, because a complete sufficient statistic almost never exists: complete sufficient statistics exist (essentially) only in exponential families, where the MLE is most often biased (the location parameter of a Gaussian being a notable exception).
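To illustrate the Gaussian exception mentioned above: for a $N(\theta,1)$ model the MLE is the sample mean, which is unbiased and a function of the complete sufficient statistic $\sum_i X_i$, hence the UMVUE by Lehmann–Scheffé. A quick Monte Carlo sketch (my own illustration, not from the answer) compares its variance with that of another unbiased estimator, the sample median:

```python
import numpy as np

# For N(theta, 1), the MLE (sample mean) is unbiased and a function of the
# complete sufficient statistic sum(X_i), hence UMVUE by Lehmann-Scheffe.
# The sample median is also unbiased (by symmetry) but has larger variance:
# roughly pi/(2n) versus 1/n for the mean.
rng = np.random.default_rng(0)
theta, n, reps = 0.0, 25, 4000

samples = rng.normal(theta, 1.0, size=(reps, n))
mle = samples.mean(axis=1)        # MLE of the location parameter
med = np.median(samples, axis=1)  # competing unbiased estimator

print(f"var(mean)   = {mle.var():.4f}  (theory: 1/n = {1/n:.4f})")
print(f"var(median) = {med.var():.4f}  (theory ~ pi/(2n) = {np.pi/(2*n):.4f})")
```

The simulated variance of the mean sits near the Cramér–Rao bound $1/n$, strictly below that of the median, consistent with the mean being UMVUE here.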

So the real answer is actually no.

A general counterexample can be given: any location family with likelihood $p_\theta(x)=p(x-\theta)$, where $p$ is symmetric around 0 ($\forall t\in\mathbb{R} \quad p(-t)=p(t)$). With sample size $n$, the following holds:

  • the MLE is unbiased;
  • it is dominated by another unbiased estimator, known as Pitman's equivariant estimator.

Most often the domination is strict, so the MLE is not even admissible. This was proven for the Cauchy case, but I suspect it is a general fact. Hence the MLE cannot be the UMVUE. In fact, for these families it is known that, under mild conditions, no UMVUE exists at all. This example was studied in this question, with references and a few proofs.
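The Cauchy case can be sketched numerically. The grid-based implementations below (a grid maximizer for the MLE and a flat-prior posterior mean for Pitman's estimator) are my own illustrative approximations, not code from the linked references:

```python
import numpy as np

# Cauchy location family: p(x - theta) with p(t) = 1/(pi*(1 + t^2)).
# Pitman's equivariant estimator is the posterior mean under a flat prior,
#   theta_hat = Int theta * L(theta) dtheta / Int L(theta) dtheta,
# approximated on a uniform grid; the MLE is taken as the grid maximizer
# of the likelihood L.
rng = np.random.default_rng(1)
theta0, n, reps = 0.0, 5, 1000

def estimators(x, half_width=40.0, m=2001):
    grid = np.linspace(np.median(x) - half_width, np.median(x) + half_width, m)
    # log-likelihood on the grid (Cauchy density up to additive constants)
    loglik = -np.log1p((x[:, None] - grid) ** 2).sum(axis=0)
    w = np.exp(loglik - loglik.max())          # rescaled for stability
    pitman = (grid * w).sum() / w.sum()        # flat-prior posterior mean
    mle = grid[np.argmax(loglik)]
    return mle, pitman

est = np.array([estimators(rng.standard_cauchy(n) + theta0) for _ in range(reps)])
mle_hat, pitman_hat = est[:, 0], est[:, 1]
print("empirical bias:", mle_hat.mean(), pitman_hat.mean())
print("empirical MSE :", (mle_hat ** 2).mean(), (pitman_hat ** 2).mean())
```

Both estimators are (by symmetry) unbiased, and the Pitman estimator, being the minimum-risk equivariant estimator, has quadratic risk no larger than the MLE's; the Monte Carlo MSEs reflect this.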
