Maximum likelihood estimator of Cauchy distribution


The following exercise is from Bickel and Doksum, volume one.

Let $g(x) = 1/[\pi (1+x^2)]$, $x \in \mathbb{R}$, be the Cauchy density, and let $X_1$ and $X_2$ be i.i.d. with density $g(x-\theta)$, $\theta \in \mathbb{R}$. Let $x_1$ and $x_2$ be the observations and set $\Delta = \frac{1}{2} (x_1 - x_2)$. Let $\hat \theta = \arg \max L_X(\theta)$ be "the" MLE. Show that if $|\Delta| \leq 1$, then the MLE exists and is unique. Give the MLE when $|\Delta| \leq 1$.

The likelihood function is

$$L(\theta) = \frac{1}{\pi^2 \prod_{i = 1}^{2} \left(1 + (x_i - \theta)^2\right)}$$
I took the log-likelihood and differentiated it with respect to $\theta$, which gave $\hat \theta = \overline{X}$. I do not understand, however, why we need $|\Delta| \leq 1$ for uniqueness of the maximum likelihood estimator, or how to establish that uniqueness.
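
As a quick numerical illustration (a minimal sketch using numpy; the two samples are arbitrary, chosen so that $|\Delta| < 1$ in the first case and $|\Delta| > 1$ in the second), one can count the local maxima of the log-likelihood on a grid:

```python
import numpy as np

def log_lik(theta, x1, x2):
    # Log-likelihood of theta (up to the constant -2*log(pi)) for two
    # i.i.d. Cauchy(theta, 1) observations x1, x2.
    return -np.log(1 + (x1 - theta) ** 2) - np.log(1 + (x2 - theta) ** 2)

for x1, x2 in [(0.0, 1.0), (0.0, 5.0)]:  # |Delta| = 0.5 and |Delta| = 2.5
    grid = np.linspace(-10.0, 15.0, 200001)
    ll = log_lik(grid, x1, x2)
    # A grid point is a local maximum if it beats both of its neighbours.
    peaks = grid[1:-1][(ll[1:-1] > ll[:-2]) & (ll[1:-1] > ll[2:])]
    print(f"x1={x1}, x2={x2}, |Delta|={abs(x1 - x2) / 2}: "
          f"local maxima near {np.round(peaks, 3)}")
```

The first sample reports a single peak at $\overline{X} = 0.5$; the second reports two peaks, neither of them at $\overline{X} = 2.5$.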

Best Answer

Check the derivative of the log-likelihood. It's true that the derivative $\ell'(\theta)$ equals zero at $\hat\theta := \frac{x_1+x_2}{2}$. Compute the second derivative to check whether $\hat\theta$ is in fact a local maximum.
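
Working that out (with $\Delta = \frac{1}{2}(x_1 - x_2)$ as defined above), one finds $$\ell''(\hat\theta) = -\frac{4(1-\Delta^2)}{(1+\Delta^2)^2},$$ which is negative, so that $\hat\theta$ is a local maximum, exactly when $|\Delta| < 1$; when $|\Delta| > 1$, the point $\hat\theta$ is in fact a local minimum of the likelihood.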

However, you'll find that setting the derivative of the log-likelihood to zero yields additional solutions for $\theta$ exactly when $|\Delta|>1$. In that case it's not obvious which of these solutions is in fact the maximizer.


ADDED: For $\ell'(\theta)$ I get a numerator proportional to $$[1+(\theta-x_1)(\theta-x_2)](\theta-x_1 + \theta-x_2).$$ The quantity in square brackets on the left is a quadratic in $\theta$. Under what conditions does setting this quadratic to zero yield a real solution? (Hint: check the discriminant $b^2-4ac$.)
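
Working out the hint: the bracketed factor is $\theta^2 - (x_1+x_2)\theta + (1 + x_1 x_2)$, whose discriminant is $$(x_1+x_2)^2 - 4(1+x_1 x_2) = (x_1-x_2)^2 - 4 = 4(\Delta^2 - 1),$$ so the quadratic has real roots precisely when $|\Delta| \geq 1$. For $|\Delta| < 1$ the only critical point is $\hat\theta = \frac{x_1+x_2}{2}$, which is therefore the unique MLE; at $|\Delta| = 1$ the quadratic's double root coincides with $\hat\theta$, so existence and uniqueness still hold.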
