Solved – Find the Maximum Likelihood Estimator given two pdfs

cauchy-distribution, maximum-likelihood, normal-distribution, self-study

From the book Introduction to Mathematical Statistics by Hogg, McKean and Craig (# 6.1.12):

Let $X_1,X_2,\cdots,X_n$ be a random sample from a distribution with one of two pdfs.

If $\theta=1$, then $f(x;\theta=1)=\frac{1}{\sqrt{2\pi}}e^{-x^2/2},\,-\infty<x<\infty$.

If $\theta=2$, then $f(x;\theta=2)=\frac{1}{\pi(1+x^2)},\,-\infty<x<\infty$. Find the mle of $\theta$.

My attempt: differentiate the first $f$ with respect to $x$ and set the derivative to zero. That gives $x_{1}=0$. Substituting into the first $f$, we get $\frac{1}{\sqrt{2 \pi}}$.
Working similarly with the second $f$, we get the value $\frac{1}{\pi}$.
The former is greater, so the final answer is $\theta = 1$.

Is that right? If not, what would be the right procedure?

Best Answer

In this problem your unknown parameter $\theta$ only has two possible values, so you have a discrete optimisation where you just have to compare the likelihood at those two parameter values. (If you are taking derivatives of something in a discrete optimisation then you are going down the wrong track.) For an observed data vector $\mathbf{x}$ you have:

$$L_\mathbf{x}(\theta) = \begin{cases} (2 \pi)^{-n/2} \exp(-\tfrac{1}{2} \sum x_i^2) & & \text{for } \theta = 1, \\[6pt] \pi^{-n} / \prod (1+x_i^2) & & \text{for } \theta = 2. \\ \end{cases}$$
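These two likelihoods are straightforward to evaluate numerically; in practice one works on the log scale to avoid underflow for large $n$. A minimal sketch (the function name `loglik` is my own, not from the book):

```python
import math

def loglik(x, theta):
    """Log-likelihood of the sample x under theta = 1 (standard normal)
    or theta = 2 (standard Cauchy)."""
    n = len(x)
    if theta == 1:
        # log L(1) = -(n/2) log(2*pi) - (1/2) sum x_i^2
        return -0.5 * n * math.log(2 * math.pi) - 0.5 * sum(xi**2 for xi in x)
    elif theta == 2:
        # log L(2) = -n log(pi) - sum log(1 + x_i^2)
        return -n * math.log(math.pi) - sum(math.log(1 + xi**2) for xi in x)
    raise ValueError("theta must be 1 or 2")
```

For example, a sample concentrated near zero favours the normal model (`loglik(x, 1) > loglik(x, 2)`), while a sample with extreme values favours the heavy-tailed Cauchy model.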

Since there are only two possible parameter values, you can find the maximising parameter value by looking at the sign of the difference in likelihood at these values. You have:

$$\begin{equation} \begin{aligned} \Delta(\mathbf{x}) \equiv \text{sgn}(L_\mathbf{x}(1)-L_\mathbf{x}(2)) &= \text{sgn}\Bigg( (2 \pi)^{-n/2} \exp(-\tfrac{1}{2} \sum x_i^2) - \frac{1}{\pi^{n} \prod (1+x_i^2)} \Bigg) \\[6pt] &= \text{sgn}\Bigg( \exp(-\tfrac{1}{2} \sum x_i^2)\prod (1+x_i^2) - \Bigg( \frac{2}{\pi} \Bigg)^{n/2} \Bigg). \\[6pt] \end{aligned} \end{equation}$$

The maximum likelihood estimator (MLE) is:

$$\hat{\theta} = \begin{cases} 2 & & \text{if } \Delta(\mathbf{x}) = -1, \\[6pt] \{ 1,2 \} & & \text{if } \Delta(\mathbf{x}) = 0, \\[6pt] 1 & & \text{if } \Delta(\mathbf{x}) = 1. \\[6pt] \end{cases}$$

(In the case where $\Delta(\mathbf{x}) = 0$ the MLE is non-unique since the likelihood is the same at both parameter values.)
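The decision rule above can be sketched directly from the simplified sign expression, comparing $\exp(-\tfrac{1}{2} \sum x_i^2)\prod (1+x_i^2)$ against $(2/\pi)^{n/2}$; the function name `theta_hat` is my own choice, and the function returns a set to reflect the non-unique case:

```python
import math

def theta_hat(x):
    """MLE of theta over {1, 2}. Compares
    exp(-sum(x_i^2)/2) * prod(1 + x_i^2) with (2/pi)^(n/2),
    whose difference has the same sign as L(1) - L(2)."""
    n = len(x)
    lhs = math.exp(-0.5 * sum(xi**2 for xi in x)) * math.prod(1 + xi**2 for xi in x)
    rhs = (2 / math.pi) ** (n / 2)
    if lhs > rhs:
        return {1}       # Delta(x) = 1: normal model wins
    if lhs < rhs:
        return {2}       # Delta(x) = -1: Cauchy model wins
    return {1, 2}        # Delta(x) = 0: likelihoods tie, MLE non-unique
```

For a single observation $x_1 = 0$ the left side is $1 > \sqrt{2/\pi}$, so $\hat\theta = 1$; for a single large observation such as $x_1 = 10$ the exponential factor collapses and $\hat\theta = 2$.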