Maximum likelihood estimator doesn’t exist

Tags: log-likelihood, maximum-likelihood, probability, statistics

I was reading a paper, and in the last paragraph it says

For example, consider the density function $$p_{(\theta,\sigma)}(x)=\frac{1}{2\sqrt{2\pi}}e^{-\frac{1}{2}(x-\theta)^2} + \frac{1}{2\sqrt{2\pi}\,\sigma}e^{-\frac{1}{2\sigma^2}(x-\theta)^2}$$
of the sequence of independent and identically distributed chance variables $X_1,X_2, \dots$ Here $\theta \in \mathbb{R}$ and $\sigma > 0$. It is easy to see that the supremum of the likelihood function is almost always infinite, so no MLE exists […]

So the likelihood function would be $\prod_{i=1}^n p_{(\theta,\sigma)}(x_i)$ by the i.i.d. assumption, but I don't see why this wouldn't have a maximum. Even in the case where $\theta$ is fixed and $\sigma$ is close to $0$, the function seems to be close to $0$. Can I get some insight, please?

Best Answer

The likelihood function, viewed as a function of the two variables $\theta$ and $\sigma$, takes on arbitrarily large values when it is evaluated at $\theta=x_1$ with $\sigma$ very close to $0$. This is because of the division by $\sigma$ in the second term of your formula for the density: with $\theta=x_1$, the factor for $x_1$ equals $\frac{1}{2\sqrt{2\pi}}+\frac{1}{2\sqrt{2\pi}\,\sigma}$, which blows up like $1/\sigma$ as $\sigma\to 0$, while every other factor stays bounded away from $0$ because its first (standard normal) term does not depend on $\sigma$. So the product diverges rather than tending to $0$. Other values of $\theta$ close to any of the $x_i$, taken with $\sigma\ge\min_i|x_i-\theta|$ but still small, similarly give rise to arbitrarily large values of the likelihood.
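To make the mechanism concrete, here is a small numerical sketch (my own illustration, not from the paper or the answer above): it evaluates the log-likelihood of a simulated sample at $\theta = x_1$ for shrinking values of $\sigma$, and the values grow without bound, roughly like $-\log\sigma$.

```python
import numpy as np

# Mixture density p_{theta,sigma}(x) = 0.5*N(theta, 1) + 0.5*N(theta, sigma^2),
# matching the formula quoted from the paper.
def density(x, theta, sigma):
    z = x - theta
    return (np.exp(-0.5 * z**2) / (2 * np.sqrt(2 * np.pi))
            + np.exp(-0.5 * (z / sigma)**2) / (2 * np.sqrt(2 * np.pi) * sigma))

def log_likelihood(xs, theta, sigma):
    # Log of the product of densities for an i.i.d. sample.
    return np.sum(np.log(density(xs, theta, sigma)))

rng = np.random.default_rng(0)
xs = rng.normal(size=20)   # any sample will do for the demonstration
theta = xs[0]              # pin theta at the first observation

for sigma in [1.0, 0.1, 0.01, 1e-4, 1e-8]:
    print(f"sigma = {sigma:.0e}  log-likelihood = {log_likelihood(xs, theta, sigma):.2f}")
```

Only the factor for $x_1$ explodes; the remaining factors are kept strictly positive by the first, $\sigma$-free component, which is why the product does not collapse to $0$.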