Maximum Likelihood Estimation – MLE for a Normal Distribution Where Mean Equals Variance

$\newcommand{\nd}{\frac{n}{2}}$For a sample of size $n$ from a Normal$(\mu=\theta,\sigma^2=\theta)$ distribution, how do we find the MLE of $\theta$?

I can find the roots of the score function,
$$
\theta=\frac{1\pm\sqrt{1-4\frac{s}{n}}}{2},\qquad s=\sum x_i^2,
$$
but I don't see which one is the maximum.
I tried substituting them into the second derivative of the log-likelihood, without success.

For the likelihood, with $x=(x_1,x_2,\ldots,x_n)$,
$$
f(x) = (2\pi)^{-n/2} \theta^{-n/2} \exp\left( -\frac{1}{2\theta}\sum(x_i-\theta)^2\right),
$$
then, with $s=\sum x_i^2$ and $t=\sum x_i$,
$$
\ln f(x) = -\nd \ln(2\pi) -\nd\ln\theta-\frac{s}{2\theta}-t+\nd\theta,
$$
so that
$$
\partial_\theta \ln f(x) = -\nd\frac{1}{\theta}+\frac{s}{2\theta^2}+\nd,
$$
and the roots are given by
$$
\theta^2-\theta+\frac{s}{n}=0.
$$
Also,
$$
\partial_{\theta,\theta} \ln f(x) = \nd \frac{1}{\theta^2} - \frac{s}{\theta^3}.
$$
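
As a quick sanity check of where the maximum actually sits, one can evaluate the log-likelihood of the Normal$(\theta,\theta)$ model directly (without the hand expansion) on simulated data and locate its maximizer numerically. This is only a sketch, assuming NumPy and SciPy are available; the true $\theta$, sample size, and grid below are arbitrary choices for illustration.

```python
# Numerical sanity check of the Normal(theta, theta) log-likelihood.
# theta0, the sample size, and the grid are arbitrary illustrative choices.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
theta0 = 2.0                       # value used only to simulate data
x = rng.normal(loc=theta0, scale=np.sqrt(theta0), size=200)

def loglik(theta, x):
    """Log-likelihood of Normal(mu=theta, sigma^2=theta), evaluated directly."""
    return norm.logpdf(x, loc=theta, scale=np.sqrt(theta)).sum()

# Evaluate on a grid of admissible (positive) values and report the maximizer.
grid = np.linspace(0.05, 10.0, 2000)
values = np.array([loglik(t, x) for t in grid])
print("numerical maximizer:", grid[values.argmax()])
```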

Best Answer

There are some typos (or algebraic mistakes) in the signs of the log-likelihood, with the corresponding unpleasant consequences for the score equation: expanding $-\frac{1}{2\theta}\sum(x_i-\theta)^2$ gives $-\frac{s}{2\theta}+t-\nd\theta$, not $-\frac{s}{2\theta}-t+\nd\theta$.

Since this is a well-known problem, I will only point out a reference with the solution:

Asymptotic Theory of Statistics and Probability, p. 53, by Anirban DasGupta.
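
Not as a substitute for the derivation in the reference, but as a quick numerical cross-check (a sketch, assuming NumPy and SciPy, with an arbitrary simulated sample): once the signs are fixed, the score equation becomes $\theta^2+\theta-\frac{s}{n}=0$, and its single positive root can be compared with a direct numerical maximization of the log-likelihood.

```python
# Cross-check: positive root of theta^2 + theta - s/n = 0 versus a direct
# numerical maximization of the Normal(theta, theta) log-likelihood.
# The simulated sample below is an arbitrary illustration.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

rng = np.random.default_rng(1)
x = rng.normal(loc=3.0, scale=np.sqrt(3.0), size=500)    # data from Normal(3, 3)

s_bar = np.mean(x**2)                                     # s/n in the question's notation
theta_closed = (-1.0 + np.sqrt(1.0 + 4.0 * s_bar)) / 2.0  # positive root of the quadratic

def neg_loglik(t):
    return -norm.logpdf(x, loc=t, scale=np.sqrt(t)).sum()

res = minimize_scalar(neg_loglik, bounds=(1e-6, 50.0), method="bounded")
print("closed-form root:", theta_closed)
print("numerical MLE:   ", res.x)                         # the two should agree closely
```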
