[Math] Proving MLE for normal distribution

multivariable-calculus, normal-distribution, statistics

I need to prove that the maximum likelihood estimates of both parameters of the normal distribution indeed maximise the likelihood function.

So, the log-likelihood function for parameters $\sigma$ and $m$ is
$$ \ln L = -\frac{n}{2}\ln(2\pi) - n \ln \sigma - \sum_{i=1}^n\dfrac{1}{2\sigma^2}(x_i - m)^2 $$
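
For context, this comes from taking the logarithm of the likelihood, which is a product of normal densities:
$$ L(m,\sigma) = \prod_{i=1}^n \frac{1}{\sigma\sqrt{2\pi}}\exp\!\left(-\frac{(x_i-m)^2}{2\sigma^2}\right), $$
and expanding term by term gives the expression above.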

After differentiating we get two equations
$$ \dfrac{\partial \ln L}{\partial \sigma} = -\dfrac{n}{\sigma} + \sum_{i=1}^n\dfrac{1}{\sigma^3}(x_i - m)^2 = 0$$
$$ \dfrac{\partial \ln L}{\partial m} = \sum_{i=1}^n\dfrac{1}{\sigma^2}(x_i - m) = 0 $$
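
Solving these: the second equation gives $\sum_{i=1}^n (x_i - m) = 0$, i.e. $nm = \sum_{i=1}^n x_i$, and multiplying the first equation by $\sigma^3/n$ gives
$$ \sigma^2 = \frac{1}{n}\sum_{i=1}^n (x_i - m)^2. $$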

And now we get the estimators:
$$ m = \dfrac{\sum_{i=1}^n x_i}{n} $$
$$ \sigma = \sqrt{\dfrac{\sum_{i=1}^n\left(x_i - \dfrac{\sum_{i=1}^n x_i}{n}\right)^2}{n}}$$
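
As a quick sanity check (a minimal sketch; NumPy/SciPy, the simulated data, and the starting point are all illustrative assumptions), numerically maximizing the log-likelihood recovers the same closed-form estimators:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=2.0, size=1000)  # illustrative sample
n = len(x)

def neg_log_lik(params):
    # Optimize over log(sigma) so that sigma stays positive.
    m, log_sigma = params
    sigma = np.exp(log_sigma)
    return n / 2 * np.log(2 * np.pi) + n * log_sigma + np.sum((x - m) ** 2) / (2 * sigma**2)

res = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
m_hat, sigma_hat = res.x[0], np.exp(res.x[1])

print(m_hat, x.mean())                                   # numerical vs closed-form m
print(sigma_hat, np.sqrt(((x - x.mean()) ** 2).mean()))  # numerical vs closed-form sigma
```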

Now I need to prove that this is a local maximum. To do this I need to compute the second-order derivatives and check that the Hessian matrix is negative definite.

The derivatives are
$$ A = \dfrac{\partial^2 \ln L}{\partial \sigma^2} = \dfrac{n}{\sigma^2} - \sum_{i=1}^n\dfrac{3}{\sigma^4}(x_i - m)^2 = \dfrac{-2n^2}{\sum_{i=1}^n\left(x_i - \dfrac{\sum_{i=1}^n x_i}{n}\right)^2} $$
$$ C = \dfrac{\partial^2 \ln L}{\partial m^2} = -\dfrac{n}{\sigma^2} = -\dfrac{n^2}{\sum_{i=1}^n\left(x_i - \dfrac{\sum_{i=1}^n x_i}{n}\right)^2} $$
$$ B^2 = \left(\dfrac{\partial^2 \ln L}{\partial m \partial \sigma}\right)^2 = \left(\dfrac{-2\sum_{i=1}^n (x_i-m)}{\sigma^3}\right)^2 = \dfrac{4n^3\left[\sum_{i=1}^n\left(x_i - \dfrac{\sum_{i=1}^n x_i}{n}\right)\right]^2}{\left[\sum_{i=1}^n\left(x_i - \dfrac{\sum_{i=1}^n x_i}{n}\right)^2\right]^3} $$
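
To see where the right-most expressions come from: at the critical point the first-order condition for $\sigma$ gives $\sum_{i=1}^n (x_i - m)^2 = n\sigma^2$, so for instance
$$ A = \frac{n}{\sigma^2} - \frac{3n\sigma^2}{\sigma^4} = -\frac{2n}{\sigma^2} = \frac{-2n^2}{\sum_{i=1}^n\left(x_i - \frac{\sum_{i=1}^n x_i}{n}\right)^2}. $$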

And that's where I get lost. I need to show that $AC - B^2 > 0$, but the expression doesn't look obviously positive to me.

Best Answer

Set $\bar{x} = \frac{\sum_{i=1}^{n}x_i}{n}$. Then the numerator of $B$ is $\sum_{i=1}^n (x_i - \bar{x}) = \left(\sum_{i=1}^n x_i\right) - n\bar{x} = n\bar{x} - n\bar{x} = 0$. Thus $B = 0$, and since $A < 0$ and $C < 0$, we get $AC - B^2 = AC > 0$. Together with $A < 0$, this makes the Hessian negative definite, so the critical point is indeed a local maximum.
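
For a concrete numerical check (a minimal sketch with simulated data; the sample size and seed are arbitrary), evaluating $A$, $B$, and $C$ at the estimates confirms the argument:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=500)   # illustrative sample
n = len(x)

m = x.mean()                              # MLE of m
sigma = np.sqrt(((x - m) ** 2).mean())    # MLE of sigma

A = n / sigma**2 - 3 * np.sum((x - m) ** 2) / sigma**4  # d^2 lnL / d sigma^2
B = -2 * np.sum(x - m) / sigma**3                       # d^2 lnL / dm dsigma
C = -n / sigma**2                                       # d^2 lnL / dm^2

print(B)             # ~0 up to floating-point error
print(A * C - B**2)  # positive, and A < 0, so the Hessian is negative definite
```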
