Maximum likelihood of log-normal distribution

Tags: log-likelihood, maximum likelihood, probability, probability distributions

I am trying to find the maximum likelihood estimators for both parameters $\mu$ and $\sigma^2$ of the log-normal distribution. I've computed the derivative of the log-likelihood with respect to $\mu$ to be

$$\frac{\partial l}{\partial\mu}=\frac{\sum^n_{i=1}\ln x_i}{\sigma^2}-\frac{n\mu}{\sigma^2}$$
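As a sanity check on that derivative, here is a small symbolic computation (a sketch assuming SymPy is available; the three-observation sample is hypothetical, just for illustration):

```python
import sympy as sp

mu, sigma = sp.symbols("mu sigma", positive=True)
xs = sp.symbols("x1:4", positive=True)  # three hypothetical observations x1, x2, x3

# Log-normal log-density of one observation:
# -ln(x) - ln(sigma) - ln(2*pi)/2 - (ln(x) - mu)^2 / (2*sigma^2)
loglik = sum(
    -sp.log(x) - sp.log(sigma) - sp.log(2 * sp.pi) / 2
    - (sp.log(x) - mu) ** 2 / (2 * sigma ** 2)
    for x in xs
)

# Differentiate the log-likelihood with respect to mu; this should
# simplify to (log(x1) + log(x2) + log(x3) - 3*mu) / sigma**2
print(sp.simplify(sp.diff(loglik, mu)))
```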

and found the maximum likelihood estimator by setting the derivative to zero, which gives

$$\frac{\sum^n_{i=1}\ln x_i}{\sigma^2}-\frac{n\mu}{\sigma^2} = 0 \iff \hat\mu=\frac{\sum^n_{i=1}\ln x_i}{n}.$$
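A quick numerical check of this closed form (a sketch assuming NumPy and SciPy; the parameter values and sample size are arbitrary): simulate log-normal data and compare $\hat\mu$ to the value found by numerically maximizing the log-likelihood.

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)
mu_true, sigma_true, n = 1.5, 0.7, 10_000   # arbitrary illustration values
x = rng.lognormal(mean=mu_true, sigma=sigma_true, size=n)

# Closed-form MLE obtained by setting the derivative to zero
mu_hat = np.log(x).mean()

# Numerical MLE for mu (sigma held fixed at its true value for simplicity);
# SciPy parameterizes lognorm with s = sigma and scale = exp(mu)
def neg_loglik(mu):
    return -stats.lognorm.logpdf(x, s=sigma_true, scale=np.exp(mu)).sum()

res = optimize.minimize_scalar(neg_loglik, bounds=(0.0, 3.0), method="bounded")
print(mu_hat, res.x)   # the two estimates agree to several decimal places
```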

However, the teaching assistant of the course told me that I should explain why the maximum likelihood estimator is found where the derivative is zero, and that I should instead have set the derivative greater than or equal to zero:

$$\frac{\sum^n_{i=1}\ln x_i}{\sigma^2}-\frac{n\mu}{\sigma^2} \ge 0.$$

I thought I could figure this out by myself, but to be honest, I don't understand why I should have used the inequality instead.

Best Answer

However, the teaching assistant of the course told me that I should explain why the maximum likelihood estimator is found where the derivative is zero

Because when you are looking for the maximum of a function, one possible procedure is to find a stationary point (where the first derivative is zero) and then test the second derivative: if it is negative, the point is an argmax, i.e. a point whose image is a maximum.
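For the derivative in the question this check is immediate. Differentiating once more with respect to $\mu$ gives

$$\frac{\partial^2 l}{\partial\mu^2}=-\frac{n}{\sigma^2}<0,$$

so the stationary point $\hat\mu$ is indeed a maximum.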

Remember that it is not always true that the MLE is found where the derivative is zero. As a simple example, find the MLE for $\theta$ from a random sample of size $n$ drawn from a uniform population $U(0,\theta)$.
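To sketch the standard argument for why that example breaks the stationary-point recipe: the likelihood there is

$$L(\theta)=\prod_{i=1}^n\frac{1}{\theta}\,\mathbf 1\{0\le x_i\le\theta\}=\theta^{-n}\,\mathbf 1\Big\{\theta\ge\max_i x_i\Big\},$$

which is strictly decreasing in $\theta$ wherever it is positive, so the maximum is attained at the boundary $\hat\theta=\max_i x_i$, a point where the derivative is not zero.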
