[Math] Is it correct to say that the second derivative at the maximum likelihood estimator can never be positive

fixed-point-theorems, maxima-minima, maximum-likelihood, statistics

The maximum likelihood estimator is found as a critical point of the likelihood function that is also a local maximum. I understand that if the function is twice differentiable, you can examine the second derivative at that point, and if it is negative then the point is indeed a local maximum.

I was wondering whether there are any cases that contradict this. I understand that in some cases the function isn't differentiable, or in the multivariate case you need to compute the Hessian matrix to rule out a saddle point. Here I am asking only about the simple single-variable case. So is it okay to say:

If $\frac{d^2L(\hat{\theta})}{d\theta^2} > 0$ then $\hat{\theta}$ CANNOT be the MLE of $L(\theta)$?

Best Answer

Consider an i.i.d. sample $X_1,\ldots,X_n$ from the uniform distribution on the interval $[0,\theta].$ The likelihood given $X_i=x_i$ for $i=1,\ldots,n$ is $$ L(\theta) = \begin{cases} \theta^{-n} & \text{if } \theta \ge \max\{x_1,\ldots,x_n\}, \\ 0 & \text{if } 0\le\theta < \max\{x_1,\ldots,x_n\}. \end{cases} $$ As a function of $\theta$ this is decreasing on the interval $[\max\{x_1,\ldots ,x_n\},\infty).$ It therefore has a global maximum at the left endpoint of that interval, where the (one-sided) first derivative $L\,'$ is negative and the second derivative is positive.
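As a quick numerical sanity check, here is a sketch of this example in Python (the sample, seed, and names `likelihood`, `theta_hat` are illustrative, not part of the answer):

```python
import numpy as np

# Illustrative i.i.d. sample from Uniform[0, 5.0].
rng = np.random.default_rng(0)
x = rng.uniform(0, 5.0, size=10)
n = len(x)
m = x.max()  # max{x_1, ..., x_n}

def likelihood(theta):
    """L(theta) = theta^{-n} for theta >= max(x_i), else 0."""
    theta = np.asarray(theta, dtype=float)
    return np.where(theta >= m, theta**(-n), 0.0)

# L is decreasing on [m, inf), so the MLE is the left endpoint.
theta_hat = m

# On (m, inf): L'(theta) = -n * theta^{-(n+1)} < 0
#              L''(theta) = n*(n+1) * theta^{-(n+2)} > 0
# so at the endpoint theta_hat the (one-sided) first derivative is
# negative and the second derivative is positive, yet theta_hat is
# the global maximum of the likelihood.
Lp = -n * theta_hat**(-(n + 1))
Lpp = n * (n + 1) * theta_hat**(-(n + 2))
assert Lp < 0 and Lpp > 0
assert likelihood(theta_hat) > likelihood(theta_hat + 0.1)
```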

An MLE is a global maximum. The second derivative test gives at most a local maximum, and with endpoints the MLE can occur where the first derivative is not $0.$

However, suppose the parameter space contains an open interval and at some point $\theta_0$ in that open interval you have $L\,'(\theta_0)=0$ and $L''(\theta_0) > 0.$ Then there is a local minimum of $L$ at $\theta_0,$ so $\theta_0$ cannot be the MLE.
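That second-derivative statement can be checked symbolically. The following sketch uses a toy smooth function standing in for a likelihood (purely illustrative, not a normalized likelihood) with $L'(\theta_0)=0$ and $L''(\theta_0)>0$:

```python
import sympy as sp

theta = sp.symbols('theta', positive=True)
# Toy stand-in for a likelihood with an interior critical point.
L = (theta - 2)**2 + 1

Lp = sp.diff(L, theta)        # first derivative
Lpp = sp.diff(L, theta, 2)    # second derivative

theta0 = sp.solve(Lp, theta)[0]           # the critical point
assert Lp.subs(theta, theta0) == 0        # L'(theta0) = 0
assert Lpp.subs(theta, theta0) > 0        # L''(theta0) > 0
# By the second derivative test, theta0 is a local minimum of L,
# so it cannot be a (local, let alone global) maximum.
assert L.subs(theta, theta0) < L.subs(theta, theta0 + sp.Rational(1, 10))
```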
