Solved – Show that the value is, indeed, the MLE

estimation, maximum likelihood

Let $X_1, \dots, X_n$ be i.i.d. with pdf

$$f(x;\theta)=\frac{x+1}{\theta(\theta+1)}\exp(-x/\theta), x>0, \theta >0$$

We are asked to find the maximum likelihood estimator of $\theta$.

The likelihood function is given by

$$L(\theta;x)=[\theta(1+\theta)]^{-n}\exp\left(-\frac{\sum_i x_i}{\theta}\right)\prod_i (x_i+1)\,I_{(0,\infty)}(x_i)$$

The derivative of the log-likelihood is then

$$\frac{d\log L(\theta;x)}{d\theta}=-\frac{n(2\theta+1)}{\theta(1+\theta)} + \frac{\sum_i x_i}{\theta^2}$$
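For reference, setting this derivative to zero gives a quadratic in $\theta$ with a single positive root:

$$\frac{n(2\theta+1)}{\theta(1+\theta)}=\frac{\sum_i x_i}{\theta^2}
\;\Longleftrightarrow\;
2\theta^2+(1-\bar{x})\,\theta-\bar{x}=0
\;\Longrightarrow\;
\hat{\theta}=\frac{(\bar{x}-1)+\sqrt{(\bar{x}-1)^2+8\bar{x}}}{4},$$

where $\bar{x}=\frac{1}{n}\sum_i x_i$; the product of the two roots is $-\bar{x}/2<0$, so exactly one root is positive.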

This gives my candidate for the MLE. But when I compute the second derivative of the log-likelihood, I cannot conclude that it is negative, so I cannot confirm that the candidate is indeed a maximum. What should I do in this case?

Best Answer

You actually don't have to show that it is a maximum in this case. The root of the first derivative of the log-likelihood, i.e. the MLE, can be shown to be unique when the i.i.d. observations come from a distribution in the exponential family, that is, a random variable whose density has the form

$f(x;\theta) = h(x)\,\exp\!\big(s(x)\,\theta - K(\theta)\big)$

where $h$ is a function only of the observation $x$, $\theta$ is called the natural parameter, $s(x)$ is called the natural statistic, and $K$ is a function only of the natural parameter.

In this case, given $X_1,\dots,X_n$ i.i.d. with the density you have, the joint density is

$\prod_{i=1}^n \frac{(x_i+1)}{\theta\,(1+\theta)} \exp(-x_i/\theta) = \prod_i (x_i+1)\,\exp\!\Big(-\frac{n\overline{X}}{\theta} - n\log\big(\theta(1+\theta)\big)\Big)$
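Matching this joint density to the form above, with the natural parameter written as $\eta$ to avoid a clash with the original $\theta$, one possible identification of the pieces is

$$h(\mathbf{x})=\prod_i (x_i+1),\qquad s(\mathbf{x})=\sum_i x_i=n\overline{X},\qquad \eta=-\frac{1}{\theta},\qquad K(\eta)=n\log\big(\theta(1+\theta)\big)\ \text{with}\ \theta=-1/\eta.$$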

Since this density belongs to the exponential family, the MLE is unique (uniqueness in the natural parameter carries over to $\theta$ because $\theta \mapsto -1/\theta$ is one-to-one).

In general, for regular exponential families the MLE, when it exists, is unique, consistent and asymptotically normal.
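As a quick numerical sanity check (not part of the argument above; it assumes NumPy and SciPy are available), the sketch below simulates from this density, maximizes the log-likelihood numerically, and compares the result with the single positive root of the likelihood equation:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

# f(x; theta) = (x + 1) exp(-x / theta) / (theta * (theta + 1)), x > 0, is a
# mixture of Gamma(shape=2, scale=theta) (weight theta/(1+theta)) and
# Exponential(scale=theta) (weight 1/(1+theta)), which makes simulation easy.
theta_true, n = 2.0, 10_000
use_gamma = rng.random(n) < theta_true / (1 + theta_true)
x = np.where(use_gamma,
             rng.gamma(shape=2.0, scale=theta_true, size=n),
             rng.exponential(scale=theta_true, size=n))

def neg_loglik(theta):
    # negative log-likelihood of the whole sample
    return -(np.log(x + 1).sum() - n * np.log(theta * (1 + theta)) - x.sum() / theta)

# numerical maximizer of the log-likelihood
theta_numeric = minimize_scalar(neg_loglik, bounds=(1e-6, 100.0), method="bounded").x

# single positive root of the likelihood equation 2*t^2 + (1 - xbar)*t - xbar = 0
xbar = x.mean()
theta_closed = ((xbar - 1) + np.sqrt((xbar - 1) ** 2 + 8 * xbar)) / 4

print(theta_numeric, theta_closed)  # both should be close to theta_true
```

Agreement between the two estimates, and their closeness to the true value for large $n$, is exactly what the uniqueness and consistency claims predict.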

EDIT: http://www.stat.purdue.edu/~dasgupta/ml.pdf is a good explanation of this, perhaps a bit too mathematical depending on your background. Otherwise, V. S. Huzurbazar, in "The likelihood equation, consistency and the maxima of the likelihood function" (1947), explains this theory in a more accessible way.
