Computing the maximum likelihood estimator

maximum-likelihood, statistical-inference

Problem: Consider a random sample of size $n$ drawn from a probability density function given by:

$$f(x,\theta)=\frac{1}{\theta} x^{-\frac{\theta+1}{\theta}}\mathbb{1}_{(1,+\infty)},\:\:\theta>0$$

where $\theta$ is unknown.

1) Determine the maximum likelihood estimator of $\theta$. Is this estimator consistent?

**2)** Provide a sufficient statistic for this model.

1)

$L=\prod\limits_{i=1}^{n}f(x,\theta)=\frac{1}{{\theta}^n} x^{-\frac{{(\theta+1)}^n}{{\theta}^n}}\mathbb{1}_{(1,+\infty)}$

In order to find $\theta$ I have to compute:

$\frac{dL}{d\theta}=0\implies\frac{d}{d\theta}\left[\frac{1}{{\theta}^n} x^{-\frac{{(\theta+1)}^n}{{\theta}^n}}\mathbb{1}_{(1,+\infty)}\right]=0$

However, this derivative is very difficult to calculate.

For the second question I was thinking about the maximum likelihood estimator as a possible sufficient statistic.

Question:

Is there another way to find the maximum likelihood estimator besides taking this derivative?

Thanks in advance!

Best Answer

Your definition of the likelihood is incorrect: it assumes every data point has the same value $x$, whereas each observation can (and does) have a different value. Replace $x$ by $x_i$ and then take the derivative with respect to $\theta$.

$$L = \frac{1}{\theta^n} \left[ x_1^{-(\theta+1)/\theta} x_2^{-(\theta+1)/\theta} \cdots x_n^{-(\theta+1)/\theta} \right]$$

It is simplest, though, to work with the log-likelihood...

$$l \equiv \ln L = - n \ln \theta - \frac{\theta+1}{\theta} \sum\limits_{i=1}^n \ln x_i$$

Now compute $$\frac{d l}{d \theta} = \frac{-n}{\theta} + \frac{1}{\theta^2} \sum\limits_{i=1}^n \ln x_i$$

and set it to zero to find:

$$\hat{\theta} = \frac{1}{n} \sum\limits_{i=1}^n \ln x_i$$
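As a quick numerical sanity check (a minimal sketch in Python with NumPy, not part of the original answer): from the density, $P(X > x) = x^{-1/\theta}$ for $x > 1$, so inverse-transform sampling gives $X = U^{-\theta}$ with $U \sim \text{Uniform}(0,1)$, and $\ln X$ is exponential with mean $\theta$. The sample mean of the logs should therefore converge to $\theta$ as $n$ grows, which also illustrates the consistency asked about in part 1.

```python
import numpy as np

def sample_x(n, theta, rng):
    # Inverse-transform sampling: P(X > x) = x^(-1/theta) for x > 1,
    # so X = U^(-theta) with U ~ Uniform(0, 1).
    u = rng.uniform(size=n)
    return u ** (-theta)

def theta_mle(x):
    # The MLE derived above: the sample mean of the logs.
    return np.mean(np.log(x))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    theta = 2.0
    for n in (100, 10_000, 1_000_000):
        x = sample_x(n, theta, rng)
        # Estimates should get closer to theta = 2 as n grows (consistency).
        print(n, theta_mle(x))
```

Since $\ln X_i \sim \text{Exp}(\text{rate } 1/\theta)$ has mean $\theta$ and finite variance $\theta^2$, the law of large numbers gives consistency directly, and the simulation reflects that.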
