Statistics – Finding MLE of f(x; θ) = (θ + 1)x^θ

log likelihood, maximum likelihood, parameter estimation, statistics

Let $X_1, \cdots, X_n$ be a random sample from the PDF: $f(x;\theta) = (\theta + 1) x^{\theta}$ with $0<x<1$ and $\theta > -1$.

The likelihood function is:
\begin{align}
L(\theta) &= f(x_1, \cdots, x_n; \theta) \mathbb{1}\{0<x<1\} \mathbb{1}\{\theta >-1\}\\
&= \prod_{i=1}^{n}{f(x_i ; \theta)}\mathbb{1}\{0<x<1\} \mathbb{1}\{\theta >-1\}\\
&= \prod_{i=1}^{n}{(\theta+1)x_i^{\theta}}\mathbb{1}\{0<x<1\} \mathbb{1}\{\theta >-1\} \\
&=(\theta+1)^n \left( \prod_{i=1}^n{x_i}\right)^\theta
\end{align}

Since the log is monotonic, maximizing the log-likelihood $l(\theta)$ is equivalent to maximizing $L(\theta)$, and its derivative is easier to work with:

\begin{align}
l(\theta) = n\log(\theta+1) + \theta \left( \sum_{i=1}^n{\log(x_i)} \right)
\end{align}

$\implies \frac{d}{d\theta} l(\theta) = \frac{n}{\theta+1} + \sum_{i=1}^n{\log(x_i)} = 0$

$\implies \hat{\theta}_{MLE} = -\frac{n}{\sum_{i=1}^n{\log(x_i)}} - 1$
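As a numerical sanity check of the closed-form estimator (a sketch; the helper names `sample` and `theta_mle` are mine, not from the post): the CDF is $F(x) = x^{\theta+1}$ on $(0,1)$, so inverse-CDF sampling gives $X = U^{1/(\theta+1)}$ for $U \sim \mathrm{Unif}(0,1)$.

```python
import math
import random

def sample(theta, n, rng):
    # Inverse-CDF sampling: F(x) = x^(theta+1) on (0,1), so X = U^(1/(theta+1))
    return [rng.random() ** (1.0 / (theta + 1.0)) for _ in range(n)]

def theta_mle(xs):
    # Closed-form MLE derived above: -n / sum(log x_i) - 1
    return -len(xs) / sum(math.log(x) for x in xs) - 1.0

rng = random.Random(0)
xs = sample(theta=2.0, n=100_000, rng=rng)
print(theta_mle(xs))  # close to the true theta = 2.0
```

Note also that $l''(\theta) = -n/(\theta+1)^2 < 0$, so the critical point is indeed a maximum rather than a minimum or saddle.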

However, this doesn't look right.

Best Answer

Note that because each $x_i \in (0,1)$, then $\log x_i \in (-\infty, 0)$, hence $$-\frac{n}{\sum \log x_i} \in (0, \infty).$$ It follows that $\hat \theta_{MLE} \in (-1, \infty)$, as desired. There is no issue.
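As a quick empirical illustration of this range argument (a sketch; the sample size and seed are arbitrary choices of mine): for any points in $(0,1)$, each $\log x_i$ is strictly negative, so the formula always lands in $(-1, \infty)$.

```python
import math
import random

rng = random.Random(1)
mles = []
for _ in range(1000):
    xs = [rng.random() for _ in range(5)]  # arbitrary points in (0, 1)
    # sum(log x_i) < 0, so -n / sum(log x_i) > 0 and the MLE exceeds -1
    mles.append(-len(xs) / sum(math.log(x) for x in xs) - 1.0)

print(min(mles))  # always greater than -1
```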

Also note that your computation of the likelihood should be more precisely written as $$\mathcal L(\theta \mid \boldsymbol x) = \prod_{i=1}^n (\theta+1) x_i^\theta \mathbb 1 (0 < \color{red}{x_i} < 1) \mathbb 1 (\theta > -1).$$ This in turn is equivalent to $$\mathcal L (\theta \mid \boldsymbol x) = (\theta+1)^n \left(\prod_{i=1}^n x_i \right)^\theta \mathbb 1(x_{(1)} > 0) \mathbb 1(x_{(n)} < 1) \mathbb 1 (\theta > -1),$$ where $x_{(1)}$ and $x_{(n)}$ are the sample minimum and maximum.
