[Math] Finding the MLE for $\theta$ given a probability density function $f(y \mid \theta)$.

probability, probability-distributions, probability-theory, statistics

Let $Y_1, Y_2, \ldots, Y_n$ denote a random sample from the probability density function
$$f(y \mid \theta)=\begin{cases} (\theta + 1)y^\theta, & 0 < y < 1;\ \theta > -1,\\ 0, & \text{elsewhere.}\end{cases}$$
Find the MLE for $\theta$.

My approach is as follows. First, the likelihood and log-likelihood are
$$L(\theta) = (\theta+1)^n \biggl(\prod_{i=1}^n y_i\biggr)^{\!\theta}$$
$$\ln L(\theta) = n \ln (\theta+1) + \theta \sum_{i=1}^n \ln y_i$$

Taking the derivative with respect to $\theta$, we get $$\frac{d \ln L(\theta)}{d \theta} = \frac{n}{\theta+1} + \sum_{i=1}^n \ln y_i$$

Setting it to zero, we get $$\frac{n}{\theta+1} = -\sum_{i=1}^n \ln y_i $$

Then $$\hat{\theta} = \frac{-n}{\sum_{i=1}^n \ln y_i} - 1$$
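As a quick numerical sanity check of this formula (a sketch with a hypothetical $\theta$ and $n$; since the CDF here is $F(y)=y^{\theta+1}$, we can draw samples by inverse-CDF sampling):

```python
import math
import random

random.seed(1)

theta_true = 1.5   # hypothetical parameter chosen for the check
n = 500

# Inverse-CDF sampling: F(y) = y^(theta+1) on (0,1), so Y = U^(1/(theta+1))
ys = [random.random() ** (1.0 / (theta_true + 1.0)) for _ in range(n)]

sum_log = sum(math.log(y) for y in ys)   # sum of log y_i (negative)
theta_hat = -n / sum_log - 1.0           # the MLE formula derived above

# The score d/dtheta ln L = n/(theta+1) + sum(log y_i) should vanish at theta_hat
score = n / (theta_hat + 1.0) + sum_log
print(theta_hat)   # should be close to theta_true = 1.5
print(score)       # should be ~ 0 (up to floating-point error)
```

The estimate lands near the parameter used to generate the data, and the score is zero at $\hat\theta$, as the derivation requires.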

However, the answer is $$\hat{\theta} = \frac{-n}{\sum_{i=1}^n \ln y_i}$$

I am wondering what part of my answer is wrong. Could someone please help me figure it out?

Best Answer

Your answer is correct; the provided answer is the MLE of $\theta + 1$, not of $\theta$. You can see this either through simulation or by direct calculation: if we let $$s = -\sum_{i=1}^n \log y_i > 0,$$ then $$\hat \theta = -1 + \frac{n}{s},$$ and $$\ell(\hat\theta \mid s) = n \log(\hat \theta + 1) - \hat\theta s = s + n \log \tfrac{n}{s} - n.$$ But $$\ell(\hat \theta + 1 \mid s) = n \log \left(\tfrac{n}{s} + 1\right) - n,$$ so their difference is $$d(s,n) = \ell(\hat\theta \mid s) - \ell(\hat\theta + 1 \mid s) = s + n \log \frac{n}{n+s}.$$ It is easy to show that this expression is always positive for $s > 0$ and $n = 1, 2, 3, \ldots$: its derivative with respect to $s$ is $1 - \frac{n}{n+s} = \frac{s}{n+s} > 0$, so $d$ is strictly increasing in $s$, and $d(0,n) = 0$. This proves that the estimator $n/s$ yields a strictly smaller log-likelihood than your correct MLE $\hat \theta = -1 + n/s$.
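The simulation route mentioned above can be sketched as follows (hypothetical $\theta$ and $n$; sampling again uses the inverse CDF $y = u^{1/(\theta+1)}$). It compares the log-likelihood at $\hat\theta = n/s - 1$ with the log-likelihood at $n/s$:

```python
import math
import random

random.seed(0)

theta_true = 2.0   # hypothetical parameter for the simulation
n = 1000

# Draw from f(y|theta) = (theta+1) y^theta via inverse CDF F(y) = y^(theta+1)
ys = [random.random() ** (1.0 / (theta_true + 1.0)) for _ in range(n)]

s = -sum(math.log(y) for y in ys)   # s = -sum(log y_i) > 0
theta_hat = n / s - 1.0             # the questioner's (correct) MLE

def loglik(theta):
    """Log-likelihood ell(theta | s) = n*log(theta+1) - theta*s."""
    return n * math.log(theta + 1.0) - theta * s

print(theta_hat)   # should be close to theta_true = 2
# theta_hat attains a strictly larger log-likelihood than n/s = theta_hat + 1:
print(loglik(theta_hat) > loglik(theta_hat + 1.0))
```

The comparison comes out in favour of $\hat\theta = n/s - 1$, matching the sign of $d(s,n)$ computed above.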
