Solved – How to find MLE when samples depend on the estimated parameter

estimation, self-study

Can you show me what I'm doing wrong here? This is the homework problem:

Consider a random sample $Y_1, \ldots , Y_n$ from the pdf $f_Y(y;\theta) = 2y\theta^2$ where $0\le y \le \frac{1}{\theta}$. Find the maximum likelihood estimator for $\theta$.

The likelihood function for this pdf is $\prod_{i=1}^{n} f_Y(y_i;\theta) = \prod_{i=1}^{n} 2y_{i}\theta^2 = 2^n \theta^{2n} \prod_{i=1}^{n} y_i$.

So it looks to me like I can maximize the likelihood function by choosing $y_{\max}$ as the MLE. But then what do I do about the condition $0\le y \le \frac{1}{\theta}$?

I tried looking up how to do this, but the only example I can find is the one about the uniform distribution. That one is much more straightforward because the condition on $y$ is just $0 \le y \le \theta$ rather than $0 \le y \le \frac{1}{\theta}$. To maximize $\frac{1}{\theta}$ I would choose my MLE to be $y_{\min}$, but that wouldn't maximize the likelihood function.

Any help would be GREATLY appreciated.

Best Answer

I suggest you draw the likelihood as a function of $\theta$, without forgetting that $1/\theta$ must be greater than any observation (i.e., what are the bounds on $\theta$?).
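To make that hint concrete, the support condition is just a bound on $\theta$ in disguise:

$$y_i \le \frac{1}{\theta}\ \text{ for all } i \quad\Longleftrightarrow\quad 0 < \theta \le \frac{1}{\max_i y_i}\,.$$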

Keep in mind that everything but $\theta^{2n}$ in the likelihood is a constant, so you can write it as $c\,\theta^{2n}$ and just draw $\mathcal{L}/c$ over the domain of $\theta$. You may find it more convenient to deal with the log-likelihood, or you may not.
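If a picture helps, here is a minimal plotting sketch. The sample values are made up purely for illustration; only the shape of $\theta^{2n}$ over the admissible range matters:

```python
import numpy as np
import matplotlib.pyplot as plt

# Made-up sample for illustration; any positive values work.
y = np.array([0.12, 0.35, 0.22, 0.41, 0.05])
n = len(y)

# L(theta)/c = theta^(2n), but only while 1/theta >= max(y),
# i.e. for 0 < theta <= 1/max(y); the likelihood is 0 beyond that.
theta_bound = 1.0 / y.max()
theta = np.linspace(1e-3, 1.5 * theta_bound, 500)
L_over_c = np.where(theta <= theta_bound, theta ** (2 * n), 0.0)

plt.plot(theta, L_over_c)
plt.axvline(theta_bound, linestyle="--", label=r"$1/\max_i y_i$")
plt.xlabel(r"$\theta$")
plt.ylabel(r"$\mathcal{L}(\theta)/c$")
plt.legend()
plt.show()
```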

It should help you clarify what you're doing.

If you're still stuck, consider thinking in terms of $\psi = 1/\theta$ and then go back to doing it in terms of $\theta$.
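In case it helps to see why that substitution is suggested: with $\psi = 1/\theta$ the density becomes

$$f_Y(y;\psi) = \frac{2y}{\psi^2}, \qquad 0 \le y \le \psi,$$

so the awkward $1/\theta$ in the support turns into the same kind of constraint on $\psi$ as in the uniform example.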

(Alternatively, look back at that uniform example. What would the MLE of $\theta$ be if the data were uniform on $[0,1/\theta]$? Can you see how to do the original problem now?)
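Once you have a candidate, a quick simulation makes a nice sanity check. Since $F_Y(y) = (\theta y)^2$ on $[0, 1/\theta]$, inverse-transform sampling gives $Y = \sqrt{U}/\theta$ with $U \sim \mathrm{Uniform}(0,1)$. A minimal sketch, assuming the boundary candidate $\hat{\theta} = 1/\max_i y_i$ that the picture above points to:

```python
import numpy as np

rng = np.random.default_rng(0)

theta_true = 2.0   # made-up true value for the check
n = 1000

# Inverse-transform sampling: F(y) = (theta*y)^2 on [0, 1/theta],
# so Y = sqrt(U)/theta with U ~ Uniform(0, 1).
u = rng.uniform(size=n)
y = np.sqrt(u) / theta_true

# Candidate MLE from the boundary argument: theta_hat = 1/max(y).
# Since max(y) <= 1/theta always, the estimate sits slightly above
# the true value, and the gap shrinks as n grows.
theta_hat = 1.0 / y.max()
print(theta_hat)
```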
