Statistics – Maximum Likelihood When the Usual Procedure Doesn't Work

maximum-likelihood, statistical-inference, statistics

I am trying to get the maximum likelihood estimate for the parameter $p$. The distribution is the following:

$$ f(x \mid p) = \begin{cases}
\frac{p}{x^2} & \text{for } p \leq x < \infty \\
0 & \text{otherwise}
\end{cases} $$

The sample has size $n$.

The problem is, when I try to estimate it by the procedure I know, I would write down the likelihood function and set the derivative of the log-likelihood equal to zero. We'd have:

$$ L(p; x) = \frac{p^n}{\prod_{i=1}^{n} x_i^2}$$
$$ \ln L(p;x) = n \ln(p) - \sum_{i=1}^{n} \ln(x_i^2) $$

For the derivative:

$$ \ell'(p;x) = \frac{n}{p} $$

And I am stuck because setting this equal to zero gives $\frac{n}{p} = 0$, which has no solution for $p$. How do I evaluate this?
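For concreteness, here is a minimal Python sketch (with a made-up sample `x`, not from any real data) illustrating the issue numerically: the log-likelihood only increases in $p$, so its derivative never vanishes.

```python
import numpy as np

# Hypothetical sample, just for illustration.
x = np.array([2.3, 1.7, 4.1, 1.5, 3.0])
n = len(x)

def log_lik(p):
    # n*ln(p) - sum(ln(x_i^2))
    return n * np.log(p) - np.sum(np.log(x**2))

for p in [0.5, 1.0, 1.4, 1.5]:
    print(f"p = {p:.1f}: log-likelihood = {log_lik(p):.4f}")
# The values increase monotonically in p: the derivative n/p is
# positive for every p > 0, so there is no stationary point to solve for.
```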

Thanks!

EDIT:

So in this case I can use the indicator variable to write:

$$ L(p;x) = \frac{p^n}{\prod_{i=1}^{n}x_i^2} \prod_{i=1}^{n} I_{(x_i \geq p)}$$

so the likelihood is non-zero only when $p \leq x_i$ for every $i$. The binding constraint comes from the observation closest to $p$, namely $\min(X_1, \ldots, X_n)$. Is that the point?

Best Answer

The usual method does not work well when the support of the random variable (in this case, $[p, \infty)$) depends on the parameter of interest (which is $p$ in this case).

In these situations, you should use indicator functions. Let $\mathbf{I}$ denote the indicator function, defined by $$\mathbf{I}(\cdot) = \begin{cases} 1, & \cdot \text{ is true} \\ 0, & \cdot \text{ is false.} \end{cases}$$ Thus, we may write $$f(x \mid p) = \dfrac{p}{x^2}\mathbf{I}(p \leq x)\text{.}$$

(Please read this other answer for details that I will leave unproven here, and for a similar problem to this one.)

Per the link I've put above, you can see that $$L(p \mid \mathbf{x}) = \prod_{i=1}^{n}\dfrac{p}{x_i^2}\mathbf{I}(p \leq x_i)=\dfrac{p^n}{\prod_{i=1}^{n}x_i^2}\mathbf{I}(p \leq x_{(1)})$$ where $x_{(1)} = \min\limits_{1 \leq i \leq n}x_i$.
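To make the shape of this likelihood concrete, here is a small Python sketch (using an arbitrary sample, not from the original answer) checking that the product of per-observation indicators collapses to a single indicator on the sample minimum, and that $L$ vanishes once $p$ exceeds $x_{(1)}$:

```python
import numpy as np

x = np.array([2.3, 1.7, 4.1, 1.5, 3.0])  # arbitrary sample
x_min = x.min()                           # x_(1), the sample minimum

def likelihood(p):
    # L(p | x) = p^n / prod(x_i^2) * I(p <= x_(1))
    return p ** len(x) / np.prod(x**2) * (p <= x_min)

for p in [1.0, 1.4, 1.5, 1.6]:
    # Product of per-observation indicators == single indicator on the min:
    assert np.all(p <= x) == (p <= x_min)
    print(f"p = {p}: L(p | x) = {likelihood(p):.3e}")
# L grows up to p = x_(1) = 1.5, then drops to 0 beyond it.
```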

Viewing this as a function of $p$, note that if $p > x_{(1)}$, then $\mathbf{I}(p \leq x_{(1)}) = 0 = L(p \mid \mathbf{x})$, which is obviously not the largest value of $L$.

Thus, assume $p \leq x_{(1)}$. Disregarding constants of proportionality with respect to $p$ (which do not affect the actual maximum likelihood estimator), we obtain $$L(p \mid \mathbf{x}) = \dfrac{p^n}{\prod_{i=1}^{n}x_i^2}\mathbf{I}(p \leq x_{(1)}) \propto p^n\text{.}$$

As long as $p > 0$, we know that $p^n$ (for $n$ fixed) is indeed a monotonically increasing function of $p$. Thus, to maximize $p^n$, we must seek the largest value of $p$. Note that to get to this point, we had to assume $p \leq x_{(1)}$. It follows that $$\hat{p}_{\text{MLE}} = X_{(1)}$$ is the maximum likelihood estimator of $p$.
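As a final sanity check (a minimal simulation sketch, not part of the original answer): the CDF here is $F(x) = 1 - p/x$ for $x \geq p$, so inverse-CDF sampling gives $X = p/U$ with $U \sim \mathrm{Uniform}(0,1)$, and the MLE $X_{(1)}$ should approach the true $p$ as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(0)
p_true = 2.0  # assumed true parameter, chosen arbitrarily

for n in [10, 100, 10_000]:
    u = rng.uniform(size=n)
    x = p_true / u       # inverse-CDF sample: F(x) = 1 - p/x  =>  X = p/U
    p_hat = x.min()      # the MLE derived above, X_(1)
    print(f"n = {n:6d}: p_hat = {p_hat:.5f}")
# p_hat converges to p_true from above, since every X_i >= p by construction.
```

Note that $\hat{p}_{\text{MLE}} = X_{(1)} \geq p$ with probability one, so the estimator always overshoots slightly: it is biased upward, as is typical when a support boundary is estimated by an extreme order statistic.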