[Math] Existence of Maximum Likelihood Estimator

Tags: statistical-inference, statistics

Let $X_1, X_2, \ldots, X_n$ be a random sample from the distribution with the probability density function below, where $0<\mu<\infty$ and $0<\alpha<1$:

$$f(x;\mu,\alpha)= \begin{cases}\frac{1}{\Gamma(\alpha)}(x-\mu)^{\alpha-1}e^{-(x-\mu)}, & x>\mu \\ 0, & \text{otherwise} \end{cases}$$

Does a Maximum Likelihood Estimator exist for the parameters $\alpha$ and $\mu$?

By observation, this looks like a shifted gamma distribution. Also, I know that we can obtain the MLE by taking the product of the PDF over $i=1$ to $n$ and differentiating with respect to the parameters.

But I do not know the criteria that the parameters need to satisfy in order for a Maximum Likelihood Estimator to exist.

Best Answer

\begin{align}
& \text{For } \mu \le \min\{x_1,\ldots,x_n\} \text{ and } \alpha>0, \text{ we have} \\[10pt]
L(\mu,\alpha) & = \frac 1 {\Gamma(\alpha)^n} \left( \prod_{i=1}^n (x_i-\mu) \right)^{\alpha-1} \!\!\! \exp \left( -\sum_{i=1}^n (x_i-\mu) \right), \\[10pt]
\ell(\mu,\alpha) & = \log L(\mu,\alpha) = -n\log\Gamma(\alpha) + (\alpha-1) \sum_{i=1}^n \log(x_i-\mu) - \sum_{i=1}^n (x_i-\mu).
\end{align}

You gave us $\alpha<1.$

That implies $\alpha-1<0,$ so that $\ell(\mu,\alpha)$ is an increasing function of $\mu$ until $\mu$ gets as big as $\min\{x_1,\ldots,x_n\}.$
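To see this, differentiate $\ell$ with respect to $\mu$:
$$ \frac{\partial\ell}{\partial\mu} = (1-\alpha)\sum_{i=1}^n \frac{1}{x_i-\mu} + n > 0 \qquad \text{for } \alpha<1 \text{ and } \mu < \min\{x_1,\ldots,x_n\}. $$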

Therefore $\widehat\mu = \min\{x_1,\ldots,x_n\}.$ If we didn't have the constraint that $\alpha<1,$ then this would be more complicated.

This value of $\widehat\mu$ does not depend on $\alpha$ as long as $\alpha$ remains in that interval. Therefore we can just plug in $\min$ for $\mu$ and then seek the value of $\alpha\in(0,1)$ that maximizes $\ell(\min,\alpha).$

Now we have $$ \ell(\min,\alpha) = -n\log\Gamma(\alpha) + (\alpha-1)A + \big( \text{constant} \big) $$ where $A = \sum_{i=1}^n \log(x_i-\widehat\mu)$ and "constant" means not depending on $\alpha.$ $$ \frac {\partial\ell}{\partial\alpha} = -n\frac{\Gamma'(\alpha)}{\Gamma(\alpha)} + A. $$ Setting this to zero gives $\psi(\widehat\alpha) = A/n,$ where $\psi = \Gamma'/\Gamma$ is the digamma function; since $\psi$ is strictly increasing, $\ell(\min,\alpha)$ is concave in $\alpha,$ so any root of this equation in $(0,1)$ is the maximizer.
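Here is a minimal numerical sketch of this last step, assuming NumPy/SciPy. The simulated sample and parameter values are hypothetical, and because $\log\big(x_{(1)}-\widehat\mu\big)$ diverges when $\widehat\mu$ equals the sample minimum, the sketch evaluates $A$ at a plug-in value just below it, purely for illustration:

```python
import numpy as np
from scipy.special import digamma
from scipy.optimize import brentq

# Simulated data: a shifted gamma sample (hypothetical parameter values).
rng = np.random.default_rng(0)
mu_true, alpha_true, n = 2.0, 0.6, 200
x = mu_true + rng.gamma(alpha_true, 1.0, size=n)

mu_hat = x.min()
# log(x_(1) - mu_hat) diverges at the sample minimum itself, so evaluate A
# at a plug-in value slightly below it, purely for illustration.
A = np.sum(np.log(x - (mu_hat - 1e-6)))

# Stationarity in alpha: psi(alpha) = A/n, with psi the digamma function.
# psi is strictly increasing on (0, inf), psi(0+) = -inf, and
# psi(1) = -0.5772... (minus the Euler-Mascheroni constant), so a root
# in (0, 1) exists iff A/n < psi(1).
target = A / n
if target < digamma(1.0):
    alpha_hat = brentq(lambda a: digamma(a) - target, 1e-8, 1.0)
    print("alpha_hat =", alpha_hat)
else:
    print("no stationary point in (0, 1); A/n =", target)
```

Since $\psi$ is strictly monotone, any bracketing root-finder would do in place of `brentq`.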