Solved – MLE for the two-parameter exponential distribution

estimation, maximum-likelihood, self-study

Let $x_1, x_2, \ldots, x_n$ be a random sample from a distribution with pdf:

$$f(x;\mu,\sigma)=\frac1{\sigma}\exp\left({-\frac{x-\mu}{\sigma}}\right)\,,-\infty<\mu<\infty;\, \sigma>0;\, x\ge\mu$$

How do I find the MLE for the parameters if both parameters are unknown?

I tried using the usual MLE with likelihood function:

$$L(\mu,\sigma\mid x_1,\ldots,x_n)=\frac{1}{\sigma^n}\exp\left({-\frac{\sum{x_i}-n\mu}{\sigma}}\right)$$ But the derivative of this with respect to $\mu$ is a dead end.

I do know that if $\mu$ is known, the MLE for $\sigma$ is $\frac{\sum{x_i}-n\mu}{n}$, and I thought that if $\sigma$ is known, the MLE for $\mu$ would be $\frac{\sum{x_i}}{n}$.

Do these help? What should be the approach?

Best Answer

Given the sample, the likelihood function is given by $$L(\mu,\sigma)=\frac{1}{\sigma^n}\exp\left[-\frac{1}{\sigma}\sum_{i=1}^n(x_i-\mu)\right]\mathbf1_{\mu\leqslant x_{(1)},\sigma>0}$$

This function is not differentiable at $\mu=x_{(1)}$, so the MLE of $\mu$ has to be found by a different argument. For every fixed $\sigma>0$, $L(\mu,\sigma)$ is an increasing function of $\mu$ on the region $\mu\leqslant x_{(1)}$ and vanishes for $\mu>x_{(1)}$, so the likelihood is maximized at the boundary, implying that $\hat\mu_{\text{MLE}}=X_{(1)}$.

The MLE of $\sigma$ can be guessed from the first-order condition as usual.

We have $\displaystyle\frac{\partial L(\mu,\sigma)}{\partial\sigma}=0\implies\sigma=\frac{1}{n}\sum_{i=1}^n(x_i-\mu)$.

So MLE of $\sigma$ could possibly be $\displaystyle\hat\sigma_{\text{MLE}}=\frac{1}{n}\sum_{i=1}^n(X_i-\hat\mu)=\frac{1}{n}\sum_{i=1}^n\left(X_i-X_{(1)}\right)$
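As a quick numerical sanity check (an illustrative sketch, not part of the original answer), one can simulate from the shifted exponential and apply the two closed-form estimates above; on a large sample they should land close to the true parameters:

```python
import numpy as np

# Sketch: simulate from the two-parameter (shifted) exponential with
# location mu and scale sigma, then apply the MLEs derived above:
#   mu_hat    = min(x)
#   sigma_hat = mean(x - min(x))

rng = np.random.default_rng(seed=42)
mu_true, sigma_true, n = 2.0, 3.0, 100_000

# X = mu + sigma * E with E ~ Exp(1) has exactly the pdf in the question.
x = mu_true + rng.exponential(scale=sigma_true, size=n)

mu_hat = x.min()                  # sample minimum: MLE of the location
sigma_hat = (x - mu_hat).mean()   # mean deviation from the minimum: MLE of the scale

print(mu_hat, sigma_hat)
```

Note that $\hat\mu = X_{(1)}$ always overshoots $\mu$ slightly (its error is $\mathrm{Exp}(\sigma/n)$-distributed), so it converges to the true location much faster than the usual $1/\sqrt{n}$ rate.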

The second partial derivative test fails here because $L(\mu,\sigma)$ is not differentiable in $\mu$ at the candidate point.

So to confirm that $(\hat\mu,\hat\sigma)$ is the MLE of $(\mu,\sigma)$, one has to verify directly that $L(\hat\mu,\hat\sigma)\geqslant L(\mu,\sigma)$, or equivalently that $\ln L(\hat\mu,\hat\sigma)\geqslant \ln L(\mu,\sigma)$, holds for all admissible $(\mu,\sigma)$.
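One way to carry out that verification is the standard profile-likelihood argument (the steps below are a sketch I am supplying, not spelled out in the original answer). For every fixed $\sigma>0$ and $\mu\leqslant x_{(1)}$,

$$\ln L(\mu,\sigma)=-n\ln\sigma-\frac{1}{\sigma}\sum_{i=1}^n(x_i-\mu)$$

is increasing in $\mu$, so $\sup_{\mu}\ln L(\mu,\sigma)=\ln L(x_{(1)},\sigma)$. It then remains to maximize the profile log-likelihood

$$g(\sigma)=\ln L(x_{(1)},\sigma)=-n\ln\sigma-\frac{n\hat\sigma}{\sigma}\,,\qquad \hat\sigma=\frac{1}{n}\sum_{i=1}^n\left(x_i-x_{(1)}\right)$$

Here $g'(\sigma)=-\frac{n}{\sigma}+\frac{n\hat\sigma}{\sigma^2}=0$ exactly at $\sigma=\hat\sigma$, and $g''(\hat\sigma)=\frac{n}{\hat\sigma^2}-\frac{2n}{\hat\sigma^2}=-\frac{n}{\hat\sigma^2}<0$, so $\sigma=\hat\sigma$ is the unique maximizer. Combining the two steps gives $L(\hat\mu,\hat\sigma)\geqslant L(\mu,\sigma)$ for all admissible $(\mu,\sigma)$, as required.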