Statistics – MLEs for Shifted Exponential Distribution: How to Calculate Correctly

exponential distribution · maximum likelihood · statistics

I have i.i.d. shifted exponential random variables $X_1, \dots, X_n$ with common density function

$$f(x; \sigma, \tau)=
\begin{cases}
\dfrac{1}{\sigma} e^{-(x - \tau)/\sigma} &\text{if}\, x\geq \tau\\
0 &\text{otherwise}
\end{cases}$$

I want to calculate the maximum likelihood estimators for $\sigma$ and $\tau$. I calculate the likelihood as

$$L(\sigma, \tau \mid \mathbf{x}) = \prod_{i = 1}^n f(x_i; \sigma, \tau) = \prod_{i = 1}^n \dfrac{1}{\sigma} \exp{ \left\{ -\dfrac{x_i - \tau}{\sigma} \right\} }\mathbf{1}_{x_i \ge \tau} = \dfrac{1}{\sigma^n} \exp{ \left\{ -\dfrac{\sum_{i = 1}^n (x_i - \tau)}{\sigma} \right\} }\prod_{i = 1}^n\mathbf{1}_{x_i \ge \tau}$$

So now to find the MLEs, we just calculate $\dfrac{\partial{L}}{\partial{\sigma}} = 0$ and $\dfrac{\partial{L}}{\partial{\tau}} = 0$, right?

$$\dfrac{\partial{L}}{\partial{\sigma}} = -\dfrac{n}{\sigma^{n+1}}\exp{ \left\{ -\dfrac{ \sum_{i = 1}^n (x_i - \tau)}{\sigma} \right\} } + \dfrac{1}{\sigma^{n}}\exp{ \left\{ -\dfrac{ \sum_{i = 1}^n (x_i - \tau)}{\sigma} \right\} } \dfrac{\sum_{i = 1}^n (x_i - \tau)}{\sigma^2} = 0$$

I strongly suspect that I'm doing this incorrectly. Can someone please show me what I'm doing wrong and how to actually calculate these two MLEs?

Best Answer

Your likelihood function is

$$L=\frac{1}{\sigma^{n}} \exp\left\{\frac{-\sum_{i=1}^{n}\left(x_{i}-\tau\right)}{\sigma}\right\}1_{\min\left\{x_{i}\right\}\geq \tau}$$

where we used that $\prod_{i=1}^{n}1_{x_{i}\geq\tau}=1_{\min\left\{x_{i}\right\}\geq \tau}$.

First, let's find the MLE of $\tau$. Notice that $L$ is increasing in $\tau$ on the region $\tau \leq \min\left\{x_{i}\right\}$. Therefore, the value of $\tau$ that maximizes the likelihood function is $\tau^{MLE}=\min\left\{x_{i}\right\}$, because if $\tau>\min\left\{x_{i}\right\}$ then $L=0$.
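This monotonicity is easy to check numerically. The sketch below (with made-up parameters $\tau = 2$, $\sigma = 1.5$; the data and the helper `log_likelihood` are my own illustration, not from the question) evaluates the likelihood on the log scale at several candidate values of $\tau$ and confirms it rises up to $\min\{x_i\}$ and vanishes beyond it:

```python
import math
import random

random.seed(0)

# Hypothetical true parameters for a shifted exponential sample
tau_true, sigma_true = 2.0, 1.5
x = [tau_true + random.expovariate(1.0 / sigma_true) for _ in range(50)]

def log_likelihood(sigma, tau, x):
    """Log-likelihood of the shifted exponential; -inf when tau exceeds min(x)."""
    if tau > min(x):
        return float("-inf")
    return -len(x) * math.log(sigma) - sum(xi - tau for xi in x) / sigma

# For a fixed sigma, the log-likelihood increases as tau rises toward min(x)
taus = [min(x) - 1.0, min(x) - 0.5, min(x) - 0.1, min(x)]
lls = [log_likelihood(1.0, t, x) for t in taus]
assert all(a < b for a, b in zip(lls, lls[1:]))

# ...and the likelihood is zero (log-likelihood -inf) for tau > min(x)
assert log_likelihood(1.0, min(x) + 0.01, x) == float("-inf")
```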

Now let's turn to the MLE of $\sigma$. A useful trick is to maximize the log-likelihood instead of the likelihood: since the logarithm is a monotonic transformation, the value of $\sigma$ that maximizes $L$ is the same as the one that maximizes $\log L$. For $\min\left\{x_{i}\right\}\geq \tau$ (the only case that matters, by the previous discussion), the log-likelihood is

$ \log L = -n\log \sigma - \frac{\sum_{i=1}^{n}\left(x_{i}-\tau\right)}{\sigma}$

Setting the derivative to zero: $ \frac{\partial \log L}{\partial\sigma}= -\frac{n}{\sigma}+\frac{\sum_{i=1}^{n}\left(x_{i}-\tau\right)}{\sigma^{2}}=0\implies \sigma^{MLE}=\frac{\sum_{i=1}^{n}\left(x_{i}-\tau\right)}{n}$

Since $\sigma^{MLE}$ depends on $\tau$, which is unknown, it is common to replace $\tau$ with $\tau^{MLE}$. This is called the plug-in principle. Therefore, the MLE estimates for $\tau$ and $\sigma$ are: $ \tau^{MLE}=\min\left\{x_{i}\right\}\qquad \sigma^{MLE}=\frac{\sum_{i=1}^{n}\left(x_{i}-\min\left\{x_{i}\right\}\right)}{n}$
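As a sanity check, here is a minimal simulation (with assumed true values $\tau = 2$, $\sigma = 1.5$, chosen only for illustration) that computes both plug-in estimates from a large sample and confirms they land close to the truth:

```python
import random

random.seed(42)

# Assumed true parameters, for illustration only
tau_true, sigma_true = 2.0, 1.5
x = [tau_true + random.expovariate(1.0 / sigma_true) for _ in range(10_000)]

# Plug-in MLEs: tau_hat = sample minimum, sigma_hat = mean excess over it
tau_mle = min(x)
sigma_mle = sum(xi - tau_mle for xi in x) / len(x)

# The minimum can never undershoot the true shift
assert tau_mle >= tau_true
# With n = 10,000 both estimates should sit very near the truth
assert abs(tau_mle - tau_true) < 0.01
assert abs(sigma_mle - sigma_true) < 0.1
```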

A natural follow-up question is whether these estimators are unbiased. You can easily check that $\tau^{MLE}$ is not unbiased. Intuitively: across many samples, the minimum will on average lie above $\tau$ (in fact $E\left[\min\left\{X_{i}\right\}\right]=\tau+\sigma/n$). For $\sigma$: if we could plug in the true $\tau$, the estimator would just be the sample mean of the $X_{i}-\tau$, which is unbiased for $\sigma$. However, because $\tau^{MLE}$ is biased, $\sigma^{MLE}$ is biased as well.
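The bias of $\tau^{MLE}$ can be seen in a short Monte Carlo experiment. This sketch (again with illustrative parameters $\tau = 2$, $\sigma = 1.5$, and a small sample size $n = 5$ to make the bias visible) averages the estimator over many replications and compares it to the known expectation $\tau + \sigma/n$:

```python
import random

random.seed(1)

# Illustrative parameters: small n makes the bias sigma/n easy to see
tau, sigma, n, reps = 2.0, 1.5, 5, 20_000

tau_hats = []
for _ in range(reps):
    x = [tau + random.expovariate(1.0 / sigma) for _ in range(n)]
    tau_hats.append(min(x))  # tau_MLE for this replication

mean_tau_hat = sum(tau_hats) / reps

# The estimator overshoots tau on average...
assert mean_tau_hat > tau
# ...by sigma/n, since min of n exponentials is exponential with scale sigma/n
assert abs(mean_tau_hat - (tau + sigma / n)) < 0.02
```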
