Finding minimal sufficient statistic and maximum likelihood estimator


Question:

Let $Y_1, \dots, Y_n$ be a random sample from a distribution with density function

\begin{align}
f_Y(y ; \alpha, \theta) =
\begin{cases}
\alpha e^{- \alpha ( y - \theta)} & y > \theta,\\
0 & y \leq \theta,
\end{cases}
\end{align}

where $\alpha, \theta > 0$ are parameters. Find the minimal sufficient statistic and maximum likelihood estimator for $(\alpha, \theta)$ when neither is known.

Attempt:

1) Minimal sufficient statistic

$f_{\boldsymbol{Y}}(\boldsymbol{y}; \alpha, \theta) = \alpha^n e^{-\alpha(\sum y_i - n\theta)} I_{(\theta, \infty)}(y_{(1)})$ where $y_{(1)}$ is the first order statistic and $I$ is an indicator function. According to the factorization theorem, $(\sum y_i, y_{(1)})$ is sufficient for $(\alpha, \theta)$.
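As a quick illustration of this factorization (not part of the original attempt), the joint log-likelihood can be evaluated from $n$ and the pair $(\sum_i y_i, y_{(1)})$ alone; the Python sketch below uses made-up sample values and parameters.

```python
import numpy as np

def loglik_from_stats(alpha, theta, n, sum_y, y_min):
    """Joint log-likelihood of the shifted exponential, written only
    in terms of n and the sufficient statistic (sum_y, y_min)."""
    if y_min <= theta:            # indicator I_(theta, inf)(y_(1)) is 0
        return -np.inf
    return n * np.log(alpha) - alpha * (sum_y - n * theta)

# illustrative sample and parameter values (arbitrary)
y = np.array([2.3, 1.7, 4.0, 2.9, 1.9])
alpha, theta = 0.8, 1.5

# summing the log-density term by term agrees with the version that
# only sees (sum y_i, y_(1))
full = np.sum(np.log(alpha) - alpha * (y - theta)) if y.min() > theta else -np.inf
via_stats = loglik_from_stats(alpha, theta, len(y), y.sum(), y.min())
print(full, via_stats)            # identical up to floating point
```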

To prove minimal sufficiency I need to show that given samples $\boldsymbol{y}$ and $\boldsymbol{x}$, the ratio of the joint densities $\frac{f_{\boldsymbol{Y}}(\boldsymbol{y}; \alpha, \theta)}{f_{\boldsymbol{Y}}(\boldsymbol{x}; \alpha, \theta)}$ involves neither $\alpha$ nor $\theta$ $\Leftrightarrow (\sum y_i, y_{(1)}) = (\sum x_i, x_{(1)})$.

\begin{align}
\frac{f_{\boldsymbol{Y}}(\boldsymbol{y}; \alpha, \theta)}{f_{\boldsymbol{Y}}(\boldsymbol{x}; \alpha, \theta)} &= \frac{\alpha^n e^{-\alpha(\sum y_i - n\theta)} I_{(\theta, \infty)}(y_{(1)})}{\alpha^n e^{-\alpha(\sum x_i - n\theta)} I_{(\theta, \infty)}(x_{(1)})}\\
&= e^{-\alpha (\sum y_i - \sum x_i)} \frac{I_{(\theta, \infty)}(y_{(1)})}{I_{(\theta, \infty)}(x_{(1)})}.
\end{align}

For this expression to not involve either $\alpha$ or $\theta$, I need $\sum x_i = \sum y_i$. My problem is that I don't see why I need $y_{(1)} = x_{(1)}$. As far as I can tell, $y_{(1)}, x_{(1)} > \theta$ is good enough.

Where am I going wrong?

2) Maximum likelihood estimator

For $\alpha > 0$ and $0 < \theta < y_{(1)}$ (so that the indicator equals $1$),

\begin{align}
L(\alpha, \theta ; \boldsymbol{y}) &= \alpha^n e^{-\alpha(\sum y_i - n\theta)}\\
\Rightarrow \ell (\alpha, \theta ; \boldsymbol{y}) &= n \log \alpha - \alpha \big( \sum y_i - n\theta \big)\\
\frac{\partial \ell}{\partial \alpha} &= \frac{n}{\alpha} - \big( \sum y_i - n\theta \big).
\end{align}

Setting $\frac{\partial \ell}{\partial \alpha} = 0$, I get $\hat{\alpha} = \frac{n}{\sum y_i - n \theta}$.

To find $\hat{\theta}$, I look at $L(\alpha, \theta ; \boldsymbol{y}) = \alpha^n e^{-\alpha \sum y_i} e^{\alpha n \theta} I_{(\theta, \infty)}(y_{(1)})$ and I see that $L$ gets larger as $\theta$ increases from $0$ right until $\theta = y_{(1)}$, at which point $L = 0$. Since the MLE is defined to be a supremum, I conclude that $\hat{\theta} = y_{(1)}$.
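A small numeric check of this monotonicity claim (not part of the original argument; the sample values and $\alpha$ below are arbitrary):

```python
import numpy as np

# arbitrary sample and alpha, just to see how L behaves as a function of theta
y = np.array([2.3, 1.7, 4.0, 2.9, 1.9])
alpha = 0.8
n, sum_y, y_min = len(y), y.sum(), y.min()

def L(theta):
    # likelihood including the indicator on theta < y_(1)
    return alpha**n * np.exp(-alpha * (sum_y - n * theta)) if theta < y_min else 0.0

for theta in [0.5, 1.0, 1.5, 1.69, 1.7, 2.0]:
    print(theta, L(theta))
# L increases in theta until theta reaches y_(1) = 1.7, then drops to 0
```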

So the maximum likelihood estimator for $(\alpha, \theta)$ is $\big( \frac{n}{\sum y_i - n \theta}, y_{(1)}\big)$.

Is this the correct MLE?

Thank you.

Best Answer

It's more or less correct, but there are some very minor issues.

  1. To verify minimality, your ratio is

$$\frac{e^{-\alpha \sum_i x_i}\times\mathbb{1}_{(0,\,x_{(1)})}(\theta)}{e^{-\alpha \sum_i y_i}\times\mathbb{1}_{(0,\,y_{(1)})}(\theta)}$$

This ratio does not depend on $(\alpha, \theta)$ if and only if

$$ \begin{cases} \sum_ix_i=\sum_iy_i \\ x_{(1)}=y_{(1)} \end{cases}$$

The point you were missing: the ratio has to be free of $\theta$ for every $\theta > 0$, not just for one fixed value. Viewed as a function of $\theta$, the indicator in the numerator switches from $1$ to $0$ at $x_{(1)}$ and the one in the denominator at $y_{(1)}$, so the ratio of indicators is constant in $\theta$ only when the two switch points coincide, i.e. $x_{(1)} = y_{(1)}$.
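A numeric illustration of this point (made-up samples with equal sums but different minima; the ratio of joint densities then still depends on $\theta$):

```python
import numpy as np

def joint_density(x, alpha, theta):
    x = np.asarray(x)
    if x.min() <= theta:          # indicator: density is 0 unless theta < x_(1)
        return 0.0
    return alpha**len(x) * np.exp(-alpha * (x - theta).sum())

# equal sums (both 10.0) but different minima (1.0 vs 1.5)
x = [1.0, 4.0, 5.0]
y = [1.5, 3.5, 5.0]
alpha = 0.7

for theta in [0.5, 1.2]:
    fx, fy = joint_density(x, alpha, theta), joint_density(y, alpha, theta)
    print(theta, fx / fy if fy > 0 else np.inf)
# theta = 0.5: ratio is 1 (both indicators are 1 and the sums match)
# theta = 1.2: f(x) = 0 while f(y) > 0, so the ratio drops to 0 -- it depends on theta
```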

  2. Considering the profile likelihood,

$$\hat{\theta}=x_{(1)}$$

Then, substituting $\hat{\theta}$ for $\theta$, you get

$$\hat{\alpha}=\frac{n}{\sum_i[x_i-x_{(1)}]}$$

Thus

$$(\hat{\alpha},\hat{\theta})=\left(\frac{n}{\sum_i[x_i-x_{(1)}]},\; x_{(1)}\right)$$
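As a sanity check of this result (not part of the original answer), here is a short simulation: draw a shifted-exponential sample with known $(\alpha, \theta)$, compute the closed-form estimators above, and confirm on a coarse grid that no other $(\alpha, \theta)$ pair does better. The sample size, seed, and grid ranges are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha_true, theta_true, n = 2.0, 1.0, 500

# shifted exponential: Y = theta + Exp(rate alpha)
y = theta_true + rng.exponential(scale=1.0 / alpha_true, size=n)

theta_hat = y.min()
alpha_hat = n / np.sum(y - y.min())
print(alpha_hat, theta_hat)       # should be close to (2.0, 1.0)

def loglik(a, t):
    # log-likelihood with the indicator on t < y_(1)
    return n * np.log(a) - a * np.sum(y - t) if y.min() > t else -np.inf

# evaluate just below y_(1), since the density here uses a strict inequality
best = loglik(alpha_hat, theta_hat - 1e-9)
grid_best = max(loglik(a, t)
                for a in np.linspace(0.5, 4.0, 80)
                for t in np.linspace(0.1, y.min() - 1e-6, 80))
print(best >= grid_best)          # expected: True
```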
