Statistics – GLR Test of Hypothesis for Exponential Distributions

I'm having trouble with this exercise from Bain and Engelhardt's textbook:

Consider independent random samples of size $n_1$ and $n_2$ from
respective exponential distributions $X_i \sim EXP(\theta_1)$ and $Y_i
\sim EXP(\theta_2)$. Derive the Generalized Likelihood Ratio test of
$H_0:\theta_1=\theta_2$ versus $H_1:\theta_1\neq\theta_2$.

The Generalized Likelihood Ratio is defined by

$\lambda(\vec{x})=\frac{\max_{\vec{\theta}\in\Omega_0}f(\vec{x};\vec{\theta})}{\max_{\vec{\theta}\in\Omega}f(\vec{x};\vec{\theta})}=\frac{f(\vec{x};\hat{\vec{\theta}}_0)}{f(\vec{x};\hat{\vec{\theta}})}$,

where $\hat{\vec{\theta}}$ denotes the usual Maximum Likelihood
Estimator of $\vec{\theta}$ and $\hat{\vec{\theta}}_0$ denotes the MLE
under the restriction that $H_0$ is true.

One is then supposed to apply the Neyman-Pearson lemma.

I've thought about this exercise for some time now, without success.

Thank you for any help given.

Best Answer

When $\theta_1 \neq \theta_2$, the likelihood is (given the independence assumption) $\mathcal{L}(\theta_1,\theta_2)=\theta_1 ^ {n_1} \exp [-\theta_1 (x_1 + \dots + x_{n_1})] \theta_2 ^ {n_2} \exp [-\theta_2 (y_1 + \dots + y_{n_2})]$.

When $\theta = \theta_1 = \theta_2$, the likelihood is \begin{align} \mathcal{L}(\theta)&=\theta ^ {n_1} \exp [-\theta (x_1 + \dots + x_{n_1})] \theta ^ {n_2} \exp [-\theta (y_1 + \dots + y_{n_2})] \\ &= \theta^{n_1+n_2} \exp [ -\theta ( x_1 + \dots + x_{n_1} + y_1 + \dots + y_{n_2} )] \end{align}
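In case the maximization step is the sticking point, here is a sketch (standard calculus, not spelled out in the answer): take logs, differentiate with respect to each parameter, and set the derivatives to zero. Writing $s_1 = x_1 + \dots + x_{n_1}$ and $s_2 = y_1 + \dots + y_{n_2}$, this gives

$\frac{\partial}{\partial\theta_1}\ln\mathcal{L}(\theta_1,\theta_2) = \frac{n_1}{\theta_1} - s_1 = 0 \implies \hat{\theta}_1 = \frac{n_1}{s_1}$,

and similarly $\hat{\theta}_2 = \dfrac{n_2}{s_2}$, while maximizing $\mathcal{L}(\theta)$ under $H_0$ gives the pooled estimate $\hat{\theta}_0 = \dfrac{n_1+n_2}{s_1+s_2}$.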

I leave the rest to you.
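As a numerical sanity check, here is a short Python sketch (my own, not from the textbook; the function name and test data are made up) that computes $\lambda$ using the rate parametrization of the answer above and the standard MLEs $\hat{\theta}_1 = n_1/\sum x_i$, $\hat{\theta}_2 = n_2/\sum y_j$, and the pooled $\hat{\theta}_0 = (n_1+n_2)/(\sum x_i + \sum y_j)$:

```python
import math

def glr_exponential(x, y):
    """Generalized likelihood ratio lambda for H0: theta1 = theta2,
    with rate-parametrized exponentials f(x; theta) = theta*exp(-theta*x),
    matching the likelihoods written in the answer above."""
    n1, n2 = len(x), len(y)
    sx, sy = sum(x), sum(y)
    # Unrestricted MLEs: each factor is maximized separately
    t1, t2 = n1 / sx, n2 / sy
    # Restricted MLE under H0: pooled sample
    t0 = (n1 + n2) / (sx + sy)
    loglik_full = n1 * math.log(t1) - t1 * sx + n2 * math.log(t2) - t2 * sy
    loglik_null = (n1 + n2) * math.log(t0) - t0 * (sx + sy)
    return math.exp(loglik_null - loglik_full)

# Equal sample rate estimates -> restricted and unrestricted fits coincide
print(glr_exponential([1.0, 2.0, 3.0], [2.0, 2.0]))  # ≈ 1.0
# Very different sample means -> lambda well below 1 (evidence against H0)
print(glr_exponential([1.0, 1.0], [4.0, 4.0]))       # ≈ 0.4096
```

The test rejects $H_0$ when $\lambda \le k$ for a critical value $k$ chosen to give the desired size; asymptotically $-2\ln\lambda$ is approximately chi-squared with 1 degree of freedom.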
