Solved – Likelihood ratio test for comparing two exponential distributions

Tags: likelihood, likelihood-ratio, maximum-likelihood, parameterization, r

I am trying to use a likelihood ratio test to compare the parameters of two exponential distributions.

From this thread:

Likelihood Ratio for two-sample Exponential distribution

I found that I can use

$$
\frac {L_{H_1}(\hat \theta_1,\,\hat \theta_2)}{L_{H_0}(\hat \theta_0)} = \frac {(\hat \theta_0)^{n_1+n_2}}{(\hat \theta_1)^{n_1}(\hat \theta_2)^{n_2}}=\left(\frac {\hat \theta_0}{\hat \theta_1}\right)^{n_1} \cdot \left(\frac {\hat \theta_0}{\hat \theta_2}\right)^{n_2}
$$

to calculate the likelihood ratio statistic.

(where $\hat \theta_1$ and $\hat \theta_2$ are the means of the two datasets, mn1 and mn2 in my R code, and $\hat \theta_0$ is the mean of the pooled data)
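In R terms, the statistic I am trying to compute looks like this (lr_stat and the sample vectors x and y are placeholder names):

# Sketch: likelihood ratio statistic for two exponential samples
# x, y: hypothetical numeric vectors of observations
lr_stat <- function(x, y) {
  n1 <- length(x); n2 <- length(y)
  theta1 <- mean(x)                          # MLE of theta for sample 1 under H1
  theta2 <- mean(y)                          # MLE of theta for sample 2 under H1
  theta0 <- (sum(x) + sum(y)) / (n1 + n2)    # pooled MLE under H0
  (theta0 / theta1)^n1 * (theta0 / theta2)^n2
}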

However, something doesn't seem right to me: I keep getting a statistic of 1 when I plug it into R.

Is this accurate? Or does anyone know of a likelihood ratio function for comparing two distributions in R?

Best Answer

The answer in the linked thread provides a formula for the ratio of the likelihoods under the null and the alternative hypotheses, with a detailed derivation. The R code below is my implementation of it with $\theta_1 = 1$, $\theta_2 = 2$, $n_1 = 70$, and $n_2 = 100$.

To generate samples, first derive the CDF of the exponential distribution. If $f(x) = \frac{1}{\theta}e^{-x/\theta}$, then $F(x) = \int_{0}^{x} \frac{1}{\theta}e^{-t/\theta}\,dt = 1 - e^{-x/\theta}$. Setting $u = 1 - e^{-x/\theta}$ and solving for $x$ gives $x = -\theta\ln(1-u)$. So draw $n_1$ uniform samples $u \in [0, 1)$ and compute the $x_i$ values with $\theta = \theta_1$, draw $n_2$ uniform samples $u \in [0, 1)$ and compute the $y_i$ values with $\theta = \theta_2$, and then apply the likelihood ratio formula to compute the ratio. After testing a few times, the computed ratio is not always 1.
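(As an aside, base R can draw these samples directly with rexp, which is parameterized by the rate $1/\theta$; the inverse-CDF code below just makes the derivation explicit.)

# Equivalent draws using base R's exponential sampler (rate = 1/theta)
xvals <- rexp(n1, rate = 1/theta1)
yvals <- rexp(n2, rate = 1/theta2)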

# Inverse-CDF transform: map uniform draws u in [0, 1) to
# exponential draws -theta * log(1 - u)
theta_ln_1Minusu <- function(theta, vec) {
  -theta * log(1 - vec)  # log() is vectorized, so no loop is needed
}

> theta1 = 1; theta2 = 2; n1 = 70; n2 = 100;
> u1vals = runif(n1); u2vals = runif(n2);
> xvals = theta_ln_1Minusu(theta1, u1vals);
> yvals = theta_ln_1Minusu(theta2, u2vals);
> x_avg = mean(xvals); y_avg = mean(yvals);
> x_avg
[1] 1.041831
> y_avg
[1] 1.733426
> w1 = n1/(n1+n2); w2 = n2/(n1+n2);
> likelihoodRatio = (w1 + w2*y_avg/x_avg)^n1 * (w1*x_avg/y_avg + w2)^n2;
> likelihoodRatio
[1] 168.8613
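(As a quick sanity check, the ratio can be recomputed from the pooled mean directly; theta0 is a name I introduce here for the pooled MLE. Since $w_1 + w_2\bar y/\bar x = \hat\theta_0/\bar x$ and $w_1\bar x/\bar y + w_2 = \hat\theta_0/\bar y$, this is algebraically the same quantity as likelihoodRatio above.)

> theta0 = w1*x_avg + w2*y_avg;           # pooled MLE under H0
> (theta0/x_avg)^n1 * (theta0/y_avg)^n2   # should reproduce likelihoodRatio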

Regarding the comment below:

In addition, regarding EngrStudent's comment: I spent a while computing and found that the ratio is close to 1 only when x_avg is close to y_avg, in other words, when $\hat\theta_1$ is close to $\hat\theta_2$. Denote $\frac{\bar y}{\bar x}$ by $x$, with $x > 0$; take the log of the last expression in the link and divide it by $(n_1+n_2)$:

$$
w_1\ln(w_1+w_2 x) + w_2\ln\!\left(\frac{w_1}{x}+w_2\right)
= (w_1+w_2)\ln(w_1+w_2 x) - w_2\ln x
= \ln(w_1+w_2 x) - \ln\!\left(x^{w_2}\right),
$$

since $\frac{w_1}{x}+w_2 = \frac{w_1+w_2 x}{x}$ and $w_1 + w_2 = 1$.

If this log value is 0, then $w_1+w_2 x = x^{w_2}$. This equality clearly holds at $x = 1$, and both sides are continuous. Now compare the derivatives of the two sides: the left side's derivative is the constant $w_2$, with $w_2 < 1$, and the right side's is $w_2 x^{w_2-1}$. If $x < 1$, then $x^{w_2-1} > 1$, so the right side grows faster than the left everywhere on $(0, 1)$, while the left side exceeds the right for $x > 0$ close to 0; hence the two curves cannot meet again before $x = 1$. If $x > 1$, then $x^{w_2-1} < 1$, so the right side grows more slowly than $w_2$ and the left side stays above. Thus the equality holds only at $x = 1$.
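A quick numerical check of this argument, as a sketch (the helper name f is just for illustration): evaluate the per-observation log-ratio $f(x) = \ln(w_1+w_2 x) - w_2\ln x$ on a grid; it should be positive everywhere and exactly 0 at $x = 1$.

# Sketch: check that f(x) = log(w1 + w2*x) - w2*log(x) vanishes only at x = 1
w1 <- 70/170; w2 <- 100/170                    # weights from the example above
f <- function(x) log(w1 + w2*x) - w2*log(x)
round(f(c(0.1, 0.5, 0.9, 1, 1.1, 2, 10)), 4)   # 0 at x = 1, positive elsewhere
curve(f, from = 0.01, to = 5)                  # the minimum, 0, is at x = 1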