Hypothesis Testing – How to Obtain a Level-$\alpha$ Likelihood Ratio Test for Specified Hypotheses

confidence-interval, hypothesis-testing, likelihood, likelihood-ratio, mathematical-statistics

Suppose I have a sequence of iid random variables $X_1, \ldots, X_n$ following the pdf:

$$
f_\theta (x) = \theta x^{\theta-1}
$$

for $\theta >0$ and $0 <x<1$.

I would like to obtain a level-$\alpha$ likelihood ratio test for the null hypothesis $H_0: \theta = \theta_0$ versus the two-sided alternative $H_1: \theta \neq \theta_0$ where $\theta_0$ is a known constant.

MY ATTEMPT:

I first construct the ratio:

\begin{align}
\lambda(x) &= \dfrac{\sup_{\theta=\theta_0}L(\theta\mid X)}{\sup_{\theta\neq\theta_0}L(\theta\mid X)} \\
&= \dfrac{\theta_0^n \left(e^{\sum \log x_i}\right)^{\theta_0-1}}{\left(\frac{n}{-\sum \log x_i}\right)^n \left(e^{\sum \log x_i}\right)^{\frac{n}{-\sum \log x_i} - 1}} \\
&= \left(\dfrac{-\theta_0 \sum \log x_i}{n}\right)^n e^{n+\theta_0 \sum \log x_i}
\end{align}

The denominator is calculated using the MLE of $\theta$, which is $\hat\theta = \frac{n}{-\sum \log x_i}$. Now I'd like to find the likelihood ratio test; that is, I would like to choose a constant $c$ such that:

$$
\alpha = \sup_{\theta = \theta_0}P_\theta\left(\lambda(x) \leq c\right)
$$

Now, $-\sum \log x_i \sim \text{Gamma}(n, \theta)$, but I cannot solve the above equation for $c$ in closed form because of the $\log$ terms. What is the right approach here? Thanks!
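(A quick numerical sanity check of the algebra above, sketched in NumPy with names of my own choosing; data are simulated by inversion, $X = U^{1/\theta}$, since the CDF here is $F(x) = x^\theta$. Evaluating $\lambda$ at $\theta_0 = \hat\theta$ should give exactly $1$, and $\lambda \le 1$ otherwise.)

```python
import numpy as np

def theta_mle(x):
    """MLE of theta: n / (-sum log x_i)."""
    x = np.asarray(x)
    return len(x) / -np.log(x).sum()

def lrt_lambda(x, theta0):
    """lambda(x) = (-theta0 * sum(log x_i) / n)^n * exp(n + theta0 * sum(log x_i))."""
    x = np.asarray(x)
    n, s = len(x), np.log(x).sum()
    return (-theta0 * s / n) ** n * np.exp(n + theta0 * s)

# Simulate from f_theta via inversion: F(x) = x^theta, so X = U^(1/theta).
rng = np.random.default_rng(0)
theta_true = 2.0
x = rng.uniform(size=50) ** (1.0 / theta_true)
```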

Best Answer

Using whuber's method, we'll reject if $\ell(\theta_0; \vec x) - \ell(\theta_{MLE}; \vec x) \le k$ for some constant $k$, where $\ell(\theta; \vec x) = \ln L(\theta ; \vec x) = \ln \left(\theta^n \prod_{i=1}^n x_i^\theta \right) = n \ln \theta + \theta \sum_{i=1}^n \ln x_i$.

We see that $\ell(\theta_0; \vec x) - \ell(\theta_{MLE}; \vec x) = n (\ln \theta_0 - \ln \theta_{MLE}) + (\theta_0 - \theta_{MLE}) \sum_{i=1}^n \ln x_i$.

Absorbing terms that do not depend on the data into a generic constant $k$ at each step, and substituting $\sum_{i=1}^n \ln x_i = -\frac{n}{\theta_{MLE}}$, we reject if \begin{align*} n (\ln \theta_0 - \ln \theta_{MLE}) + (\theta_0 - \theta_{MLE}) \sum_{i=1}^n \ln x_i \le k \\ -n \ln \theta_{MLE} + (\theta_0 - \theta_{MLE}) \left( - \frac{n}{\theta_{MLE}} \right) \le k \\ -n \ln \theta_{MLE} - \frac{n\theta_0}{\theta_{MLE}} \le k \\ \ln \theta_{MLE} + \frac{\theta_0}{\theta_{MLE}} \ge k \end{align*} (dividing by $-n$ flips the inequality in the last step).

Now, consider the function $f(z) = \ln(z) + \frac{\theta_0}{z}$. Since $f'(z) = \frac{z - \theta_0}{z^2}$, $f$ is decreasing on $(0, \theta_0)$ and increasing on $(\theta_0, \infty)$, so it has a unique minimum at $z = \theta_0$, and $\lim \limits_{z \to 0^+} f(z) = \lim \limits_{z \to \infty} f(z) = \infty$. So $f(z) \ge k$ means $z$ is either sufficiently small or sufficiently large.

That is, $\theta_{MLE} = - \frac{n}{\sum_{i=1}^n \ln(x_i)}$ is either sufficiently small or sufficiently large; since $\theta_{MLE}$ is a decreasing function of $- \sum_{i=1}^n \ln x_i$, this is equivalent to $- \sum_{i=1}^n \ln x_i \le c_1 $ or $ - \sum_{i=1}^n \ln x_i \ge c_2$. We want to choose $c_1$ and $c_2$ so that the test is size $\alpha$ under $H_0$.

Note that under $H_0$, $- \sum_{i=1}^n \ln X_i \sim \Gamma(n,\theta_0)$ (shape $n$, rate $\theta_0$), so rescaling by $2\theta_0$ gives $-2\theta_0 \sum_{i=1}^n \ln X_i \sim \Gamma(n,\tfrac{1}{2}) = \chi^2(2n).$
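This distributional claim can be checked by simulation (an illustrative sketch using NumPy/SciPy; the variable names are mine): draw replicates under $H_0$, form $-2\theta_0\sum_{i=1}^n \ln X_i$ for each, and compare the empirical distribution against the $\chi^2(2n)$ CDF with a Kolmogorov–Smirnov test.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
theta0, n, reps = 2.0, 10, 5000

# Under H0, simulate via inversion: X = U^(1/theta0) has pdf theta0 * x^(theta0 - 1).
X = rng.uniform(size=(reps, n)) ** (1.0 / theta0)

# The pivotal statistic -2 * theta0 * sum(log X_i), one value per replicate.
T = -2.0 * theta0 * np.log(X).sum(axis=1)

# Kolmogorov-Smirnov test against chi^2 with 2n degrees of freedom.
ks = stats.kstest(T, stats.chi2(df=2 * n).cdf)
print(ks.pvalue)  # a non-tiny p-value is consistent with T ~ chi^2(2n)
```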

So one LRT of size $\alpha$ is to reject $H_0$ in favor of $H_1$ if $- 2 \theta_0 \sum_{i=1}^n \ln X_i \le \chi^2_{1-\alpha/2}(2n)$ or if $- 2 \theta_0\sum_{i=1}^n \ln X_i \ge \chi^2_{\alpha/2}(2n)$, where $\chi^2_p(2n)$ denotes the upper-$p$ quantile of the $\chi^2(2n)$ distribution.
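The rejection rule above can be sketched in Python (assuming SciPy; `lrt_reject` is a name of my own choosing, and note that SciPy's `ppf` is the lower-tail quantile, so the upper-$\alpha/2$ point $\chi^2_{\alpha/2}(2n)$ is `chi2.ppf(1 - alpha/2, df=2*n)`):

```python
import numpy as np
from scipy.stats import chi2

def lrt_reject(x, theta0, alpha=0.05):
    """Size-alpha LRT of H0: theta = theta0 for data from f_theta(x) = theta * x^(theta-1).

    Rejects when T = -2 * theta0 * sum(log x_i) falls outside the central
    1 - alpha region of the chi^2(2n) distribution.
    """
    x = np.asarray(x)
    n = len(x)
    T = -2.0 * theta0 * np.log(x).sum()
    lower = chi2.ppf(alpha / 2.0, df=2 * n)        # chi^2_{1-alpha/2}(2n) in upper-tail notation
    upper = chi2.ppf(1.0 - alpha / 2.0, df=2 * n)  # chi^2_{alpha/2}(2n) in upper-tail notation
    return T <= lower or T >= upper
```

Under $H_0$ the Monte Carlo rejection rate should be close to $\alpha$, while a $\theta_0$ far from the true value should be rejected almost always.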
