Deriving a level-alpha likelihood-ratio test for a simple null hypothesis versus a two-sided composite alternative hypothesis

hypothesis-testing, statistical-inference, statistics

Let $X_1,\ldots,X_n$ be iid $N(0,\sigma^2)$ where $\sigma>0$ is unknown. With a preassigned $\alpha \in (0,1)$, my task is to derive a level-$\alpha$ LR test for $H_0: \sigma=\sigma_0$ versus $H_1: \sigma \ne \sigma_0$. Here is my work thus far:

For some appropriate $c$ in $(0,1)$, we will reject the null if $c>\lambda(x)=\frac{L(\sigma_0\mid x)}{\sup_{\Theta}L(\sigma\mid x)}$. Because the denominator is unrestricted with respect to $\sigma$, we plug the MLE $\hat\sigma^2=n^{-1}\sum_{i=1}^nX_i^2$ into the denominator. After some algebra, I get this:

Reject the null if $c>\lambda(x)=\left(\frac{n^{-1}\sum_{i=1}^nX_i^2}{\sigma_0^2}\right)^{n/2}e^{-\frac{\sum_{i=1}^nX_i^2}{2\sigma_0^2}+\frac{n}{2}}$. I don't believe it's possible to isolate the sum of squares in this case, so how can I identify the rejection region? Can I use the Karlin-Rubin theorem after finding the distribution of the sum of squares?

Edit: I don't think I can use Karlin-Rubin because the alternative in this question is two-sided composite.

Best Answer

You have to simplify the likelihood ratio statistic and study the nature of the resulting function (often easier to consider the ratio as a function of a sufficient statistic) to find the cutoff point subject to a level/size restriction.

The likelihood function given the sample $(x_1,\ldots,x_n)\in\mathbb R^n$ is $$L(\sigma)=\frac{1}{(\sigma\sqrt{2\pi})^n}\exp\left[-\frac{1}{2\sigma^2}\sum_{i=1}^nx_i^2\right]\quad,\,\sigma>0$$

Unrestricted MLE of $\sigma$ is $$\hat\sigma=\sqrt{\frac{T}{n}}\quad,\,T=\sum_{i=1}^n x_i^2$$

So the LR test statistic is

\begin{align} \Lambda(x_1,\ldots,x_n)&=\frac{L(\sigma_0)}{L(\hat\sigma)} \\ &=\left(\frac{T}{n\sigma_0^2}\right)^{n/2}e^{-\frac{1}{2}\left(T/\sigma_0^2-n\right)} \end{align}
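
As a quick sanity check, here is a minimal numerical sketch in Python (the values $n=20$, $\sigma_0=1$, and the simulated sample are assumptions chosen purely for illustration) confirming that this closed form agrees with the ratio of the two likelihoods computed directly:

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma0 = 20, 1.0          # hypothetical sample size and null value
x = rng.normal(0.0, 1.3, n)  # simulated data (true sigma = 1.3, just for illustration)

T = np.sum(x**2)
sigma_hat = np.sqrt(T / n)   # unrestricted MLE of sigma

def log_lik(sigma):
    """Log-likelihood of N(0, sigma^2) for the sample x."""
    return -n * np.log(sigma * np.sqrt(2 * np.pi)) - T / (2 * sigma**2)

# Likelihood ratio two ways: directly, and via the closed form above
lam_direct = np.exp(log_lik(sigma0) - log_lik(sigma_hat))
lam_closed = (T / (n * sigma0**2))**(n / 2) * np.exp(-0.5 * (T / sigma0**2 - n))

print(lam_direct, lam_closed)  # the two values agree
```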

Therefore, for some constant $k$ (determined by $c$, $n$ and $\sigma_0$), $$\Lambda(x_1,\ldots,x_n)<c\iff\underbrace{ n\ln T-T/\sigma_0^2}_{g(T)}<k$$

You can verify that $g$ is concave: $g'(T)=n/T-1/\sigma_0^2$ and $g''(T)=-n/T^2<0$, so $g$ increases up to its maximum at $T=n\sigma_0^2$ and decreases thereafter.

If you sketch a rough plot of $g$, you will see that for some $(c_1,c_2)$ with $c_1<c_2$,

$$g(T)<k\iff T<c_1\quad\text{ or }\quad T>c_2$$
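
Because $g$ peaks at $T=n\sigma_0^2$, the two endpoints for any given $k$ can be found by root-finding on either side of the peak. Here is a minimal sketch (Python with SciPy; $n=20$, $\sigma_0=1$ and the particular $k$ are illustrative assumptions, not yet the level-$\alpha$ calibration, which comes next):

```python
import numpy as np
from scipy.optimize import brentq

n, sigma0 = 20, 1.0               # hypothetical values for illustration
g = lambda T: n * np.log(T) - T / sigma0**2

T_peak = n * sigma0**2            # g'(T) = n/T - 1/sigma0^2 = 0  =>  T = n*sigma0^2
k = g(T_peak) - 2.0               # an arbitrary level below the maximum, for illustration

# g(T) = k has exactly one root on each side of the peak because g is concave
c1 = brentq(lambda T: g(T) - k, 1e-9, T_peak)
c2 = brentq(lambda T: g(T) - k, T_peak, 100 * T_peak)
print(c1, c2)                     # {g(T) < k} = {T < c1} union {T > c2}
```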

Under $H_0$, the $X_i/\sigma_0$ are iid standard normal, so $$T/\sigma_0^2 \sim \chi_n^2$$

And $(c_1,c_2)$ is chosen to satisfy both the size condition $$P_{H_0}(T<c_1)+P_{H_0}(T>c_2)=\alpha$$ and $$g(c_1)=g(c_2)$$
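
One way to solve this pair of equations numerically is sketched below (Python with SciPy; $n=20$, $\sigma_0=1$, $\alpha=0.05$ are illustrative assumptions). Writing $u=T/\sigma_0^2$, so that $u\sim\chi_n^2$ under $H_0$, the condition $g(c_1)=g(c_2)$ becomes $n\ln u_1-u_1=n\ln u_2-u_2$:

```python
import numpy as np
from scipy.stats import chi2
from scipy.optimize import fsolve

n, sigma0, alpha = 20, 1.0, 0.05   # illustrative values (assumptions)

def h(u):
    # g(sigma0^2 * u) up to an additive constant, with u = T / sigma0^2
    return n * np.log(u) - u

def equations(z):
    u1, u2 = z
    return [chi2.cdf(u1, n) + chi2.sf(u2, n) - alpha,  # size condition
            h(u1) - h(u2)]                             # g(c1) = g(c2)

# equal-tailed chi-square quantiles are a sensible starting guess
start = [chi2.ppf(alpha / 2, n), chi2.ppf(1 - alpha / 2, n)]
u1, u2 = fsolve(equations, start)

c1, c2 = sigma0**2 * u1, sigma0**2 * u2   # cutoffs on the scale of T
print(c1, c2)                             # reject H0 when T < c1 or T > c2
```

In practice, the equal-tailed cutoffs $\chi^2_{n,\alpha/2}$ and $\chi^2_{n,1-\alpha/2}$ are often used instead; that drops the $g(c_1)=g(c_2)$ condition but is a common and convenient approximation to the exact LR cutoffs.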