LR-test for a Pareto$(1,\beta)$-distribution

hypothesis-testing, statistics

Let $X_1,\ldots,X_n$ be a sample from a Pareto$(1,\beta)$-distribution with density

$$f(x \mid \beta) = \begin{cases}
\beta x^{-(\beta+1)} & x \ge 1 \\
0 & \, \text{else}
\end{cases}$$

For a given $\gamma \in \mathbb{R}$ we test $H_0: \beta \ge \gamma$ against $H_A: \beta < \gamma$. On which statistic does the LR test for $\beta$ depend, and what is its acceptance region?

So far I have computed the MLE for $\beta$:

$L_n(\beta \mid X) = \prod_{i=1}^n \beta x_i^{-(\beta+1)} = \beta^n\prod_{i=1}^n x_i^{-(\beta+1)}$

The log-likelihood is thus:

$l(\beta \mid X) = n \log(\beta) - (\beta+1) \sum_{i=1}^n \log(x_i)$

$l^\prime (\beta \mid X) = \frac{n}{\beta} - \sum_{i=1}^n \log(x_i)$

Therefore the MLE $\hat{\beta}$ for $\beta$ is given by

$$\hat{\beta} = \frac{n}{\sum_{i=1}^n \log(x_i)}$$
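
A quick numerical sanity check of this closed form (my own sketch, not part of the exercise): simulate Pareto$(1,\beta)$ data by inverse-transform sampling and compare $\hat{\beta} = n/\sum_i \log x_i$ with the true shape; the value `beta_true = 2.5` and the sample size are arbitrary choices for the illustration.

```python
import numpy as np

# Sanity check of the closed-form MLE beta_hat = n / sum(log x_i).
# Pareto(1, beta) has CDF F(x) = 1 - x^(-beta) for x >= 1, so inverse-transform
# sampling gives X = (1 - U)^(-1/beta); since 1 - U is again Uniform(0, 1),
# X = U^(-1/beta) works just as well.
rng = np.random.default_rng(0)
beta_true = 2.5                      # arbitrary shape parameter for the illustration
n = 10_000
x = rng.uniform(size=n) ** (-1.0 / beta_true)

beta_hat = n / np.sum(np.log(x))
print(beta_true, beta_hat)           # beta_hat should land close to beta_true
```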

I know that the LR test is given by

$$\frac{L(\hat{\beta}_0 \mid X)}{L(\hat{\beta} \mid X)},$$

where $\hat{\beta}_0$ denotes the MLE under the null restriction $\beta \ge \gamma$. However, I do not see how to proceed from here; I see no way to simplify this quotient. Could you please give me a hint?

Best Answer

Suppose we carry out the LR test at level $\alpha$ ($0 < \alpha < 1$) and denote the power function by $\psi(\beta)$. By the definition of the size of the test, $\alpha = \underset{\beta \geq \gamma}{\sup}\,\psi(\beta)$. Since you have already found the unrestricted MLE, we next need the null MLE over the subset $\Theta^0 = [\gamma, \infty)$ of $\Theta$. Writing $\overline{\log X} = \frac{1}{n}\sum_{i=1}^n \log X_i$,

$$
l(\beta) = n\bigl(\log\beta - (\beta + 1)\,\overline{\log X}\bigr), \qquad
\hat{\beta}^0 = \gamma \vee \frac{1}{\overline{\log X}}.
$$

We can now compute the LRT statistic:

$$
\Lambda_n = 2\bigl(l(\hat{\beta}) - l(\hat{\beta}^0)\bigr)
= 2n\bigl(\gamma\,\overline{\log X} - 1 - \log(\gamma\,\overline{\log X})\bigr)\,
\mathbb{I}\bigl(\gamma\,\overline{\log X} > 1\bigr).
$$

Since the LRT statistic is a convex function of $\gamma\,\overline{\log X}$, the critical region has the form

$$
2n\bigl(\gamma\,\overline{\log X} - 1 - \log(\gamma\,\overline{\log X})\bigr) \geq c
\;\Longleftrightarrow\;
\gamma\,\overline{\log X} \leq c_1 \ \text{ or } \ \gamma\,\overline{\log X} \geq c_2,
\qquad \text{where } c_1 - \log c_1 = c_2 - \log c_2.
$$

The power function is then

$$
\psi(\beta) = \mathbb{P}_\beta\bigl(\gamma\,\overline{\log X} \leq c_1\bigr)
+ \mathbb{P}_\beta\bigl(\gamma\,\overline{\log X} \geq c_2\bigr).
$$

To determine the constants $c_1, c_2$ for a test of size $\alpha$, we look for a known distribution related to this statistic. By a change of variables, $\log X_i \sim \mathrm{Exp}(1/\beta)$ (exponential with mean $1/\beta$), hence

$$
2n\beta\,\overline{\log X} = 2\beta\sum_{i=1}^n \log X_i \sim \chi^2(2n)
\quad\therefore\quad
\psi(\beta) = \mathbb{P}_\beta\!\left(\chi^2(2n) \leq \frac{2n\beta}{\gamma}c_1\right)
+ \mathbb{P}_\beta\!\left(\chi^2(2n) \geq \frac{2n\beta}{\gamma}c_2\right).
$$

Assuming the power function is decreasing in $\beta$, its supremum over $\beta \geq \gamma$ is attained at $\beta = \gamma$, so the critical region

$$
\gamma\,\overline{\log X} \leq c_1 \ \text{ or } \ \gamma\,\overline{\log X} \geq c_2,
\qquad \text{where } c_1 - \log c_1 = c_2 - \log c_2,
$$

has size $\alpha$ when

$$
\int_{2nc_1}^{2nc_2} f_{\chi^2(2n)}(y)\,dy = 1 - \alpha.
$$
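
To make the acceptance region concrete, here is a rough numerical sketch of the construction above (my own illustration, not part of the original answer): it solves for the constants $c_1 < 1 < c_2$ with $c_1 - \log c_1 = c_2 - \log c_2$ and chi-square coverage $1-\alpha$ at $\beta = \gamma$, then checks whether $c_1 \le \gamma\,\overline{\log X} \le c_2$. The helper names `acceptance_constants` and `lr_test_accepts` are made up for this sketch.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2


def acceptance_constants(n, alpha):
    """Solve for c1 < 1 < c2 with c1 - log(c1) = c2 - log(c2) and
    P(2n*c1 <= chi2(2n) <= 2n*c2) = 1 - alpha (the size condition at beta = gamma)."""
    def matching_c2(c1):
        g = c1 - np.log(c1)
        # c2 > 1 solving c2 - log(c2) = c1 - log(c1)
        return brentq(lambda c: c - np.log(c) - g, 1.0 + 1e-12, 1e3)

    def coverage_gap(c1):
        c2 = matching_c2(c1)
        cov = chi2.cdf(2 * n * c2, df=2 * n) - chi2.cdf(2 * n * c1, df=2 * n)
        return cov - (1 - alpha)

    c1 = brentq(coverage_gap, 1e-8, 1 - 1e-9)   # coverage_gap is decreasing in c1
    return c1, matching_c2(c1)


def lr_test_accepts(x, gamma, alpha=0.05):
    """True if H0: beta >= gamma is accepted at level alpha, using the region above."""
    n = len(x)
    t = gamma * np.mean(np.log(x))              # the statistic gamma * mean(log X_i)
    c1, c2 = acceptance_constants(n, alpha)
    return c1 <= t <= c2


# Example at the boundary beta = gamma = 2: then 2*n*gamma*mean(log X) ~ chi2(2n),
# so H0 is accepted with probability about 1 - alpha.
rng = np.random.default_rng(0)
x = rng.uniform(size=200) ** (-1.0 / 2.0)       # Pareto(1, 2) via inverse transform
print(lr_test_accepts(x, gamma=2.0, alpha=0.05))
```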
