[Math] For a random sample from the distribution $f(x)=e^{-(x-\theta)} , x>\theta$ , show that $2n[X_{(1)}-\theta]\sim\chi^2_{2}$

order-statistics, probability, probability distributions, probability theory, statistics

Show that for a random sample of size $n$ from the distribution $f(x)=e^{-(x-\theta)},\ x>\theta$, the statistic $2n[X_{(1)}-\theta]$ has the $\chi^2_{2}$ distribution, that $2\sum_{i=2}^{n}[X_{(i)}-X_{(1)}]$ has the $\chi^2_{2n-2}$ distribution, and that the two statistics are independent.
Here, $X_{(i)}$ denotes the $i$th order statistic.

My approach:

I applied the following sequence of transformations:
$(X_1,X_2,…,X_n) \rightarrow (Y_1,Y_2,…,Y_n) \rightarrow (Y_{(1)},Y_{(2)},…,Y_{(n)}) \rightarrow (U_1,U_2,…,U_n)$

where $Y_i=X_i-\theta$, $U_1=2nY_{(1)}$, and $U_{i}=2(Y_{(i)}-Y_{(1)})$ for $i=2,3,…,n$.

So, first the joint pdf of $X_1,X_2,…,X_n$ is given by

$f(x_1,x_2,…x_n)=e^{-\sum_{i=1}^{n}(x_i-\theta)} I_{x_i > \theta}$

Again, you can see $f(y_1,y_2,…,y_n)=e^{-\sum y_i} I_{y_i>0}$.
Then the joint pdf of the order statistics is $f_{1,2,…,n}(y_1,…,y_n)=n!\,e^{-\sum y_i} I_{0<y_1<y_2<…<y_n}$.
Now transforming to $U$, the Jacobian of the transformation comes out to $\frac{1}{n2^n}$.
Thus, $f(u_1,u_2,…,u_n)=\frac{(n-1)!}{2^n}e^{-\sum u_i/2}\, I_{u_1>0}\, I_{0<u_2<…<u_n}$.
From here I can deduce $U_1 \sim \chi^2_{2}$, but I cannot deduce anything about the remaining variables.
Help!

Best Answer

I think an easier-to-follow (and simpler) proof uses a different change of variables.

We have the joint density of the order statistics $(U_1=X_{(1)},\cdots,U_n=X_{(n)})$

$$f_{\mathbf U}(u_1,\cdots,u_n)=n!\exp\left[-\sum_{i=1}^nu_i+n\theta\right]\mathbf1_{\theta<u_1<u_2<\cdots<u_n}$$

Now transform $(U_1,\cdots,U_n)\to(Y_1,\cdots,Y_n)$ with $Y_i=(n-i+1)(U_i-U_{i-1})$ for $i=1,2,\cdots,n$, where we take $U_0=\theta$.

It follows that $\sum_{i=1}^nu_i=\sum_{i=1}^ny_i+n\theta$. The Jacobian determinant comes out as $n!$.
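Both claims can be verified directly. The sum telescopes (recall $u_0=\theta$):

$$\sum_{i=1}^n y_i=\sum_{i=1}^n(n-i+1)(u_i-u_{i-1})=\sum_{i=1}^{n}(n-i+1)u_i-\sum_{i=1}^{n-1}(n-i)u_i-n\theta=\sum_{i=1}^n u_i-n\theta,$$

and the matrix $\left(\partial y_i/\partial u_j\right)$ is lower triangular with diagonal entries $n-i+1$, so its determinant is $n(n-1)\cdots 1=n!$.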

So you get the joint density of $(Y_1,\cdots,Y_n)$

$$f_{\mathbf Y}(y_1,\cdots,y_n)=\exp\left[-\sum_{i=1}^ny_i\right]\mathbf1_{y_1,\cdots,y_n>0}$$

Not surprisingly, the (scaled) spacings of successive order statistics from an exponential sample come out independent. In fact, the $Y_i$'s are i.i.d. exponential with mean $1$ for all $i=1,2,\cdots,n$.

This implies $2Y_i\stackrel{\text{i.i.d.}}{\sim}\chi^2_2$ for all $i=1,2,\cdots,n$.
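(This uses the standard fact that twice a mean-$1$ exponential is $\chi^2_2$: if $Y\sim\text{Exp}(1)$ and $Z=2Y$, then

$$f_Z(z)=\tfrac12 f_Y(z/2)=\tfrac12 e^{-z/2},\qquad z>0,$$

which is exactly the $\chi^2_2$ density.)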

So we have two independent variables $2Y_1$ and $\sum_{i=2}^n2Y_i$. Both have the chi-square distribution --- the former with $2$ degrees of freedom and the latter with $2n-2$ degrees of freedom.

It now takes only a telescoping sum to see that $2Y_1=2n(X_{(1)}-\theta)$ and $2\sum_{i=2}^nY_i=2\sum_{i=2}^n(X_{(i)}-X_{(1)})$, which completes the proof.
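As a sanity check, a quick Monte Carlo simulation agrees with both claims (a sketch in NumPy; the choices $n=5$, $\theta=2$, and the seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n, theta, reps = 5, 2.0, 200_000

# Sample from f(x) = e^{-(x-theta)}, x > theta: a standard exponential shifted by theta
x = theta + rng.exponential(scale=1.0, size=(reps, n))
x.sort(axis=1)  # each row now holds the order statistics X_(1) <= ... <= X_(n)

t1 = 2 * n * (x[:, 0] - theta)               # claimed chi^2 with 2 df
t2 = 2 * (x[:, 1:] - x[:, [0]]).sum(axis=1)  # claimed chi^2 with 2n-2 df

# chi^2_k has mean k, so t1 should average ~2 and t2 should average ~2n-2 = 8;
# near-zero correlation is consistent with (though weaker than) independence
print(t1.mean(), t2.mean(), np.corrcoef(t1, t2)[0, 1])
```

The printed means land close to $2$ and $2n-2$, and the correlation is near zero; a goodness-of-fit test against the $\chi^2$ densities would be a sharper check.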