It looks right, and you can check it using the factorization criterion:
$$
L(\lambda; X_1,\ldots,X_n) = \underbrace{\lambda^{-2n} \exp\left\{-\frac{1}{\lambda^2}\sum X_i^2\right\}}_{g(\lambda;\, T(X))} \times \underbrace{2^n\prod X_i}_{h(X)}.
$$
Since $g$ depends on the data only through $T(X)=\sum X_i^2$, the factorization theorem shows that $\sum X_i^2$ is sufficient for $\lambda$. It is in fact minimal sufficient: the likelihood ratio of two samples is free of $\lambda$ precisely when the samples share the same value of $\sum X_i^2$.
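As a quick numerical sanity check (assuming the underlying density is the Rayleigh-type $f(x;\lambda)=\frac{2x}{\lambda^2}e^{-x^2/\lambda^2}$, which is the density the factored likelihood above corresponds to), the product of densities agrees with the factored form $g\cdot h$ for any $\lambda$:

```python
import math
import random

def likelihood_direct(lam, xs):
    # Product of Rayleigh-type densities f(x; lam) = (2x/lam^2) exp(-x^2/lam^2)
    return math.prod((2 * x / lam**2) * math.exp(-x**2 / lam**2) for x in xs)

def likelihood_factored(lam, xs):
    n = len(xs)
    # g depends on the data only through the sum of squares T(X)
    g = lam**(-2 * n) * math.exp(-sum(x**2 for x in xs) / lam**2)
    # h is free of lambda
    h = 2**n * math.prod(xs)
    return g * h

random.seed(0)
xs = [random.uniform(0.1, 2.0) for _ in range(8)]
for lam in (0.5, 1.0, 2.0):
    assert math.isclose(likelihood_direct(lam, xs), likelihood_factored(lam, xs))
```

The agreement for every $\lambda$ is exactly what the factorization theorem needs: all $\lambda$-dependence sits inside $g(\lambda; T(X))$.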
The probability of the complementary event is $$P\left[\bigcup_{j=1}^n\left\{\frac{nU_{(j)}}{j}\le \alpha\right\}\right]=P\left[\min_{1\le j\le n}\left\{\frac{nU_{(j)}}{j}\right\}\le \alpha\right]$$
where $U_{(1)}\le \cdots \le U_{(n)}$ are the order statistics of $n$ i.i.d. $U(0,1)$ random variables. Using induction, we can show that $T_n=\min\limits_{1\le j\le n}\left\{\frac{nU_{(j)}}{j}\right\}$ has a uniform distribution on $(0,1)$.
For $n=1$, clearly $T_1=U_1 \sim U(0,1)$. Now suppose $T_{k-1}\sim U(0,1)$.
Then for $x\in [0,1]$,
\begin{align}
P(T_k \le x)&=\int_0^1 P(T_k\le x \mid U_{(k)}=t)f_{U_{(k)}}(t)\,dt
\\&=\int_0^x f_{U_{(k)}}(t)\,dt + \int_x^1 P(T_k\le x \mid U_{(k)}=t)f_{U_{(k)}}(t)\,dt
\\&=\int_0^x k t^{k-1}\,dt + \int_x^1 P\left(T_{k-1}\le \left(\frac{k-1}{kt}\right)x \mid U_{(k)}=t \right)kt^{k-1}\,dt \tag{1}
\\&= x^k + \int_x^1 \left(\frac{k-1}{kt}\right) x\cdot kt^{k-1}\,dt \tag{2}
\\&= x
\end{align}
In $(1)$, we used the fact that, conditional on $U_{(k)}=t$ with $x<t<1$, the term $j=k$ contributes $\frac{kU_{(k)}}{k}=t>x$ and therefore cannot attain a minimum below $x$:
\begin{align}
T_k \le x &\iff \min_{1\le j\le k-1}\left\{\frac{k U_{(j)}}{j}\right\} \le x \qquad \left[\because\, x<t<1 \right]
\\& \iff \min_{1\le j\le k-1}\left\{\frac{(k-1) U_{(j)}}{jt}\right\} \le \left(\frac{k-1}{kt}\right) x
\\& \iff T_{k-1} \le \left(\frac{k-1}{kt}\right) x
\end{align}
And $(2)$ follows from $(1)$ because the vector $\left(\frac{U_{(1)}}{U_{(k)}},\frac{U_{(2)}}{U_{(k)}},\ldots,\frac{U_{(k-1)}}{U_{(k)}}\right)$ is independent of $U_{(k)}$ and is distributed as the order statistics of $k-1$ i.i.d. $U(0,1)$ variables, so the induction hypothesis gives $P\left(T_{k-1}\le \left(\frac{k-1}{kt}\right)x \,\middle|\, U_{(k)}=t\right)=\left(\frac{k-1}{kt}\right)x$.
So, $T_k \sim U(0,1)$ whenever $T_{k-1}\sim U(0,1)$.
Hence $T_n \sim U(0,1)$ for every $n$, and your desired probability is $1-\alpha$.
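A Monte Carlo sketch of this result (the sample size, number of replications, and seed below are illustrative choices): simulate $T_n=\min_j \{nU_{(j)}/j\}$ and compare its empirical CDF with the uniform CDF.

```python
import random

def simes_min(n, rng):
    # Order statistics of n i.i.d. U(0,1) draws
    u = sorted(rng.random() for _ in range(n))
    # T_n = min over j of n * U_(j) / j   (j+1 converts 0-based index to rank)
    return min(n * u[j] / (j + 1) for j in range(n))

rng = random.Random(42)
n, reps = 5, 200_000
samples = [simes_min(n, rng) for _ in range(reps)]

# If T_n ~ U(0,1), the empirical CDF at x should be close to x
for x in (0.1, 0.25, 0.5, 0.9):
    ecdf = sum(s <= x for s in samples) / reps
    print(f"P(T_{n} <= {x}) ~ {ecdf:.3f}")
```

In particular, the empirical value of $P[T_n\le\alpha]$ should hover near $\alpha$, matching the claim that the original probability is $1-\alpha$.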
Reference: R. J. Simes, "An improved Bonferroni procedure for multiple tests of significance", Biometrika 73 (1986), 751–754.
Best Answer
Assuming $X_1,\ldots,X_n$ are independent and identically distributed (here, $U(0,b)$, as the Basu argument below uses),
$$E\left[\sum_{i=1}^n \frac{X_i}{X_{(n)}}\right]=\sum_{i=1}^n E\left[\frac{X_{i}}{X_{(n)}}\right]=nE\left[\frac{X_1}{X_{(n)}}\right]$$
This can be evaluated by showing that $X_1/X_{(n)}$ is independent of $X_{(n)}$, so that
$$E\left[X_1\right]=E\left[\frac{X_1}{X_{(n)}}\cdot X_{(n)}\right]=E\left[\frac{X_1}{X_{(n)}}\right]\cdot E\left[X_{(n)}\right]$$
which gives $$E\left[\frac{X_1}{X_{(n)}}\right]=\frac{E\left[X_1\right]}{E\left[X_{(n)}\right]}$$
The independence follows from Basu's theorem: the distribution of $\frac{X_1}{X_{(n)}}=\frac{X_1/b}{X_{(n)}/b}$ does not depend on $b$ (making $\frac{X_1}{X_{(n)}}$ an ancillary statistic), while $X_{(n)}$ is a complete sufficient statistic for $b$.
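A Monte Carlo sketch under the assumption $X_i\sim U(0,b)$, with illustrative values $b=3$, $n=4$: since $E[X_1]=b/2$ and $E[X_{(n)}]=\frac{nb}{n+1}$, the formula above predicts $nE[X_1]/E[X_{(n)}]=(n+1)/2$.

```python
import random

rng = random.Random(7)
b, n, reps = 3.0, 4, 100_000

total = 0.0
for _ in range(reps):
    xs = [rng.uniform(0.0, b) for _ in range(n)]
    m = max(xs)                      # X_(n), the sample maximum
    total += sum(x / m for x in xs)  # sum of X_i / X_(n) for this sample
sim = total / reps

# E[X_1] = b/2 and E[X_(n)] = n*b/(n+1), so n*E[X_1]/E[X_(n)] = (n+1)/2
exact = (n + 1) / 2
print(f"simulated: {sim:.4f}, exact: {exact}")
```

Note that $b$ cancels out of the answer, consistent with $\sum_i X_i/X_{(n)}$ being ancillary for the scale parameter.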