Your claimed result is not true, which probably explains why you're having trouble seeing it.
For simplicity I'll let $a = 0, b = 1$. Results for general $a$ and $b$ can be obtained by a linear transformation.
Let $X_1, \ldots, X_n$ be independent uniform $(0,1)$; let $Y$ be their minimum and let $X$ be their maximum. Then the probability that $X \in [x, x+\delta x]$ and $Y \in [y, y+\delta y]$, for some small $\delta x$ and $\delta y$, is, to first order,
$$ n(n-1) (\delta x) (\delta y) (x-y)^{n-2} $$
since we have to choose which of $X_1, \ldots, X_n$ is the smallest and which is the largest; then we need the minimum and maximum to fall in the correct intervals; then finally we need everything else to fall in the interval of size $x-y$ in between. The joint density is therefore $f_{X,Y}(x,y) = n(n-1) (x-y)^{n-2}$.
Then the density of $Y$ can be obtained by integrating. Alternatively, $P(Y \ge y) = (1-y)^n$ and so $f_Y(y) = n(1-y)^{n-1}$.
The conditional density you seek is then
$$ f_{X|Y}(x|y) = {n(n-1) (x-y)^{n-2} \over n(1-y)^{n-1}} = {(n-1) (x-y)^{n-2} \over (1-y)^{n-1}}, $$
where of course we restrict to $x > y$.
For a numerical example, let $n = 5, y = 2/3$. Then we get $f_{X|Y}(x|y) = 4 (x-2/3)^3 / (1/3)^4 = 324 (x-2/3)^3$ on $2/3 \le x \le 1$. This is larger near $1$ than near $2/3$, which makes sense -- it's hard to squeeze a lot of points into a small interval!
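As a sanity check on this conditional density (my own sketch, not part of the original argument): integrating $x$ against the density gives $E[X \mid Y = y] = y + (1-y)\frac{n-1}{n}$, which for $n = 5$, $y = 2/3$ is $14/15 \approx 0.9333$. The simulation below conditions on the minimum of five uniforms landing near $2/3$ and compares the empirical mean of the maximum against that value.

```python
import random

random.seed(42)
n, y0, eps = 5, 2/3, 0.02  # condition on the minimum falling within eps of y0
cond_max = []
for _ in range(600_000):
    xs = [random.random() for _ in range(n)]
    if abs(min(xs) - y0) < eps:
        cond_max.append(max(xs))

emp_mean = sum(cond_max) / len(cond_max)
# Analytic conditional mean: E[X | Y = 2/3] = 2/3 + (1/3)(4/5) = 14/15 ~= 0.9333
print(emp_mean)
```

The small window around $y_0$ introduces a bias of at most about $\varepsilon/5$ here, well below the Monte Carlo noise at this sample size.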
The result you quote holds only when $n = 2$ -- if I have two IID uniform(0,1) random variables, then conditional on a choice of the minimum, the maximum is uniform on the interval between the minimum and 1. This is because we don't have to worry about fitting points between the minimum and the maximum, because there are $n - 2 = 0$ of them.
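The $n = 2$ case can also be checked numerically. This sketch (my own, with an arbitrary seed) conditions on the minimum of two uniforms landing near $1/2$ and verifies that about half of the conditional maxima fall below the midpoint $3/4$, as uniformity on $(1/2, 1)$ predicts.

```python
import random

random.seed(7)
y0, eps = 0.5, 0.02  # condition on the minimum falling within eps of y0
cond_max = []
for _ in range(200_000):
    a, b = random.random(), random.random()
    lo, hi = min(a, b), max(a, b)
    if abs(lo - y0) < eps:
        cond_max.append(hi)

# If max | min = 1/2 is uniform on (1/2, 1), about half the draws lie below 3/4
frac = sum(m < 0.75 for m in cond_max) / len(cond_max)
print(frac)
```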
This is in the domain of extreme value theory, and I found that a good reference for it is this book. Indeed, its authors point out a nice parallel between the law of large numbers and the central limit theorem on one hand and extreme value theory on the other.
Now, the chi-squared distribution with one degree of freedom, $\chi^2_1$, is the Gamma distribution $\Gamma(1/2,2)$. The distribution of the maximum of $n$ Gamma-distributed random variables converges, after suitable centering and scaling, to the Gumbel distribution as $n\rightarrow\infty$, and Table 3.4.4 (see page 156) of the aforementioned reference states that $a_n(\max Y_i-b_n)\rightarrow \Lambda$, where $Y_i\sim\Gamma(k,\theta)$, $\theta$ is the scale (rather than the rate) parameter of the Gamma distribution, $a_n=1/\theta$, $b_n=\theta(\ln n+(k-1)\ln \ln n-\ln\Gamma(k))$, and $P(\Lambda\leq x)=e^{-e^{-x}}$.
Thus, in your case $\theta = 2$ and $k = 1/2$, so $a_n=\frac{1}{2}$ and $b_n=2\left(\ln n-\frac{1}{2}\ln\ln n-\ln\Gamma(\tfrac{1}{2})\right)=2\ln n-\ln\ln n-\ln\pi$, using $\Gamma(\tfrac{1}{2})=\sqrt{\pi}$.
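As a rough numerical illustration (my own sketch, not from the reference): substituting $\theta = 2$, $k = 1/2$ into the table's formulas gives $a_n = 1/2$ and $b_n = 2\ln n - \ln\ln n - \ln\pi$. The simulation below draws maxima of $n$ $\chi^2_1$ variables and checks that $P(a_n(\max Y_i - b_n) \le 0)$ is near the Gumbel value $e^{-e^{0}}$... i.e. $e^{-1} \approx 0.368$. Convergence for Gamma maxima is slow (errors of order $1/\ln n$), so only loose agreement should be expected.

```python
import math
import random

random.seed(1)
n, reps = 1000, 1000
a_n = 0.5
b_n = 2 * math.log(n) - math.log(math.log(n)) - math.log(math.pi)

below = 0
for _ in range(reps):
    m = max(random.gauss(0, 1) ** 2 for _ in range(n))  # max of n chi^2_1 draws
    if a_n * (m - b_n) <= 0:
        below += 1

frac = below / reps
# Gumbel limit predicts P(<= 0) = exp(-1) ~= 0.368; agreement is only rough
print(frac)
```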
Note that, if $Y_1, \ldots, Y_n$ are IID uniform$(0,1)$ with maximum $M_n$, then for $0 \le b_n + x/a_n \le 1$, $$\Pr(a_n(M_n - b_n)\le x) = \Pr(M_n\le b_n +x/a_n) = \prod_{i=1}^n \Pr(Y_i\le b_n +x/a_n) = \left(b_n +x/a_n \right)^n.$$ So, one possible choice is $b_n=1$ and $a_n=n.$ In that case $n(M_n - 1)\Rightarrow Z$, where $-Z$ follows the standard exponential distribution.
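Because the CDF of $n(M_n - 1)$ is available in closed form, the convergence can be checked deterministically: $\Pr(n(M_n-1)\le x) = (1+x/n)^n \to e^x$ for $x \le 0$. A minimal Python check (my own, not from the answer):

```python
import math

n = 10**6
for x in [-3.0, -1.0, -0.1]:
    # Exact CDF of n(M_n - 1), where M_n is the max of n iid uniform(0,1) draws
    exact = (1 + x / n) ** n
    assert abs(exact - math.exp(x)) < 1e-5
print("ok")
```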
Let $Y_i = 1/X_i$ and let $M_n$ now denote the maximum of $Y_1, \ldots, Y_n.$ Note that $$\Pr(a_n(M_n - b_n)\le x) = \prod_{i=1}^n \Pr(Y_i\le b_n +x/a_n) = \prod_{i=1}^n \Pr\left(X_i \ge \left(b_n +x/a_n\right)^{-1}\right) = \left(1-\frac{a_n}{a_n b_n + x}\right)^n.$$ Now, it is well known that if $\displaystyle\lim_{n\to\infty} x_n = x,$ then $\displaystyle\lim_{n\to\infty} (1+x_n/n)^n = e^x.$ Using this fact, we can write $$\lim_{n\to\infty} \left(1-\frac{a_n}{a_n b_n + x}\right)^n = \exp\left(-\lim_{n\to\infty}\frac{na_n}{a_n b_n + x}\right).$$ Thus, we can take $a_n = 1/n$ and $b_n=0,$ for which the last limit equals $\exp(-1/x)$ for $x > 0$. Thus, $n^{-1}(M_n - 0)\Rightarrow Z$, where $Z$ has CDF $F(x) = \exp(-1/x)\, \mathbf{1}(x > 0).$
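The same kind of deterministic check works here: since $\Pr(Y_i \le t) = 1 - 1/t$ for $t \ge 1$, the CDF of $n^{-1}M_n$ is exactly $(1 - 1/(nx))^n$ for $nx \ge 1$, which should approach $\exp(-1/x)$. A quick sketch (my own):

```python
import math

n = 10**6
for x in [0.5, 1.0, 2.0, 5.0]:
    # Exact CDF of M_n / n, where M_n is the max of n iid 1/uniform(0,1) draws
    exact = (1 - 1 / (n * x)) ** n
    assert abs(exact - math.exp(-1 / x)) < 1e-5
print("ok")
```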
Note that if $b_n$ is any sequence such that $b_n/n$ converges, say to $b,$ then $n^{-1}b_n\to b$, and $n^{-1} M_n \Rightarrow Z$ implies that $n^{-1} (M_n - b_n) \Rightarrow Z-b,$ whose CDF is $F(x+b).$