Your claimed result is not true, which probably explains why you're having trouble seeing it.
For simplicity I'll let $a = 0, b = 1$. Results for general $a$ and $b$ can be obtained by a linear transformation.
Let $X_1, \ldots, X_n$ be independent uniform $(0,1)$; let $Y$ be their minimum and let $X$ be their maximum. Then the probability that $X \in [x, x+\delta x]$ and $Y \in [y, y+\delta y]$, for some small $\delta x$ and $\delta y$, is
$$ n(n-1) (\delta x) (\delta y) (x-y)^{n-2} $$
since we have to choose which of $X_1, \ldots, X_n$ is the smallest and which is the largest; then we need the minimum and maximum to fall in the correct intervals; then finally we need everything else to fall in the interval of size $x-y$ in between. The joint density is therefore $f_{X,Y}(x,y) = n(n-1) (x-y)^{n-2}$.
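This joint density is easy to sanity-check by simulation. Here is a sketch in Python (the box endpoints, $n$, and the sample size are my own choices, not from the answer): for $n = 3$ the probability of $\{X \in [0.80, 0.85],\ Y \in [0.20, 0.25]\}$ is the integral of $6(x-y)$ over that box, which (since the integrand is linear) equals $6 \cdot 0.0025 \cdot 0.6 = 0.009$ exactly.

```python
import random

# Monte Carlo check of the joint density f_{X,Y}(x, y) = n(n-1)(x-y)^(n-2).
# For n = 3, P(X in [0.80, 0.85], Y in [0.20, 0.25]) = 0.009 exactly.
random.seed(0)
n, trials = 3, 500_000
hits = 0
for _ in range(trials):
    draws = [random.random() for _ in range(n)]
    if 0.80 <= max(draws) <= 0.85 and 0.20 <= min(draws) <= 0.25:
        hits += 1
print(hits / trials)  # should be close to 0.009
```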
Then the density of $Y$ can be obtained by integrating. Alternatively, $P(Y \ge y) = (1-y)^n$ and so $f_Y(y) = n(1-y)^{n-1}$.
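The survival function $P(Y \ge y) = (1-y)^n$ is also easy to confirm numerically; a quick Python sketch (my own choices of $n$, $y$, and sample size):

```python
import random

# Monte Carlo check of P(Y >= y) = (1 - y)^n for the minimum Y of
# n iid Uniform(0,1) draws; n, y, and the trial count are arbitrary choices.
random.seed(0)
n, y, trials = 5, 0.3, 200_000
hits = sum(min(random.random() for _ in range(n)) >= y for _ in range(trials))
print(hits / trials, (1 - y) ** n)  # estimate vs exact 0.7^5 = 0.16807
```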
The conditional density you seek is then
$$ f_{X|Y}(x|y) = {n(n-1) (x-y)^{n-2} \over n(1-y)^{n-1}} = {(n-1) (x-y)^{n-2} \over (1-y)^{n-1}}, $$
where of course we restrict to $x > y$.
For a numerical example, let $n = 5, y = 2/3$. Then we get $f_{X|Y}(x|y) = 4 (x-2/3)^3 / (1/3)^4 = 324 (x-2/3)^3$ on $2/3 \le x \le 1$. This is larger near $1$ than near $2/3$, which makes sense -- it's hard to squeeze a lot of points into a small interval!
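One can verify this is a genuine density: $\int_{2/3}^1 324(x-2/3)^3\,dx = 324 \cdot (1/3)^4/4 = 1$. A short Python sketch with a midpoint rule (my own choice of method and step count) confirms it:

```python
# Midpoint-rule check that f(x) = 324 (x - 2/3)^3 integrates to 1 on [2/3, 1].
def f(x):
    return 324 * (x - 2/3) ** 3

a, b, m = 2/3, 1.0, 10_000
h = (b - a) / m
total = sum(f(a + (k + 0.5) * h) for k in range(m)) * h
print(total)  # ≈ 1.0
```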
The result you quote holds only when $n = 2$ -- if I have two IID uniform(0,1) random variables, then conditional on a choice of the minimum, the maximum is uniform on the interval between the minimum and 1. This is because we don't have to worry about fitting points between the minimum and the maximum, because there are $n - 2 = 0$ of them.
The simplest way to derive the distribution of the sum of two independent Uniform random variables is geometric, requiring some area calculations in 2-D.
However, the derivation of the distribution of $\sum_{i=1}^{n}X_i$ for $n>2$, with each $X_i$ independently distributed as $U(0,1)$, is generally tedious, and it is difficult to visualise geometrically for higher values of $n$. A convolution approach, however, can be used to find it (and I will apply it here recursively).
I start by assuming you know the distribution of $A=X_1+X_2$, given by the pdf below:
$$f_A(a) = \begin{cases} a & \text{if $0 \le a \le 1$}\\ 2-a & \text{if $1 \le a \le 2$}\\ 0 & \text{elsewhere}\end{cases}$$
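As a quick sanity check on this triangular pdf (a Python sketch; the statistic and sample size are my own choices), note that it implies $P(A \le 1) = \int_0^1 a\,da = 1/2$, which simulation reproduces:

```python
import random

# Simulation check of the triangular pdf of A = X1 + X2:
# the fraction of draws with A <= 1 should be near 1/2.
random.seed(0)
trials = 200_000
below = sum(random.random() + random.random() <= 1.0 for _ in range(trials))
print(below / trials)  # ≈ 0.5
```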
For $n=3$, define $B=X_1+X_2+X_3=A+X_3$. Note that $0\le B\le3$. Now, by the convolution of pdfs, the pdf of $B$ is $f_B(b)=\int_{-\infty}^\infty f_{X_3}(x_3)f_A(b-x_3)\,dx_3$.
Note-1: $f_A(b-x_3)=b-x_3$ for $0\le b-x_3\le1$, i.e., $b-1\le x_3\le b$; also, $0 \le x_3 \le 1$. Combining these two gives $\max(b-1,0) \le x_3\le \min(b,1)$.
Note-2: $f_A(b-x_3)=2-b+x_3$ for $1\le b-x_3\le2$, i.e., $b-2\le x_3\le b-1$; also, $0 \le x_3 \le 1$. Combining these two gives $\max(b-2,0) \le x_3\le \min(b-1,1)$.
Looking at the bounds of $x_3$, it is reasonable to break the range of $b$ (i.e., $[0,3]$) into $0\le b\le1$, $1\le b\le2$ and $2\le b\le3$, and to consider the form of the pdf of $B$ within each range separately.
Case: $0\le b\le1$: Range in Note-1 reduces to $0\le x_3\le b$; while that in Note-2 doesn't reduce to a feasible bound for $x_3$. Thus the pdf of $B$ reduces to
$f_B(b)=\int_0^b (b-x_3)\,dx_3=\frac{b^2}{2}$
Case: $1\le b\le2$: Range in Note-1 reduces to $b-1\le x_3\le 1$; while that in Note-2 reduces to $0\le x_3\le b-1$. Thus the pdf of $B$ reduces to
$f_B(b)=\int_{b-1}^1 (b-x_3)\,dx_3+\int_0^{b-1}(2-b+x_3)\,dx_3=\frac{-2b^2+6b-3}{2}$
Case: $2\le b\le3$: Range in Note-1 doesn't reduce to a feasible bound for $x_3$; while that in Note-2 reduces to $b-2\le x_3\le 1$. Thus the pdf of $B$ reduces to
$f_B(b)=\int_{b-2}^1(2-b+x_3)\,dx_3=\frac{(3-b)^2}{2}$
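The three cases can be verified numerically. Here is a Python sketch (the function names and grid size are mine) that evaluates the convolution integral with a midpoint rule and compares it against the piecewise formulas derived above:

```python
def f_A(a):
    # triangular pdf of A = X1 + X2
    if 0.0 <= a <= 1.0:
        return a
    if 1.0 <= a <= 2.0:
        return 2.0 - a
    return 0.0

def f_B_numeric(b, m=20_000):
    # midpoint-rule evaluation of the convolution integral over x3 in [0, 1]
    h = 1.0 / m
    return sum(f_A(b - (k + 0.5) * h) for k in range(m)) * h

def f_B_closed(b):
    # the piecewise formula derived above
    if 0.0 <= b <= 1.0:
        return b * b / 2.0
    if 1.0 <= b <= 2.0:
        return (-2.0 * b * b + 6.0 * b - 3.0) / 2.0
    if 2.0 <= b <= 3.0:
        return (3.0 - b) ** 2 / 2.0
    return 0.0

for b in (0.5, 1.5, 2.5):
    print(b, f_B_numeric(b), f_B_closed(b))  # the pairs agree closely
```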
Can you now try similar logic for $n=4$ and $n=5$?
Although it may not be readily intuitive that the pdf of the sum of Uniform variates tends toward the Normal for large $n$, the pdf is a piecewise polynomial function of degree $n-1$, and if you plot this function you'll end up with something close to the pdf of a Normal distribution (I tried it in R). It goes to show how powerful the CLT is!
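For a concrete illustration of the Normal approximation (a Python sketch; the choice of $n = 12$ and the sample size are mine): the sum of 12 Uniform(0,1) draws has mean $6$ and variance $12/12 = 1$, so the fraction of draws within one unit of the mean should be close to the Normal value $\approx 0.683$.

```python
import random

# CLT illustration: the sum of 12 Uniform(0,1) draws has mean 6 and
# variance 1, so about 68.3% of draws should fall within 1 of the mean.
random.seed(0)
trials = 200_000
within = sum(abs(sum(random.random() for _ in range(12)) - 6.0) <= 1.0
             for _ in range(trials))
print(within / trials)  # ≈ 0.683
```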
The distribution of the sum of two independent random variables is the convolution of the individual distributions. You can prove the formula for the PDF of the Irwin-Hall distribution by induction on $n$, the number of variables.
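A minimal sketch of that closed form (the function name is mine), using $f_n(x) = \frac{1}{(n-1)!}\sum_{k=0}^{\lfloor x\rfloor}(-1)^k\binom{n}{k}(x-k)^{n-1}$ and checked against the $n=3$ piecewise density derived above:

```python
from math import comb, factorial

# Irwin-Hall pdf for the sum of n iid Uniform(0,1) variables:
# f_n(x) = 1/(n-1)! * sum_{k=0}^{floor(x)} (-1)^k C(n,k) (x-k)^(n-1).
def irwin_hall_pdf(x, n):
    if not 0 <= x <= n:
        return 0.0
    return sum((-1) ** k * comb(n, k) * (x - k) ** (n - 1)
               for k in range(int(x) + 1)) / factorial(n - 1)

# For n = 3 this matches the piecewise formulas: e.g. at x = 1.5,
# (-2(1.5)^2 + 6(1.5) - 3)/2 = 0.75.
print(irwin_hall_pdf(1.5, 3))
```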