You are right that you can think of this problem in terms of Poisson processes. The first arrival time after $t = 0$ as well as the inter-arrival times in a Poisson process of rate $1$ are independent $\text{Exp}(1)$
random variables and so $Y_1$, $Y_1 + Y_2$, $\ldots$, $Y_1 + Y_2 + \cdots + Y_{n+1}$ can be taken to be the times of the first, second, $\ldots$, $(n+1)$-th arrivals after $t = 0$ in the process. The random variables
$\frac{Y_1}{\sum_{i=1}^{n+1} Y_i}, \frac{Y_1+Y_2}{\sum_{i=1}^{n+1} Y_i}, \dots, \frac{Y_1+\dots+Y_n}{\sum_{i=1}^{n+1} Y_i}$ that you are looking at are the first $n$ arrival times "normalized" to a unit interval.
For $0 < t_1 < t_2 < \dots < t_n < 1$, the conditional probability
that there is one arrival in each interval $(t_i, t_i + \Delta t_i)$
and none in the remaining time of total length $(1 - \sum_i \Delta t_i)$
given that there are $n$ arrivals in $(0, 1)$ is approximately
\begin{align*}
\frac{\exp\left(-(1 - \sum_i \Delta t_i)\right) \prod_{i=1}^n \exp(-\Delta t_i)\,\Delta t_i/1!}{\exp(-1)\frac{1^n}{n!}} &= n!\, \Delta t_1\Delta t_2 \cdots \Delta t_n\\
&= f_{U_{(1)}, \dots, U_{(n)}}(t_1, t_2, \ldots , t_n)\,\Delta t_1\Delta t_2 \cdots \Delta t_n
\end{align*}
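As a quick sanity check on this identification (a sketch, not part of the original argument; the variable names are mine), one can simulate the normalized Poisson arrival times and compare them with the order statistics of $n$ iid uniforms, whose means are $k/(n+1)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
reps = 200_000

# Times of the first n arrivals, normalized by the (n+1)-th arrival time:
# partial sums of n+1 iid Exp(1) variables divided by the total
y = rng.exponential(1.0, size=(reps, n + 1))
t = np.cumsum(y, axis=1)[:, :n] / y.sum(axis=1, keepdims=True)

# Order statistics of n iid Uniform(0,1) variables
u = np.sort(rng.uniform(size=(reps, n)), axis=1)

# Both sets of means should be close to k/(n+1) = 0.25, 0.5, 0.75
print(t.mean(axis=0))
print(u.mean(axis=0))
```

The two printed vectors should agree (up to Monte Carlo noise), consistent with the normalized arrival times having the joint density $n!$ on the ordered simplex.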
The simplest way to derive the distribution of the sum of two independent uniform random variables is geometric, working out some area calculations in 2-D.
However, deriving the distribution of $\sum_{i=1}^{n}X_i$ for $n>2$, with each $X_i$ independently distributed as $U(0,1)$, is generally tedious: the geometry becomes hard to visualise for higher values of $n$. Instead, a convolution approach can be used, which I apply here recursively.
I start by assuming you know the distribution of $A=X_1+X_2$, given by the pdf:
$$f_A(a) = \begin{cases} a & \text{if $0 \le a \le 1$}\\ 2-a & \text{if $1 \le a \le 2$}\\ 0 & \text{elsewhere}\end{cases}$$
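As an illustrative check of this triangular pdf (my own sketch, not part of the derivation), note that it implies $P(A \le t) = t^2/2$ for $0 \le t \le 1$, which is easy to compare against simulation:

```python
import numpy as np

rng = np.random.default_rng(1)
a = rng.uniform(size=(500_000, 2)).sum(axis=1)  # A = X1 + X2

# The triangular pdf implies P(A <= t) = t^2/2 on [0, 1];
# the two printed columns should agree up to Monte Carlo noise
for t in (0.3, 0.7, 1.0):
    print(np.mean(a <= t), t**2 / 2)
```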
For $n=3$, define $B=X_1+X_2+X_3=A+X_3$. Note that $0\le B\le3$. By the convolution formula, the pdf of $B$ is $f_B(b)=\int_{-\infty}^\infty f_{X_3}(x_3)f_A(b-x_3)\,dx_3$
Note-1: $f_A(b-x_3)=b-x_3$ for $0\le b-x_3\le1$, i.e., $b-1\le x_3\le b$; also, $0 \le x_3 \le 1$. Combining these two gives $\max(b-1,0) \le x_3\le \min(b,1)$
Note-2: $f_A(b-x_3)=2-b+x_3$ for $1\le b-x_3\le2$, i.e., $b-2\le x_3\le b-1$; also, $0 \le x_3 \le 1$. Combining these two gives $\max(b-2,0) \le x_3\le \min(b-1,1)$
Looking at the bounds on $x_3$, it is natural to break the range of $b$ (i.e., $[0,3]$) into $0\le b\le1$, $1\le b\le2$ and $2\le b\le3$, and to consider the form of the pdf of $B$ within each range separately.
Case: $0\le b\le1$: The range in Note-1 reduces to $0\le x_3\le b$, while that in Note-2 is empty. Thus the pdf of $B$ reduces to
$f_B(b)=\int_0^b (b-x_3)\,dx_3=\frac{b^2}{2}$
Case: $1\le b\le2$: Range in Note-1 reduces to $b-1\le x_3\le 1$; while that in Note-2 reduces to $0\le x_3\le b-1$. Thus the pdf of $B$ reduces to
$f_B(b)=\int_{b-1}^1 (b-x_3)\,dx_3+\int_0^{b-1}(2-b+x_3)\,dx_3=\frac{-2b^2+6b-3}{2}$
Case: $2\le b\le3$: The range in Note-1 is empty, while that in Note-2 reduces to $b-2\le x_3\le 1$. Thus the pdf of $B$ reduces to
$f_B(b)=\int_{b-2}^1(2-b+x_3)\,dx_3=\frac{(3-b)^2}{2}$
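The three cases above can be sanity-checked numerically (a sketch of my own; the function name `f_B` is just a label for the piecewise pdf derived above). The pdf should integrate to 1 over $[0,3]$ and match a Monte Carlo histogram of $X_1+X_2+X_3$:

```python
import numpy as np

def f_B(b):
    """Piecewise pdf of B = X1 + X2 + X3 derived above."""
    b = np.asarray(b, dtype=float)
    return np.where(b < 1, b**2 / 2,
           np.where(b < 2, (-2 * b**2 + 6 * b - 3) / 2,
                    (3 - b)**2 / 2))

# The pdf should integrate to 1 over [0, 3]
grid = np.linspace(0, 3, 3001)
dx = grid[1] - grid[0]
print(np.sum(f_B(grid)) * dx)  # endpoints are 0, so a plain Riemann sum is fine

# Compare against a Monte Carlo histogram of the simulated sum
rng = np.random.default_rng(2)
s = rng.uniform(size=(500_000, 3)).sum(axis=1)
hist, edges = np.histogram(s, bins=30, range=(0, 3), density=True)
mid = (edges[:-1] + edges[1:]) / 2
print(np.max(np.abs(hist - f_B(mid))))  # should be small
```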
Can you now try similar logic for $n=4$ and $n=5$?
Although it may not be readily intuitive that the pdf of the sum of uniform variates should be approximately Normal for large $n$, the pdf is a piecewise polynomial function of degree $n-1$, and if you plot this function you will get something very close to a Normal pdf (I tried this in R). It goes to show how powerful the CLT is!
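The CLT claim can be illustrated numerically as well (a sketch using NumPy rather than R; the standardization below is the usual one, since each $U(0,1)$ variate has mean $1/2$ and variance $1/12$):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 12
s = rng.uniform(size=(200_000, n)).sum(axis=1)

# Standardize: (S_n - n/2) / sqrt(n/12) should be close to N(0, 1)
z = (s - n / 2) / np.sqrt(n / 12.0)
print(z.mean(), z.std())           # close to 0 and 1
print(np.mean(np.abs(z) <= 1.96))  # close to 0.95, the N(0,1) value
```

For $n=12$ the standardized sum is already an excellent Normal approximation, which is why "sum of 12 uniforms minus 6" was once a popular way to generate approximately standard Normal variates.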
Best Answer
A simple argument by induction will also work. If we define $\{x\} = x - \lfloor x \rfloor$ as the fractional part of $x$, then proving that $\{U_1 + U_2\} \sim \operatorname{Uniform}(0,1)$ allows you to show the main result inductively, because $$\{U_1 + \cdots + U_n\} = \{\{U_1 + \cdots + U_{n-1} \} + U_n \}$$ and $U_1, \ldots, U_n$ are iid. So all that remains is to prove the base case, which is quite straightforward as an explicit integral. First, the distribution of $S_2 = U_1 + U_2$ is easily calculated: $$f_{S_2}(s) = \begin{cases} s, & 0 \le s \le 1 \\ 2-s, & 1 < s \le 2. \end{cases}$$ Then, for $0 \le s \le 1$, the CDF of $\{S_2\}$ is obtained by noting $$ F_{\{S_2\}}(s) = \Pr[\{S_2\} \le s] = \int_{u=0}^s f_{S_2}(u) \, du + \int_{u=1}^{1+s} f_{S_2}(u) \, du = s.$$
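The fractional-part result is also easy to check by simulation (an illustrative sketch of my own; `% 1.0` computes the fractional part $\{x\}$ for the non-negative sums involved here):

```python
import numpy as np

rng = np.random.default_rng(4)
n, reps = 5, 300_000

# Fractional part of a sum of n iid Uniform(0,1) variates
frac = rng.uniform(size=(reps, n)).sum(axis=1) % 1.0

# Its empirical CDF should match F(s) = s on [0, 1]
qs = np.linspace(0.1, 0.9, 9)
print(np.max(np.abs([np.mean(frac <= q) - q for q in qs])))  # small
```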