You assumed the $X_i$ are uniformly distributed in $[0,1]$ in the first place, so why are you later puzzled that "the $X_i$ all have the same range (in this case $0 \le X_i \le 1$ for all $i$)"?
If you add up $n$ numbers, each in the interval $[0,1]$, then you get a number in the range $[0,n]$.
There is no assumption of units (amount of fuel, number of passengers, etc.) here, but implicitly, writing down $X_1 + \cdots + X_n$ assumes that for whatever physical quantity the $X_i$ are supposed to model, the sum makes sense. Moreover, the physical quantity should obey the probabilistic assumption (uniform distribution): number of passengers does not work, for instance, since the number of passengers is presumably a nonnegative integer, while $X_i$ takes any value between $0$ and $1$.
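As a quick sanity check of the range claim, here is a minimal simulation sketch (the choice of $n = 5$ and the sample count are arbitrary):

```python
import random

random.seed(0)

# Sum of n independent Uniform(0,1) draws always lands in [0, n].
n = 5
sums = [sum(random.random() for _ in range(n)) for _ in range(10_000)]

assert all(0.0 <= s <= n for s in sums)
```

With $10{,}000$ draws the extremes stay well inside $(0, n)$, since hitting the endpoints would require all $n$ uniforms to be simultaneously near $0$ or near $1$.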
Okay, let's first see why the first binary digit of $U$ is Bernoulli$(1/2)$. The first binary digit is $1$ if and only if $U \geq 1/2$, which has probability $1/2$, so we are done. For convenience, let $B_n$ denote the $n^{th}$ binary digit of $U$. Now, inductively assume that $B_1,\ldots,B_{n-1}$ are i.i.d. Bernoulli$(1/2)$. Then, look at the conditional probability $q_n:=\mathbb{P}(B_n=1\big|(B_1,\ldots,B_{n-1})=(b_1,\ldots,b_{n-1}))$, for a sequence $(b_1,\ldots,b_{n-1}) \in \{0,1\}^{n-1}$. Divide the interval $[0,1]$ into the dyadic intervals of length $1/2^{n-1}$, and enumerate these intervals from left to right as $I_1, I_2, \ldots, I_{2^{n-1}}$. Now, what does the event $(B_1,\ldots,B_{n-1})=(b_1,\ldots,b_{n-1})$ say? It says (i.e., it is equal to the event) that $U$ lies in exactly one of these dyadic intervals, say $I_i$, where $i$ is a deterministic function (complicated, but we don't need to know it) of the deterministic binary sequence $(b_1,\ldots,b_{n-1})$. The way to find this interval is to follow a binary search, similar to the bisection argument in the proof of the Heine-Borel theorem in real analysis.
Anyway, let $m_i$ be the midpoint of $I_i$. So,
$$q_n = \mathbb{P}(U > m_i \mid U\in I_i)~.$$ Since $U$, conditioned on the event $U \in I_i$, is uniformly distributed on $I_i$, and $B_n = 1$ exactly when $U$ falls in the right half of $I_i$, the above probability is $1/2$. This shows that $B_n$ has a Bernoulli$(1/2)$ distribution, independent of $(B_1,\ldots,B_{n-1})$, and the induction is complete.
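The argument above can be checked empirically: extract the first few binary digits of uniform samples, then verify that each digit is close to Bernoulli$(1/2)$ and that all digit patterns are roughly equally likely. This is a Monte Carlo sketch with an arbitrary seed and tolerance:

```python
import random
from collections import Counter

random.seed(1)

def binary_digits(u, k):
    """Return the first k binary digits of u in [0, 1)."""
    digits = []
    for _ in range(k):
        u *= 2
        d = int(u)        # 1 iff u was >= 1/2 before doubling
        digits.append(d)
        u -= d
    return digits

samples = [binary_digits(random.random(), 3) for _ in range(100_000)]

# Each digit B_j should have empirical frequency near 1/2 ...
for j in range(3):
    freq = sum(s[j] for s in samples) / len(samples)
    assert abs(freq - 0.5) < 0.01

# ... and the 8 patterns (B_1, B_2, B_3) should each occur with
# frequency near 1/8, which is what independence predicts.
counts = Counter(tuple(s) for s in samples)
assert all(abs(c / len(samples) - 1/8) < 0.01 for c in counts.values())
```

The second assertion is the interesting one: equal frequencies of all $2^3$ patterns is exactly the i.i.d. Bernoulli$(1/2)$ claim for the first three digits.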
Best Answer
To better illustrate the following explanation, consider that the CDF of $R_n$ is represented by the figure below.
One important detail is that the definition of $X_n$ does not matter at all in this problem, so we will simply ignore all information about it.
To solve this problem, it is important to understand the definition of the random variable $Z_n$.
$Z_n$ is a (non-linear) transformation of $R_n$, so we will obtain the CDF of $Z_n$ from $R_n$ itself.
Let $F_{Z_n}(t) = P(Z_n \le t)$ denote the CDF of $Z_n$ evaluated at a point $t \in (0,1)$.
Now, fix $t \in (0,1)$. Since $R_n$ is a continuous RV, there exists an $r_t \in \mathbb{R}$ such that $F_{R_n}(r_t) = t$.
Note that $Z_n \le t$ is equivalent to $R_n \le r_t$, since $F_{R_n}$ is non-decreasing and $F_{R_n}(r_t) = t$. In this way, $\{Z_n \le t\} = \{R_n \le r_t\}$.
For this reason:
$$F_{Z_n}(t) = P(Z_n \le t) = P(R_n \le r_t) = F_{R_n}(r_t) = t$$
Therefore, $F_{Z_n}(t) = t$, which implies that $Z_n \sim Unif(0,1)$.