We can at least work out the distribution of the product of two IID ${\rm Uniform}(0,1)$ variables $X_1, X_2$: let $Z_2 = X_1 X_2$. Conditioning on $X_1 = x$, note that $\Pr[X_2 \le z/x]$ equals $1$ when $x \le z$ and $z/x$ when $x > z$, so the CDF is $$\begin{align*} F_{Z_2}(z) &= \Pr[Z_2 \le z] = \int_{x=0}^1 \Pr[X_2 \le z/x] f_{X_1}(x) \, dx \\ &= \int_{x=0}^z \, dx + \int_{x=z}^1 \frac{z}{x} \, dx \\ &= z - z \log z. \end{align*}$$ Thus the density of $Z_2$ is $$f_{Z_2}(z) = -\log z, \quad 0 < z \le 1.$$

For a third variable, let $Z_3 = Z_2 X_3$ and write $$\begin{align*} F_{Z_3}(z) &= \Pr[Z_3 \le z] = \int_{x=0}^1 \Pr[X_3 \le z/x] f_{Z_2}(x) \, dx \\ &= -\int_{x=0}^z \log x \, dx - \int_{x=z}^1 \frac{z}{x} \log x \, dx. \end{align*}$$ Then taking the derivative gives $$f_{Z_3}(z) = \frac{1}{2} \left( \log z \right)^2, \quad 0 < z \le 1.$$

In general, we can conjecture that $$f_{Z_n}(z) = \begin{cases} \frac{(- \log z)^{n-1}}{(n-1)!}, & 0 < z \le 1, \\ 0, & {\rm otherwise},\end{cases}$$ which we can prove by induction on $n$. I leave this as an exercise.
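As a quick numerical sanity check of the conjecture (not a proof), here is a minimal Monte Carlo sketch in Python; the sample size, the choice $n = 4$, and the binning are arbitrary choices of mine:

```python
import numpy as np
from math import factorial

rng = np.random.default_rng(0)
n = 4                                                  # number of uniform factors
z_n = rng.uniform(size=(1_000_000, n)).prod(axis=1)    # draws of Z_n = X_1 * ... * X_n

# Empirical density vs. the conjectured f_{Z_n}(z) = (-log z)^(n-1) / (n-1)!
edges = np.linspace(0.0, 1.0, 41)
hist, _ = np.histogram(z_n, bins=edges, density=True)
mids = 0.5 * (edges[:-1] + edges[1:])
conjectured = (-np.log(mids)) ** (n - 1) / factorial(n - 1)

# Print a few bins, skipping those nearest 0, where the density blows up and a
# bin average no longer tracks the midpoint value closely
for m, h, c in zip(mids[4::8], hist[4::8], conjectured[4::8]):
    print(f"z = {m:.4f}   empirical = {h:.3f}   conjectured = {c:.3f}")
```

The printed empirical bin heights should agree with the conjectured density up to histogram noise.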
Okay, let's first see why the first binary digit of $U$ is Bernoulli$(1/2)$. The first binary digit is $1$ if and only if $U \geq 1/2$, which has probability $1/2$, so we are done. For convenience, let $B_n$ denote the $n^{th}$ binary digit of $U$. Now, inductively assume that $B_1,\ldots,B_{n-1}$ are i.i.d. Bernoulli$(1/2)$. Then look at the conditional probability $q_n:=\mathbb{P}(B_n=1\big|(B_1,\ldots,B_{n-1})=(b_1,\ldots,b_{n-1}))$ for a sequence $(b_1,\ldots,b_{n-1}) \in \{0,1\}^{n-1}$. Divide the interval $[0,1]$ into the dyadic intervals of length $1/2^{n-1}$, and enumerate these intervals from left to right as $I_1,I_2,\ldots,I_{2^{n-1}}$. Now, what does the event $(B_1,\ldots,B_{n-1})=(b_1,\ldots,b_{n-1})$ say? It is exactly the event that $U$ lies in one particular dyadic interval, say $I_i$, where $i$ is a deterministic (if complicated, but we don't need to know it) function of the binary sequence $(b_1,\ldots,b_{n-1})$. The way to find this interval is to follow a binary search, similar to the bisection argument in the proof of the Heine-Borel theorem in real analysis.
Anyway, let $m_i$ be the midpoint of $I_i$. So,
$$q_n = \mathbb{P}(U > m_i\,|\,U\in I_i)~,$$ since $B_n = 1$ exactly when $U$ falls in the right half of $I_i$. Because the conditional distribution of $U$ given $U \in I_i$ is uniform on $I_i$, this probability is $1/2$. This shows that $B_n$ has a Bernoulli$(1/2)$ distribution, independent of $(B_1,\ldots,B_{n-1})$, and the induction is complete.
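If you want to see this empirically, here is a small Python sketch (the sample size and the choice of the first three digits are arbitrary) comparing the frequencies of the binary digit patterns of a uniform variate with the $1/8$ predicted by the i.i.d. Bernoulli$(1/2)$ claim:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)
u = rng.uniform(size=1_000_000)

# B_n, the n-th binary digit of U, is floor(2^n * U) mod 2; take n = 1, 2, 3
digits = np.floor(u[:, None] * 2.0 ** np.arange(1, 4)).astype(int) % 2

# Under the i.i.d. Bernoulli(1/2) claim every pattern (b_1, b_2, b_3) has
# probability 1/8 = 0.125; the empirical frequencies should all be close to that
for pattern in product((0, 1), repeat=3):
    freq = np.mean(np.all(digits == pattern, axis=1))
    print(pattern, f"{freq:.4f}")
```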
Best Answer
As you noted,
$$\log Y=\sum_i (-\log X_i)$$
$$W_i=-\log X_i\sim \operatorname{Exp}(1),$$ since $\Pr[W_i>t]=\Pr[X_i<e^{-t}]=e^{-t}$ for $t\ge 0$.
Thus $\log Y=\sum_i W_i\sim \operatorname{Gamma}(n,1)$, since a sum of $n$ i.i.d. $\operatorname{Exp}(1)$ variables has a $\operatorname{Gamma}(n,1)$ distribution.
Now, knowing the distribution of $\log Y$, it is easy to find the requested density via the change of variables $y=e^{w}$, i.e. $f_Y(y)=f_{\log Y}(\log y)/y$:
$$f_Y(y)=\frac{(\log y)^{n-1}}{\Gamma(n)\,y^2}\cdot \mathbb{1}_{(1,+\infty)}(y)$$
P.S.: I used the notation $\log$, but the natural logarithm is meant.
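For what it's worth, here is a small simulation sketch (assuming standard numpy/scipy; the choice $n=5$ and the sample size are arbitrary) that checks both the $\operatorname{Gamma}(n,1)$ claim for $\log Y$ and the final density formula:

```python
import numpy as np
from math import gamma
from scipy import stats
from scipy.integrate import quad

rng = np.random.default_rng(0)
n = 5
x = rng.uniform(size=(200_000, n))
log_y = (-np.log(x)).sum(axis=1)                  # log Y = sum_i W_i with W_i = -log X_i

# Check 1: log Y should follow Gamma(n, 1); a non-small KS p-value is consistent with that
print(stats.kstest(log_y, stats.gamma(a=n).cdf))

# Check 2: compare the empirical CDF of Y with the integral of the claimed density
# f_Y(y) = (log y)^(n-1) / (Gamma(n) y^2) on (1, +inf)
f_y = lambda y: np.log(y) ** (n - 1) / (gamma(n) * y ** 2)
y_samples = np.exp(log_y)
for y0 in (2.0, 5.0, 20.0, 100.0):
    empirical = np.mean(y_samples <= y0)          # empirical CDF at y0
    claimed, _ = quad(f_y, 1.0, y0)               # integral of the claimed density over (1, y0]
    print(f"y = {y0:6.1f}   empirical = {empirical:.4f}   claimed = {claimed:.4f}")
```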