We can at least work out the distribution of the product of two IID ${\rm Uniform}(0,1)$ variables $X_1, X_2$: Let $Z_2 = X_1 X_2$. Then the CDF is $$\begin{align*} F_{Z_2}(z) &= \Pr[Z_2 \le z] = \int_{x=0}^1 \Pr[X_2 \le z/x] f_{X_1}(x) \, dx \\ &= \int_{x=0}^z \, dx + \int_{x=z}^1 \frac{z}{x} \, dx \\ &= z - z \log z. \end{align*}$$ Thus the density of $Z_2$ is $$f_{Z_2}(z) = -\log z, \quad 0 < z \le 1.$$ For a third variable, we would write $$\begin{align*} F_{Z_3}(z) &= \Pr[Z_3 \le z] = \int_{x=0}^1 \Pr[X_3 \le z/x] f_{Z_2}(x) \, dx \\ &= -\int_{x=0}^z \log x \, dx - \int_{x=z}^1 \frac{z}{x} \log x \, dx. \end{align*}$$ Then taking the derivative gives $$f_{Z_3}(z) = \frac{1}{2} \left( \log z \right)^2, \quad 0 < z \le 1.$$ In general, we can conjecture that $$f_{Z_n}(z) = \begin{cases} \frac{(- \log z)^{n-1}}{(n-1)!}, & 0 < z \le 1 \\ 0, & {\rm otherwise},\end{cases}$$ which we can prove via induction on $n$. I leave this as an exercise.
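Without spoiling the induction, the conjectured density can be sanity-checked numerically. Integrating it term by term gives the closed form $F_{Z_n}(z) = z \sum_{k=0}^{n-1} (-\log z)^k / k!$ (the sum telescopes under differentiation), which a Monte-Carlo estimate of $\Pr[X_1 \cdots X_n \le z]$ should match. The helper names below are mine, not from the original argument:

```python
import math
import random

def product_cdf(z, n):
    """Closed form of F_{Z_n}(z) = int_0^z (-log t)^(n-1)/(n-1)! dt,
    obtained by integrating the conjectured density term by term."""
    return z * sum((-math.log(z)) ** k / math.factorial(k) for k in range(n))

def empirical_cdf(z, n, trials, rng):
    """Fraction of simulated products of n uniforms that land at or below z."""
    hits = sum(1 for _ in range(trials)
               if math.prod(rng.random() for _ in range(n)) <= z)
    return hits / trials

rng = random.Random(0)
for n in (2, 3, 5):
    for z in (0.1, 0.5):
        # With 100k trials the standard error is about 0.002, so 0.01 is a safe tolerance.
        assert abs(empirical_cdf(z, n, 100_000, rng) - product_cdf(z, n)) < 0.01
```

(Equivalently, $-\log Z_n$ is a sum of $n$ IID ${\rm Exponential}(1)$ variables, which is where the ${\rm Gamma}$-shaped density comes from.)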
It's false as you state it. Instead of the condition $X'_j\perp X_j$, you need that $X'_j\perp(X_i, i\ne j)$, so that $X_1, X_2, \dots, X'_j, \dots, X_n$ is also an i.i.d. collection.
Under that condition, the vectors
$(X_1, \dots, X_j, \dots, X_n)$ and
$(X_1, \dots, X'_j, \dots, X_n)$
have the same distribution. So if $g$ is any function, then
$W:=g(X_1, \dots, X_j, \dots, X_n)$ and $W':=g(X_1, \dots, X'_j, \dots, X_n)$ have the same distribution.
Now if you've got some further thing that "does not depend on the $j$th variable", say $Y=h(X_1, \dots, X_{j-1}, X_{j+1}, \dots, X_n)$, then
$W-Y$ and $W'-Y$ have the same distribution. This is just another application of the previous paragraph, since $W-Y$ is a function of $X_1,\dots,X_j,\dots, X_n$
and $W'-Y$ is "the same function" of $X_1,\dots, X'_j,\dots, X_n$.
However, if you had a second quantity that did depend on the $j$th coordinate, say
$Z=r(X_1, \dots, X_j, \dots, X_n)$, then you couldn't conclude that $W-Z$ and $W'-Z$ have the same distribution. In that case $W'-Z$ would be a function of all $n+1$ variables
$X_1,\dots,X_j, X'_j, \dots, X_n$.
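All three cases can be checked exactly on a toy example. Everything below is illustrative and mine, not from the original argument: $X_1, X_2$ are IID uniform on $\{0,1\}$, `x2p` plays the role of $X'_2$, and $g$, $h$, $r$ are arbitrary choices with $h$ ignoring the second coordinate and $r$ using it:

```python
from collections import Counter
from fractions import Fraction
from itertools import product

def exact_dist(values):
    """Exact distribution of a random variable over equally likely outcomes."""
    counts = Counter(values)
    total = sum(counts.values())
    return {v: Fraction(k, total) for v, k in counts.items()}

# (x1, x2, x2p): X1, X2, X'_2 are IID uniform on {0, 1}; 8 equally likely outcomes.
outcomes = list(product([0, 1], repeat=3))

g = lambda x1, x2: x1 + x2   # W = g(X1, X2), W' = g(X1, X'_2)
h = lambda x1: x1            # Y = h(X1): does not depend on the 2nd coordinate
r = lambda x1, x2: x1 * x2   # Z = r(X1, X2): does depend on the 2nd coordinate

W   = exact_dist(g(x1, x2)              for x1, x2, x2p in outcomes)
Wp  = exact_dist(g(x1, x2p)             for x1, x2, x2p in outcomes)
WY  = exact_dist(g(x1, x2) - h(x1)      for x1, x2, x2p in outcomes)
WpY = exact_dist(g(x1, x2p) - h(x1)     for x1, x2, x2p in outcomes)
WZ  = exact_dist(g(x1, x2) - r(x1, x2)  for x1, x2, x2p in outcomes)
WpZ = exact_dist(g(x1, x2p) - r(x1, x2) for x1, x2, x2p in outcomes)

assert W == Wp    # W and W' agree in distribution
assert WY == WpY  # so do W - Y and W' - Y
assert WZ != WpZ  # but W - Z and W' - Z do not
```

Here $W - Z$ never exceeds $1$, while $W' - Z = X_1 + X'_2 - X_1 X_2$ hits $2$ with probability $1/8$, so the last pair genuinely differs in distribution.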
Best Answer
Symmetry in mathematics occurs when a structure remains invariant under a set of operations or transformations.
In this case the argument is that $\mathsf E(X_i\mid X_1{+}{\cdots}{+}X_n=t)$ is the same for every $i$ in $\{1,\ldots,n\}$, because the variables are independent and identically distributed, hence exchangeable.
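The standard way to finish the argument: call that common value $c$. Summing over $i$ and using linearity of conditional expectation gives $$nc = \sum_{i=1}^n \mathsf E(X_i\mid X_1{+}{\cdots}{+}X_n=t) = \mathsf E(X_1{+}{\cdots}{+}X_n\mid X_1{+}{\cdots}{+}X_n=t) = t,$$ so $\mathsf E(X_i\mid X_1{+}{\cdots}{+}X_n=t) = t/n$ for every $i$.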