If you are studying elementary probability theory, allow me to reformulate your question as "how can I represent a random variable $X$ with a given CDF $F_X$ in terms of a uniform random variable $U$ on $(0,1)$?" The answer to that is the quantile function: you define
$$G_X(p)=\inf \{ x : F_X(x) \geq p \}$$
and then define $X$ to be $G_X(U)$.
Note that if $F_X$ is invertible then $G_X=F_X^{-1}$, otherwise this is "the right generalization". One can see this by looking at the discrete case: if $P(X=x)=p$ then $P(G_X(U)=x)=p$. This is because a jump of height $p$ in $F_X$ corresponds to a flat region of length $p$ in $G_X$, and the uniform distribution on $(0,1)$ assigns each interval a probability equal to its length.
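A minimal sketch of that discrete observation, assuming a hypothetical distribution with $P(X=1)=0.3$, $P(X=2)=0.5$, $P(X=3)=0.2$ (the name `quantile` and the use of `bisect` are my own choices):

```python
import bisect

# Hypothetical discrete distribution: P(X=1)=0.3, P(X=2)=0.5, P(X=3)=0.2
values = [1, 2, 3]
cum = [0.3, 0.8, 1.0]  # CDF F_X evaluated at each value

def quantile(p):
    """G_X(p) = inf{x : F_X(x) >= p}: bisect_left finds the first
    index i with cum[i] >= p, matching the infimum in the definition."""
    i = bisect.bisect_left(cum, p)
    return values[i]

# The jump of height 0.5 in F_X at x=2 becomes the flat region (0.3, 0.8] of G_X,
# so P(G_X(U) = 2) = 0.8 - 0.3 = 0.5 for U uniform on (0,1):
print(quantile(0.3))   # 1  (F_X(1) = 0.3 >= 0.3)
print(quantile(0.31))  # 2
print(quantile(0.8))   # 2
print(quantile(0.9))   # 3
```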
The natural question is now "what's a uniform random variable on $(0,1)$?" Well, it has $F_U(x)=\begin{cases} 0 & x<0 \\ x & x \in [0,1] \\ 1 & x>1 \end{cases}$, but beyond that such a thing is a black box from the elementary point of view.
If you are studying measure-theoretic probability theory then the answer is a bit more explicit. A random variable with CDF $F_X$ is given by $G_X : \Omega \to \mathbb{R}$ where $G_X$ is the quantile function as defined before, $\Omega=(0,1)$, $\mathcal{F}$ is the Borel $\sigma$-algebra on $(0,1)$, and $\mathbb{P}$ is the Lebesgue measure. Note that on this space the identity function is a uniform random variable on $(0,1)$, so this is really the same construction as the one described above.
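As a concrete sketch of this construction, assume the target distribution is Exponential(1), whose CDF $F_X(x)=1-e^{-x}$ is invertible, so $G_X=F_X^{-1}(p)=-\ln(1-p)$ (the helper name `G` and the sample size are illustrative choices):

```python
import math
import random

# Quantile function of Exponential(1): G_X(p) = -ln(1 - p),
# so X = G_X(U) has CDF F_X(x) = 1 - e^{-x} when U is uniform on (0,1).
def G(p):
    return -math.log(1.0 - p)

random.seed(0)  # fixed seed for reproducibility
samples = [G(random.random()) for _ in range(100_000)]
mean = sum(samples) / len(samples)  # should be close to E(X) = 1
```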
In any case these constructions can be generalized to finitely many random variables by looking at the uniform distribution on $(0,1)^n$ instead of $(0,1)$.
Best Answer
Here's a sketch for the continuous case:
For any nonnegative continuous real random variable $X$ and any integer $r\ge 1$, $$X^r = \int_0^X rx^{r-1}dx = \int_0^\infty r x^{r-1}[X>x]dx $$ where
$$[X>x] = \begin{cases} 1 & \text{if }X>x \\ 0 & \text{if }X\le x. \end{cases}$$

Therefore, using Tonelli's theorem and the fact that $E[X>x] = P(X>x)$, $$E(X^r) = r \int_0^\infty x^{r-1}P(X>x)\,dx. $$

Now, for any continuous random variable $X$ (not necessarily nonnegative), we have $X = Y - Z$, where $Y=X^+$ and $Z=X^-$ are the positive and negative parts of $X$. Since $YZ = 0$, every cross term in the Binomial Theorem vanishes, so $$X^r = (Y - Z)^r = Y^r +(-Z)^r, $$ and because both $Y$ and $Z$ are nonnegative random variables, $$\begin{align} E(X^r) &= E(Y^r) + (-1)^rE(Z^r)\\ &=r\int_0^\infty y^{r-1}P(Y>y)\,dy + (-1)^r r\int_0^\infty z^{r-1}P(Z>z)\,dz\\ &=r\int_0^\infty y^{r-1}P(X>y)\,dy + (-1)^r r\int_0^\infty z^{r-1}P(X<-z)\,dz\\ &=r\int_0^\infty x^{r-1}\big(P(X>x) + (-1)^r P(X<-x)\big)\,dx\\ &=r\int_0^\infty x^{r-1}\big(1-F(x) + (-1)^r F(-x)\big)\,dx.\\ \end{align} $$
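The final identity can be checked numerically. A sketch, assuming a standard Laplace distribution (my choice of an example taking both signs, with $EX=0$ and $EX^2=2$) and a simple midpoint-rule integrator:

```python
import math

# Check E(X^r) = r * integral_0^inf x^(r-1) * (1 - F(x) + (-1)^r F(-x)) dx
# for a standard Laplace distribution:
#   F(x) = 0.5*e^x for x < 0,   F(x) = 1 - 0.5*e^(-x) for x >= 0.
def F(x):
    return 0.5 * math.exp(x) if x < 0 else 1.0 - 0.5 * math.exp(-x)

def moment(r, upper=40.0, n=400_000):
    """Midpoint-rule approximation of the moment formula on [0, upper];
    the tail beyond upper=40 is negligible for the Laplace distribution."""
    h = upper / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        total += x ** (r - 1) * (1.0 - F(x) + (-1) ** r * F(-x))
    return r * total * h

print(moment(1))  # ≈ 0  (E X = 0 by symmetry)
print(moment(2))  # ≈ 2  (E X^2 = 2 for Laplace(0,1))
```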