If you are studying elementary probability theory, allow me to reformulate your question as "how can I represent a random variable $X$ with a given CDF $F_X$ in terms of a uniform random variable $U$ on $(0,1)$?" The answer to that is the quantile function: you define
$$G_X(p)=\inf \{ x : F_X(x) \geq p \}$$
and then define $X$ to be $G_X(U)$.
Note that if $F_X$ is invertible then $G_X=F_X^{-1}$, otherwise this is "the right generalization". One can see this by looking at the discrete case: if $P(X=x)=p$ then $P(G_X(U)=x)=p$. This is because a jump of height $p$ in $F_X$ corresponds to a flat region of length $p$ in $G_X$, and the uniform distribution on $(0,1)$ assigns each interval a probability equal to its length.
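This construction (often called inverse transform sampling) is easy to see in action. Here is a minimal sketch in Python, using a made-up discrete distribution $P(X=1)=0.2$, $P(X=2)=0.5$, $P(X=3)=0.3$; the names `quantile`, `values`, and `probs` are my own, not anything from a library:

```python
import random

# A hypothetical discrete distribution: P(X=1)=0.2, P(X=2)=0.5, P(X=3)=0.3.
values = [1, 2, 3]
probs = [0.2, 0.5, 0.3]

def quantile(p):
    """G_X(p) = inf{x : F_X(x) >= p} for the discrete distribution above.

    Each jump of height p_i in F_X becomes a flat region of length p_i here.
    """
    cumulative = 0.0
    for x, q in zip(values, probs):
        cumulative += q
        if cumulative >= p:
            return x
    return values[-1]  # guard against floating-point roundoff at p near 1

random.seed(0)
n = 100_000
# U uniform on (0,1), X = G_X(U): the empirical frequencies should
# come out close to the probabilities we started from.
samples = [quantile(random.random()) for _ in range(n)]
freq = {x: samples.count(x) / n for x in values}
```

Since `random.random()` is (approximately) a uniform random variable on $(0,1)$, `freq[x]` converges to $P(X=x)$ as `n` grows, exactly because each jump in $F_X$ corresponds to a flat region of matching length in $G_X$.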
The natural question is now "what's a uniform random variable on $(0,1)$?" Well, it has $F_U(x)=\begin{cases} 0 & x<0 \\ x & x \in [0,1] \\ 1 & x>1 \end{cases}$. But otherwise such a thing is a black box from the elementary point of view.
If you are studying measure-theoretic probability theory then the answer is a bit more explicit. A random variable with CDF $F_X$ is given by $G_X : \Omega \to \mathbb{R}$ where $G_X$ is the quantile function as defined before, $\Omega=(0,1)$, $\mathcal{F}$ is the Borel $\sigma$-algebra on $(0,1)$, and $\mathbb{P}$ is the Lebesgue measure. Note that on this space the identity function is a uniform random variable on $(0,1)$, so this is really the same construction as the one described above.
In any case these constructions can be generalized to finitely many random variables by looking at the uniform distribution on $(0,1)^n$ instead of $(0,1)$.
Let
$$F(x)=\begin{cases}0&\text{for }x<0\\\frac x 2&\text{for }0\le x <\frac12\\\frac{x+1}2&\text{for }\frac12\le x<1\\1&\text{for }x\ge 1.\end{cases}$$
This is the cdf of a random variable whose range is the interval $[0,1]$, but $F(x)$ is discontinuous as a function of $x$. The distribution is a non-trivial convex combination of a continuous and a discrete part: explicitly, $F=\frac12 F_{\text{unif}}+\frac12 F_{\delta_{1/2}}$, where $F_{\text{unif}}$ is the cdf of the uniform distribution on $[0,1]$ (a continuous distribution) and $F_{\delta_{1/2}}$ is the cdf of the point mass at $x=\frac12$ (a discrete distribution). Since the support of the discrete part is contained in the interval supporting the continuous part, the range is still an interval.
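One can sample from this mixed distribution with the same quantile-function trick as before. A short sketch (the function name `quantile` is my own): working out $G(p)=\inf\{x : F(x)\geq p\}$ for this $F$ gives $G(p)=2p$ for $p\leq\frac14$, $G(p)=\frac12$ for $\frac14<p\leq\frac34$, and $G(p)=2p-1$ for $p>\frac34$, so the jump of height $\frac12$ in $F$ becomes a flat region of length $\frac12$ in $G$:

```python
import random

def quantile(p):
    # Quantile function of the mixed cdf F above: the jump of height 1/2
    # at x = 1/2 in F becomes a flat region of length 1/2 in G.
    if p <= 0.25:
        return 2 * p
    elif p <= 0.75:
        return 0.5
    else:
        return 2 * p - 1

random.seed(1)
n = 100_000
samples = [quantile(random.random()) for _ in range(n)]
# Fraction of samples landing exactly on the atom at x = 1/2:
atom = sum(1 for s in samples if s == 0.5) / n
```

Roughly half the samples land exactly on $\frac12$ (the discrete part), while the rest spread uniformly over $[0,1]$ (the continuous part).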
A solid grasp of the technical definition of a random variable requires measure theory, which in turn requires sigma algebras.
I will try to give a relatively non-technical definition. First note that we require a set $\Omega$, called the sample space, which roughly contains everything that could possibly happen in our experiment. The elements $\omega \in \Omega$ are the individual outcomes that can occur.
A random variable is a function from $\Omega$ to $\mathbb R$ that has a special property that helps make rigorous probability theory work. (This special property is called being measurable, which you can look up if you want to.) So given any outcome of the experiment $\omega$, you get $X(\omega)$, which is a real number.
A CDF is a function $F(c) = P\big(\{\omega \in \Omega \colon X(\omega) \leq c \} \big)$, or more informally $P(X \leq c)$, that gives the probability that the random variable $X$ is less than or equal to $c$. Note that the CDF is a function of $c$. (In formal probability theory, the CDF is the fundamental object from which pdfs and pmfs are derived.)
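To make this concrete, here is a toy sketch of the whole setup on a finite sample space (a single fair die roll; the names `omega`, `P`, `X`, and `cdf` are mine, chosen for illustration):

```python
from fractions import Fraction

# Sample space for one roll of a fair die, with the probability measure
# assigning each outcome weight 1/6.
omega = [1, 2, 3, 4, 5, 6]
P = {w: Fraction(1, 6) for w in omega}

# A random variable is just a function from Omega to the reals;
# here X doubles the roll.
def X(w):
    return 2 * w

def cdf(c):
    """F(c) = P({w in Omega : X(w) <= c})."""
    return sum(P[w] for w in omega if X(w) <= c)

# e.g. F(7) = P(X in {2, 4, 6}) = 1/2
```

Notice that `X` itself contains no randomness at all; it is just a function, and all the probability lives in the measure `P` on the sample space.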
Aside: note that there is no randomness in the definition of a random variable. It's just a function.