I am a bit confused about how continuous random variables are defined. For example, according to Wikipedia the exponential distribution, for $\lambda > 0,$ has CDF $F(x) = 1-e^{-\lambda x}$ for non-negative $x$ (and $F(x) = 0$ otherwise). So presumably it is possible to define a continuous random variable $X$ with $F$ as its CDF. But to be formal we need to define $X$ as a measurable function from a sample space $(\Omega, \Sigma_{\Omega}),$ equipped with some probability measure $P,$ to the real numbers (with the Borel sigma-algebra). So my question is: how should $(\Omega, \Sigma_{\Omega}),$ $X$ and $P$ be chosen? For example, would it be okay to choose $\Omega = [0, \infty),$ with $X$ the inclusion map and $P$ the probability measure on $[0, \infty)$ with the aforementioned CDF?
What probability space should I use with the exponential distribution?
Related Solutions
A (real valued) random variable is just a measurable map $X : \Omega \to \Bbb{R}$, where $(\Omega, \mathcal{F}, \Bbb{P})$ is an arbitrary probability space.
What we can then do is to consider the push-forward measure $\Bbb{P}_X = X_\ast \Bbb{P}$ of $\Bbb{P}$ by $X$. This is sometimes called the distribution of $X$. By definition, we have
$$ X_\ast \Bbb{P} (E) = \Bbb{P}(X^{-1}(E)) = \Bbb{P}(X \in E), $$
for any (measurable) $E \subset \Bbb{R}$, so that (check this) $\Bbb{P}_X$ is a probability measure on $\Bbb{R}$. Note that the last expression is the one that most mathematicians in probability theory would use.
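The push-forward definition is easiest to see on a finite probability space, where the preimage sum can be computed directly. Below is a small sketch (the fair-die space and the particular map $X$ are illustrative choices, not from the answer above):

```python
from fractions import Fraction

# A toy finite probability space: Omega = {1,...,6} (a fair die),
# with P assigning mass 1/6 to each outcome.
Omega = [1, 2, 3, 4, 5, 6]
P = {omega: Fraction(1, 6) for omega in Omega}

# A random variable X : Omega -> R; here X(omega) = omega mod 2,
# the indicator of an odd roll.
def X(omega):
    return omega % 2

# The push-forward measure P_X(E) = P(X^{-1}(E)) = P(X in E),
# computed by summing P over the preimage of E.
def pushforward(E):
    return sum(P[omega] for omega in Omega if X(omega) in E)

print(pushforward({1}))      # P(X = 1): the three odd faces give 1/2
print(pushforward({0, 1}))   # the whole range has mass 1, so P_X is a probability measure
```

Note that $\Bbb{P}_X$ lives on the range of $X$ and forgets everything about $\Omega$ except how much mass $X$ transports to each value.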
Now - as you already stated yourself - we can associate to every (locally finite) measure $\mu$ on $\Bbb{R}$ the distribution function $F = F_\mu$ of $\mu$, given by
$$ F_\mu (x) = \mu((-\infty, x]). $$
In this way, we can also associate to the measure $\Bbb{P}_X$ the distribution function $F_X = F_{\Bbb{P}_X}$ which satisfies
$$ F_X (a) = \Bbb{P}_X ((-\infty, a]) = \Bbb{P}(X \in (-\infty, a]) = \Bbb{P}(X \leq a). $$
Sometimes this is also called the distribution of $X$ (note that we now call both the measure $\Bbb{P}_X$ and its distribution function $F_X = F_{\Bbb{P}_X}$ the "distribution of $X$"; but since each of these two objects uniquely determines the other, this is not much of a problem).
Finally, none of this has much to do with the properties of $X$ as a function (i.e. with properties like continuity of $X$). To see this, note that $\Omega$ is an arbitrary probability space; in general it carries no topology, so it does not even make sense to talk about continuity of $X$.
There is a different notion of a continuous random variable: we call $X$ a continuous random variable if the distribution function $F_X$ is continuous. This is equivalent to the condition $\Bbb{P}(X = a) = 0$ for all $a$ (why?), and thus has nothing to do with continuity of $X$ as a function (which, as noted above, does not even make sense in general).
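One way to fill in the "(why?)": every CDF is non-decreasing and right-continuous, so the only possible discontinuities are left jumps, and by continuity of measure from below,
$$ \Bbb{P}(X = a) = \Bbb{P}(X \leq a) - \Bbb{P}(X < a) = F_X(a) - \lim_{x \to a^-} F_X(x). $$
Hence $F_X$ is continuous at $a$ if and only if $\Bbb{P}(X = a) = 0$, and $F_X$ is continuous everywhere if and only if this holds for every $a$.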
Short summary:
1) Each real-valued random variable comes with its own cumulative distribution function. If we place additional assumptions on $X$, then it might be the case that this distribution function is the one associated to Lebesgue measure. Note that we have to restrict Lebesgue measure to (e.g.) an interval of length $1$ to do this, because otherwise it is not a probability measure.
2) As explained above, the associated CDF is given by
$$ F_X (a) = \Bbb{P}(X \leq a). $$
The distribution or the law of a random variable $X$ is a probability measure $\mathcal L$ on $(\mathbb R,\mathcal R)$, where $\mathcal R$ is the Borel $\sigma$-algebra on $\mathbb R$; in particular, $\mathcal L:\mathcal R\to[0,1]$. The cumulative distribution function (CDF) of a random variable $X$ is the function $F_X:\mathbb R\to[0,1]$ such that $F_X(x)=\Pr\{X\le x\}$ for $x\in\mathbb R$. If we know the distribution of the random variable $X$, then we also know its CDF. It is also true that the CDF uniquely determines the distribution of the random variable $X$ (see this question).
I hope this helps.
Best Answer
Given a univariate random variable $X$ with CDF $F$, the standard probability space on which we can construct $X$ is the unit interval $[0,1]$ equipped with Lebesgue measure. If we define $$ X(\omega):=F^{-1}(\omega),$$ where $F^{-1}(\omega) := \inf\{x \in \mathbb R : F(x) \geq \omega\}$ is the generalized inverse (quantile function) of the CDF, you can check that $X$ is measurable and possesses the required distribution. For the exponential distribution, $F$ is continuous and strictly increasing on $[0,\infty)$, so $F^{-1}$ is the ordinary inverse: solving $1-e^{-\lambda x} = \omega$ gives $F^{-1}(\omega) = -\ln(1-\omega)/\lambda$.
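This construction is exactly inverse-transform sampling, so it can be sanity-checked numerically: a uniform point $\omega \in [0,1]$ plays the role of a Lebesgue-distributed sample from $\Omega$, and $X(\omega) = F^{-1}(\omega)$ should then be exponentially distributed. A sketch (the rate $\lambda = 0.5$, the point $a = 3$ and the sample size are illustrative choices):

```python
import math
import random

random.seed(1)
lam = 0.5  # any rate lambda > 0; value chosen for illustration

# On Omega = [0,1] with Lebesgue measure (simulated by uniform draws),
# define X(omega) = F^{-1}(omega).  Solving F(x) = 1 - e^{-lambda x} = omega
# for x gives the inverse F^{-1}(omega) = -ln(1 - omega) / lambda.
def X(omega):
    return -math.log(1.0 - omega) / lam

# Check that X has the required distribution: the fraction of uniform
# omega with X(omega) <= a should approximate F(a) = 1 - e^{-lambda a}.
n = 100_000
a = 3.0
hits = sum(X(random.random()) <= a for _ in range(n)) / n
print(abs(hits - (1 - math.exp(-lam * a))) < 0.01)  # empirical CDF matches F
```

Note that `random.random()` returns values in $[0,1)$, so the argument of the logarithm is always strictly positive and $X$ is well defined on every sampled point.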
For a much more detailed discussion, see this.