A (real valued) random variable is just a measurable map $X : \Omega \to \Bbb{R}$, where $(\Omega, \mathcal{F}, \Bbb{P})$ is an arbitrary probability space.
We can then consider the push-forward measure $\Bbb{P}_X = X_\ast \Bbb{P}$ of $\Bbb{P}$ under $X$; this measure is sometimes called the distribution of $X$. By definition, we have
$$
X_\ast \Bbb{P} (E) = \Bbb{P}(X^{-1}(E)) = \Bbb{P}(X \in E),
$$
for any (measurable) $E \subset \Bbb{R}$, so that (check this) $\Bbb{P}_X$ is a probability measure on $\Bbb{R}$. Note that the last expression is the one that most mathematicians in probability theory would use.
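On a finite probability space the push-forward can be computed directly, which may make the definition more concrete. Below is a minimal sketch; the names `Omega`, `P`, `X`, and `pushforward` are illustrative choices, not notation from the text.

```python
from fractions import Fraction

# Toy finite probability space: a fair die with uniform measure.
Omega = [1, 2, 3, 4, 5, 6]
P = {w: Fraction(1, 6) for w in Omega}

# A random variable X : Omega -> R, here the parity of the face.
def X(w):
    return w % 2

def pushforward(E):
    """P_X(E) = P(X^{-1}(E)) = P(X in E) for a (finite) set E of reals."""
    return sum(P[w] for w in Omega if X(w) in E)

print(pushforward({1}))      # P(X = 1) = 1/2
print(pushforward({0, 1}))   # total mass: 1
```

Since $X$ only takes the values $0$ and $1$, the push-forward of the uniform measure on the six faces is concentrated on $\{0, 1\}$, and its total mass is $1$, as a probability measure on $\Bbb{R}$ must have.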
Now - as you already stated yourself - we can associate to every (locally finite) measure $\mu$ on $\Bbb{R}$ the distribution function $F = F_\mu$ of $\mu$, given by
$$
F_\mu (x) = \mu((-\infty, x]).
$$
In this way, we can also associate to the measure $\Bbb{P}_X$ the distribution function $F_X = F_{\Bbb{P}_X}$ which satisfies
$$
F_X (a) = \Bbb{P}_X ((-\infty, a]) = \Bbb{P}(X \in (-\infty, a]) = \Bbb{P}(X \leq a).
$$
Sometimes, this is also called the distribution of $X$. (Note that we now call both the measure $\Bbb{P}_X$ and its distribution function $F_X = F_{\Bbb{P}_X}$ the "distribution of $X$". But since each of these two objects uniquely determines the other, this is not much of a problem.)
Finally, none of this has much to do with the properties of $X$ as a function (i.e. with properties like continuity of $X$). To see this, note that $\Omega$ is an arbitrary probability space, which need not carry any topology. Hence, it does not even make sense in general to talk about continuity of $X$.
There is, however, a different notion of a continuous random variable: we call $X$ a continuous random variable if the distribution function $F_X$ is continuous. This is equivalent to the condition $\Bbb{P}(X = a) = 0$ for all $a$ (why? consider the jump $F_X(a) - F_X(a-)$), and thus has nothing to do with continuity of $X$ as a function (as noted above, that concept does not even make sense in general).
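The equivalence can be illustrated numerically: $\Bbb{P}(X = a)$ is the jump of $F_X$ at $a$, so a continuous CDF has no atoms. A minimal sketch, with the hypothetical names `uniform_cdf`, `bernoulli_cdf`, and `jump` (not from the text):

```python
# P(X = a) equals the jump F_X(a) - F_X(a-); a continuous CDF
# therefore forces P(X = a) = 0 for every a.

def uniform_cdf(x):
    """CDF of Lebesgue measure restricted to [0, 1]: continuous everywhere."""
    return min(max(x, 0.0), 1.0)

def bernoulli_cdf(x):
    """CDF of a fair coin flip: jumps at 0 and at 1."""
    if x < 0.0:
        return 0.0
    if x < 1.0:
        return 0.5
    return 1.0

def jump(F, a, eps=1e-9):
    """Approximate the jump F(a) - F(a-) with a small left offset."""
    return F(a) - F(a - eps)

print(jump(uniform_cdf, 0.5))    # essentially 0: no atom at 0.5
print(jump(bernoulli_cdf, 0.0))  # 0.5: an atom of mass 1/2 at 0
```

The uniform CDF is continuous, so every jump is (numerically) zero, while the coin-flip CDF has an atom of mass $1/2$ at $0$.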
Short summary:
1) Each real-valued random variable comes with its own cumulative distribution function. If we place additional assumptions on $X$, then it might be the case that this distribution function is the one associated with Lebesgue measure. Note that we have to restrict Lebesgue measure to (e.g.) an interval of length $1$ to do this, because otherwise it is not a probability measure.
2) As explained above, the associated CDF is given by
$$
F_X (a) = \Bbb{P}(X \leq a).
$$
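As a sanity check of the identity $F_X(a) = \Bbb{P}(X \leq a)$, one can estimate the CDF from i.i.d. samples by the empirical CDF, which converges to $F_X$ (this is the Glivenko-Cantelli theorem). A hypothetical sketch for $X$ uniform on $[0, 1]$, i.e. Lebesgue measure restricted to an interval of length $1$, where $F_X(a) = a$ on $[0, 1]$; the names `samples` and `empirical_cdf` are illustrative:

```python
import random

random.seed(0)

# X uniform on [0, 1] (Lebesgue measure restricted to [0, 1]),
# so F_X(a) = a for a in [0, 1].
samples = [random.random() for _ in range(100_000)]

def empirical_cdf(a):
    """Fraction of samples <= a: a Monte Carlo estimate of P(X <= a)."""
    return sum(s <= a for s in samples) / len(samples)

for a in (0.25, 0.5, 0.75):
    print(a, round(empirical_cdf(a), 3))   # close to a itself
```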
The function $F^{-1+}$, defined by $F^{-1+}(y) := \sup\{x :\, F(x) \leq y\}$, is continuous from the right. To see this, let
$y_{0}$ be such that $x_{0} := F^{-1+}(y_{0}) \in \mathbb{R}$ and consider a
sequence $y_{n} \searrow y_{0}$. Set $x_{n} := F^{-1+}(y_{n})$. Since $F^{-1+}$
is non-decreasing, $F^{-1+}(y_{0}) \leq F^{-1+}(y_{n+1}) \leq F^{-1+}(y_{n})$,
and so $x_{0} \leq x_{n+1} \leq x_{n}$. Being non-increasing and bounded below, the
sequence satisfies $x_{n} \searrow x$ for some $x$ with $x_{0} \leq x$.
Since $x_{n} = F^{-1+}(y_{n}) = \sup\{x :\, F(x) \leq y_{n}\}$, for every
$\varepsilon > 0$ we have $F(x_{n} - \varepsilon) \leq y_{n} < F(x_{n} + \varepsilon)$.
Suppose, for a contradiction, that $x_{0} < x$. Taking $\varepsilon := \frac{x - x_{0}}{2}$, we
have $x_{n} - \varepsilon \geq x - \varepsilon = x_{0} + \varepsilon$, and so by monotonicity
$y_{n} \geq F(x_{n} - \varepsilon) \geq F(x_{0} + \varepsilon) > y_{0}$, where the last
inequality holds because $x_{0} + \varepsilon$ lies strictly above the supremum
$x_{0} = \sup\{x :\, F(x) \leq y_{0}\}$. Letting $n \rightarrow \infty$ and using the fact
that $y_{n} \searrow y_{0}$, we obtain $y_{0} \geq F(x_{0} + \varepsilon) > y_{0}$, a
contradiction. This shows that $x_{n} \searrow x_{0}$, that is, that $F^{-1+}$
is continuous from the right.
That $F^{-1+}$ admits a limit from the left at every point follows from the fact that
$F^{-1+}$ is non-decreasing. So (2) holds with "right continuous" in place of "left continuous".
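The right-continuity argument can be checked numerically on a step CDF, where $F^{-1+}$ has visible flat stretches and jumps. A sketch with illustrative names (`points`, `cum`, `F`, `Finv_plus`); the bisection locates $\sup\{x : F(x) \leq y\}$ since $\{x : F(x) \leq y\}$ is an interval unbounded below:

```python
import bisect

# A right-continuous step CDF placing mass 1/4, 1/2, 1/4
# at the points 0, 1, 2.
points = [0.0, 1.0, 2.0]
cum = [0.25, 0.75, 1.0]

def F(x):
    """Right-continuous step CDF."""
    i = bisect.bisect_right(points, x)
    return cum[i - 1] if i > 0 else 0.0

def Finv_plus(y, lo=-10.0, hi=10.0, iters=80):
    """F^{-1+}(y) = sup{x : F(x) <= y}, located by bisection (0 <= y < 1)."""
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if F(mid) <= y:
            lo = mid      # mid still lies in {x : F(x) <= y}
        else:
            hi = mid
    return lo

# F^{-1+}(0.25) = sup (-inf, 1) = 1, and the value stays at 1 as y
# decreases to 0.25 from the right: right-continuity at y0 = 0.25.
for y in (0.30, 0.26, 0.25):
    print(y, round(Finv_plus(y), 6))
```

Note that $F^{-1+}$ here is constant equal to $1$ on the whole interval $[0.25, 0.75)$, so approaching $y_0 = 0.25$ from the right changes nothing, exactly as the proof above predicts.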
Well, if $Q(p)$ is well-defined and monotone on the interval $(0,1)$, then certainly.