If the range of a random variable is an interval, is its CDF continuous?

probability-theory, random-variables

Let $X:\Omega\to\mathbb{R}$ be a random variable whose range is an interval. Does it follow that its CDF is continuous?

Many probability books define a continuous random variable as one that takes values on a "continuous scale," whatever that means. More rigorous books define it as a random variable whose CDF is continuous. I'm trying to reconcile these two definitions.

Best Answer

Let $$F(x)=\begin{cases}0&\text{for }x<0\\\frac x 2&\text{for }0\le x <\frac12\\\frac{x+1}2&\text{for }\frac12\le x<1\\1&\text{for }x\ge 1.\end{cases}$$
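A quick check, spelled out directly from the formula above: the left limit and the value at $x=\frac12$ disagree, so $F$ has a jump of size $\frac12$ there,

$$\lim_{x\uparrow\frac12}F(x)=\frac{1/2}{2}=\frac14,\qquad F\!\left(\tfrac12\right)=\frac{\frac12+1}{2}=\frac34.$$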

This is the cdf of a random variable whose range is the interval $[0,1]$, yet $F(x)$ is discontinuous as a function of $x$, with a jump at $x=\frac12$. The distribution is a non-trivial convex combination, with weight $\frac12$ each, of the uniform distribution on $[0,1]$, which is a continuous distribution, and the discrete point mass at $x=\frac12$. Since the support of the discrete part is a subset of the interval supporting the continuous part, the range is still the whole interval $[0,1]$.
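Written out, the convex combination is (with $F_U$ denoting the cdf of the uniform distribution on $[0,1]$, a label introduced here for convenience)

$$F(x)=\tfrac12\,F_U(x)+\tfrac12\,\mathbf 1_{[1/2,\infty)}(x),$$

which reproduces the piecewise formula above case by case. If you want to see the jump numerically, here is a minimal simulation sketch (my own addition, assuming NumPy is available) that samples from this half-and-half mixture and estimates the cdf just below and at $\frac12$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# With probability 1/2 take a Uniform[0, 1] draw, otherwise the atom at 1/2.
use_uniform = rng.random(n) < 0.5
samples = np.where(use_uniform, rng.random(n), 0.5)

# Empirical cdf just below the atom and at the atom.
print(np.mean(samples < 0.5))   # about 0.25, matching F(1/2-) = 1/4
print(np.mean(samples <= 0.5))  # about 0.75, matching F(1/2)  = 3/4
```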