Why does a cumulative distribution have the quality that $P(X<b)=\lim_{n\to\infty}\left[P\left(X\leqslant b-\frac1n\right)\right]$

cumulative-distribution-functions, probability, probability-theory

I came across the following passage in Ross' "First Course in Probability":

If we want to compute the probability that $X$ is strictly less than $b$, we can apply the continuity property to obtain:
$$P\bigl(X<b\bigr)=\lim_{n\to\infty}\left[P\left(X\leqslant b-\frac1n\right)\right].$$

The continuity Ross is referring to is the right continuity of the cumulative distribution function, but the property he mentions seems to me to be a property of left continuity, since the sequence $b-\dfrac1n$ is increasing and converges to $b$ from the left as $n$ goes to infinity.
I will be grateful for any enlightening remarks on this.

Best Answer

This is a consequence of the continuity of measures. (As pointed out in a different answer, see Section 2.6 in Ross's book.)

Continuity from below

Let $\mu$ be a measure on $(\mathcal{X},\mathcal{E})$, and let $E,E_1,E_2,\ldots$ be sets in $\mathcal{E}$. If $E_n \nearrow E$, then $\mu(E_n) \nearrow \mu(E)$.

Application

Let $(\Omega,\mathcal{F},\mathbb{P})$ be a background probability space, and let $X : (\Omega,\mathcal{F}) \to (\mathbb{R},\mathbb{B}(\mathbb{R}))$ be some random variable. Denote by $X(\mathbb{P})$ the pushforward measure on $(\mathbb{R},\mathbb{B}(\mathbb{R}))$ given by $X(\mathbb{P})(A) = \mathbb{P}(X \in A)$ for $A\in\mathbb{B}(\mathbb{R})$.

Let $b\in\mathbb{R}$, let $B = (-\infty,b)$, and let $B_n = (-\infty,b-1/n]$ for $n\in\mathbb{N}$. Clearly, $B, B_1, B_2, \ldots$ are elements of $\mathbb{B}(\mathbb{R})$, and $B_n \nearrow B$. Then by the continuity from below of the probability measure $X(\mathbb{P})$, we find that $$ X(\mathbb{P})(B_n) \nearrow X(\mathbb{P})(B). $$ Rewriting according to the definition of the pushforward measure yields \begin{align*} \mathbb{P}(X \leq b-1/n) \nearrow \mathbb{P}(X < b) \end{align*} as desired.
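As a numerical sanity check (a made-up example, not from Ross), the limit is easiest to see for a distribution with an atom at $b$, where $\mathbb{P}(X < b)$ and $\mathbb{P}(X \leq b)$ genuinely differ:

```python
# Hypothetical mixed distribution with an atom at b = 1:
# X = U with prob 1/2 (U uniform on [0,1]) and X = 1 with prob 1/2.
def cdf(x):
    """P(X <= x) for the mixed distribution above."""
    cont = 0.5 * min(max(x, 0.0), 1.0)   # continuous part: uniform on [0,1]
    atom = 0.5 if x >= 1 else 0.0        # point mass at 1
    return cont + atom

b = 1.0
# P(X <= b - 1/n) increases toward P(X < b) = 0.5, not toward P(X <= b) = 1.
approx = [cdf(b - 1.0 / n) for n in (1, 10, 100, 1000)]
print(approx)   # increasing, approaching 0.5
print(cdf(b))   # 1.0, strictly larger because of the atom at b
```

The left limit of the CDF at $b$ recovers $\mathbb{P}(X < b)$; the jump of size $1/2$ at $b$ is exactly the mass of the atom.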

Direct proof from axioms

Recall Kolmogorov's axioms: $\mathbb{P}$ is non-negative, $\mathbb{P}(\Omega)=1$, and if $E_1,E_2,\ldots$ is a sequence of mutually exclusive events, then \begin{align*} \mathbb{P}\!\left(\cup_{n=1}^\infty E_n\right) = \sum_{n=1}^\infty \mathbb{P}(E_n). \end{align*} This last axiom is known as countable additivity.

Let $F_n = (X \leq b-1/n)$ for $n\in\mathbb{N}$. The $F_n$ are increasing, and $\cup_{n=1}^\infty F_n = (X < b)$. Define $E_n$ recursively by $E_1 = F_1$, $E_2 = F_2 \setminus F_1$, $E_3 = F_3 \setminus F_2$, and so on. For example, $E_2 = (b-1 < X \leq b-1/2)$. Note that $E_1,E_2,\ldots$ is a sequence of mutually exclusive events, and that $\cup_{n=1}^\infty E_n = \cup_{n=1}^\infty F_n = (X < b)$. Then, applying first countable additivity and then finite additivity (both instances of Kolmogorov's last axiom), \begin{align*} \mathbb{P}(X < b) &= \mathbb{P}\!\left(\cup_{n=1}^\infty E_n\right) \\ &= \sum_{n=1}^\infty \mathbb{P}(E_n) \\ &= \lim_{N\to\infty} \sum_{n=1}^N \mathbb{P}(E_n) \\ &= \lim_{N\to\infty} \mathbb{P}(\cup_{n=1}^N E_n) \\ &= \lim_{N\to\infty} \mathbb{P}(F_N) = \lim_{N \to \infty}\mathbb{P}(X \leq b - 1/N). \end{align*}
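The disjointification trick can be checked concretely on a toy model (a hypothetical finite sample space with uniform probability, chosen only for illustration): the pieces $E_n = F_n \setminus F_{n-1}$ are pairwise disjoint, their union is $(X < b)$, and the partial sums of $\mathbb{P}(E_n)$ telescope to $\mathbb{P}(F_N)$.

```python
from fractions import Fraction

# Toy sample space: Omega = {0, 1/2, 2/3, ..., 6/7, 1}, uniform, X(w) = w, b = 1.
omega = [Fraction(0)] + [Fraction(n - 1, n) for n in range(2, 8)] + [Fraction(1)]
p = Fraction(1, len(omega))          # uniform probability of each point
b = Fraction(1)

# F_n = (X <= b - 1/n) as explicit sets of outcomes, for n = 1, ..., 7.
F = [set(w for w in omega if w <= b - Fraction(1, n)) for n in range(1, 8)]
# Disjointify: E_1 = F_1, E_n = F_n \ F_{n-1} for n >= 2.
E = [F[0]] + [F[n] - F[n - 1] for n in range(1, len(F))]

# The E_n are pairwise disjoint, and their union is the event (X < b).
union_E = set().union(*E)
assert union_E == set(w for w in omega if w < b)
prob_sum = sum(len(e) * p for e in E)       # countable additivity over the pieces
print(prob_sum, len(union_E) * p)           # equal
```

Exact `Fraction` arithmetic keeps the telescoping identity $\sum_{n=1}^N \mathbb{P}(E_n) = \mathbb{P}(F_N)$ free of rounding noise.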