Weird definition of discrete random variable


My textbook, Introduction to Probability by Blitzstein and Hwang, gives the following definition of a discrete random variable (p. 94):

A random variable $X$ is said to be discrete if there is a finite list of values $a_1, a_2, \dots, a_n$ or an infinite list of values $a_1, a_2, \dots$ such that $P(X = a_j \ \text{for some $j$}) = 1$.

Initially, this definition seemed a bit weird to me. It appears to say that a random variable $X$ is discrete if the probability of $X$ being equal to some outcome $a_j$ in the sample space, for some $j$, is equal to $1$; in other words, that it is certain that some outcome will occur. Would I be correct in presuming that the author defines a discrete random variable in this way in order to contrast it with a continuous random variable, for which the probability that the random variable equals any particular outcome in the sample space is $0$?

Best Answer

Your interpretation of what the author is doing is incorrect. A priori, a random variable $X$ is a function from the sample space $\Omega$ into the real numbers $\mathbb R$. The author is saying that, even though the function could in principle take any real value, there is a distinguished finite (or countably infinite) set of values that it takes with probability $1$. In the finite case, this means that the set $\{a_1,a_2,\ldots,a_n\}$ has the property that the event $$ \bigl\{\omega\in\Omega\colon X(\omega)\in \{a_1,a_2,\ldots,a_n\}\bigr\} $$ is assigned a probability of $1$.

For example, consider the sample space $\Omega=\{0,1\}$ with equal probability assigned to each of the two values (i.e., flipping a fair coin), and define a random variable $X\colon \{0,1\}\to\mathbb R$ by setting $X(0)=\pi$ and $X(1)=e$. It satisfies the author's definition since one can choose $\{a_1,a_2\}=\{\pi,e\}$, and then the probability assigned to the event $\{\omega\colon X(\omega)\in \{\pi,e\}\}$ is $1$, even though other choices of $a_i$ yield events of probability $0$ or $\tfrac 12$. For the definition to work, there just needs to be some finite (or countably infinite) list of values that captures all of the probability.
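As a concrete sanity check of this example, here is a short Python sketch that simulates the fair coin and verifies empirically that $X$ always lands in $\{\pi, e\}$, while a smaller set like $\{\pi\}$ captures only about half the probability. The names `X`, `support`, and the sample count are illustrative, not from the book:

```python
import math
import random

# The example above: sample space Omega = {0, 1} (a fair coin),
# with X(0) = pi and X(1) = e.
def X(omega):
    return math.pi if omega == 0 else math.e

random.seed(0)
samples = [X(random.choice([0, 1])) for _ in range(10_000)]

# The finite set {pi, e} captures all of the probability:
# every sampled value of X lands in it, so P(X in {pi, e}) = 1.
support = {math.pi, math.e}
all_in_support = all(x in support for x in samples)
print(all_in_support)  # True

# A different choice of values, {pi} alone, captures only about
# half of the probability.
frac_pi = sum(x == math.pi for x in samples) / len(samples)
print(frac_pi)
```

Of course, the simulation only illustrates the definition; the probability-$1$ statement itself is exact, since $X$ takes no values outside $\{\pi, e\}$ by construction.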

By the way, note that the finite-list case alone would more accurately be called "finite range": on its own it would exclude some extremely famous discrete random variables, such as a geometric random variable, whose possible values are $1, 2, 3, \ldots$. The clause allowing an infinite list $a_1, a_2, \ldots$ is what brings those examples into the definition.
