Different definition of discrete random variable

Tags: definition, probability theory, random variables

In our lecture, the professor defines a discrete random variable as follows:

Let $(\Omega,\mathcal{F},p)$ be a discrete probability space (where $\mathcal{F}$ is simply the power set). Then any function $X:\Omega\to\mathbb{R}$ is a discrete random variable.

I am wondering whether this definition is too narrow, or perhaps even wrong.

If we consider the uncountable probability space
$$
\left([0,1],\left\{\left[0,\frac{1}{2}\right),\left[\frac{1}{2},1\right],\left[0,1\right],\emptyset\right\},p\right)
$$

and an $\mathcal{F}$-measurable function
$$
Y:[0,1]\to\{0,1\}.
$$

Then this function would not be a discrete random variable under the above definition, which seems rather strange…
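For concreteness (this particular choice of $Y$ is my own illustration, not from the lecture), take $Y$ to be the indicator function of the upper half of the interval:
$$
Y(\omega) = \mathbf{1}_{[1/2,1]}(\omega) = \begin{cases} 1 & \text{if } \omega \in \left[\frac{1}{2},1\right], \\ 0 & \text{if } \omega \in \left[0,\frac{1}{2}\right). \end{cases}
$$
This $Y$ is measurable, since $Y^{-1}(\{1\}) = \left[\frac{1}{2},1\right]$ and $Y^{-1}(\{0\}) = \left[0,\frac{1}{2}\right)$ both belong to $\mathcal{F}$, and it takes only the two values $0$ and $1$; yet the sample space $[0,1]$ is uncountable, so the professor's definition never applies to it.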

Is there any reason for defining a discrete random variable the way our professor did?

(See also Definition of Discrete Random Variable, but it doesn't fully answer my question)

Best Answer

The statement you give can be viewed as a theorem (if $p$ then $q$) rather than a definition of $q$:

Theorem: Fix probability space $(\Omega, \mathcal{F}, P)$. If $\Omega$ is a finite or countably infinite sample space then every random variable $X:\Omega\rightarrow \mathbb{R}$ must be a discrete random variable.

Proof: Let $X:\Omega\rightarrow\mathbb{R}$ be a random variable. If $\Omega$ is finite or countably infinite, then the image $X(\Omega)$ is also finite or countably infinite, so $X$ is discrete. $\Box$

The nice thing about random variables is that, once you know their distribution, you do not need to know the sample space. Most people would define a binary-valued Bernoulli random variable to be a discrete random variable, even if the sample space is uncountably infinite.

If your professor is suggesting that all discrete random variables have finite or countably infinite sample spaces, then your professor is "cutting corners" for convenience. Such a definition is not satisfactory because it would never allow a probability space to have two random variables $X:\Omega\rightarrow\mathbb{R}$ and $Y:\Omega\rightarrow\mathbb{R}$ such that $X$ is discrete and $Y$ is continuous. You can see the professor is already cutting corners by assuming the sigma algebra is just the full power set.

[This is not necessarily bad: most of the basic probability concepts can be developed on finite or countably infinite sample spaces using the power set sigma algebra. But one must use uncountably infinite sample spaces (and more sophisticated sigma algebras) when treating more advanced things such as continuous random variables and/or infinite sequences of random variables.]


There are various definitions of "discrete random variable" in use. Throughout, let $(\Omega, \mathcal{F}, P)$ be a probability space (the sample space $\Omega$ may be uncountably infinite).

Definition 1: A random variable $X:\Omega\rightarrow\mathbb{R}$ is said to be discrete if the image $X(\Omega)$ is a finite or countably infinite set.

Definition 2 (not equivalent): A random variable $X:\Omega\rightarrow\mathbb{R}$ is said to be discrete if there is a finite or countably infinite subset $A \subseteq \mathbb{R}$ such that $P[X \in A]=1$.

It is easy to prove that if a random variable satisfies Definition 1 then it also satisfies Definition 2. Thus, the proof of the theorem I stated in the first part of my answer holds under either of these two definitions of "discrete random variable."
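One quick way to see this implication (a short sketch of my own, using only the definitions above): if $X$ satisfies Definition 1, take $A = X(\Omega)$, which is finite or countably infinite by assumption. Then
$$
P[X \in A] = P\big(\{\omega \in \Omega : X(\omega) \in X(\Omega)\}\big) = P(\Omega) = 1,
$$
so $X$ satisfies Definition 2.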

Of course, Definition 2 is more general, and it is possible for a random variable to satisfy Definition 2 without satisfying Definition 1. Consider the probability space $([0,1], \mathcal{B}([0,1]), \lambda)$, where $\lambda$ is the standard Borel measure (length) on the unit interval. Let $V$ be any uncountably infinite, Borel measurable subset of $[0,1]$ with $\lambda(V)=0$ (for example, the Cantor set). Define the random variable $X:\Omega\rightarrow\mathbb{R}$ by
$$
X(\omega) = \begin{cases} 0 & \text{if } \omega < 1/2 \text{ and } \omega \notin V, \\ 1 & \text{if } \omega \geq 1/2 \text{ and } \omega \notin V, \\ \omega & \text{if } \omega \in V. \end{cases}
$$
Then $X(\Omega) = V \cup \{0,1\}$ is uncountably infinite, so $X$ does not satisfy Definition 1. But $X$ satisfies Definition 2 with $A=\{0,1\}$. The distribution of $X$ is $Bernoulli(1/2)$.
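To double-check the last two claims (a short verification of my own): since $\lambda(V)=0$,
$$
P[X \in \{0,1\}] = \lambda\big([0,1]\setminus V\big) = 1 - \lambda(V) = 1,
$$
so $X$ satisfies Definition 2 with $A=\{0,1\}$, and
$$
P[X=0] = \lambda\big([0,1/2)\setminus V\big) = \frac{1}{2}, \qquad P[X=1] = \lambda\big([1/2,1]\setminus V\big) = \frac{1}{2},
$$
which is exactly the $Bernoulli(1/2)$ distribution.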
