Confirming the definition of the infimum of a random variable

probability, probability theory

The infimum of a function $f: X \rightarrow Y$ is the greatest value that every value $f(x)$ is greater than or equal to.

A random variable is essentially a function that maps events (the inputs, playing the role of $X$ in an ordinary function) to values on the real number line. So, in my understanding, the infimum of a r.v. is the greatest value on the real number line that is less than or equal to all the values that have been assigned to events?

This definition confuses me once I consider actual distributions such as the normal distribution. Specifically, what are the events here? If we treat an event as any number on the real line, we then have a mapping $N: X \rightarrow X$, i.e. the event $x$ is assigned the value $x$ itself (with the image regarded as a space $X'$). Hence the infimum of a normal r.v. would be a value smaller than everything in $X' = X$, which is $-\infty$?

Best Answer

A random variable is a function, but it's a function from some sample space ($\Omega$) to the real number line. The sample space must also be equipped with a probability measure $(\mathbb P)$.
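Spelled out in symbols (a standard formulation, only implicit in the answer above):
$$X \colon \Omega \to \mathbb R, \qquad \omega \mapsto X(\omega), \qquad \text{with } \mathbb P \text{ a probability measure on } \Omega,$$
and the quantity the question contemplates is $\inf\{X(\omega) : \omega \in \Omega\}$, the infimum of the range of $X$.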

The typical use of $\inf$ in probability contexts involves some indexed collection of random variables, not just a single random variable, and the infimum is taken over the index set rather than over the sample space $\Omega$. In this way, $\inf_i\{X_i\}$ is usually a random variable in its own right.

Example: consider a sequence of standard die rolls $X_1, X_2, X_3$, and set $Z = \inf\{X_1, X_2, X_3\}$. If we rolled a 3, 5, and a 4 -- that is, on the event $\{\omega : X_1(\omega) = 3, X_2(\omega) = 5, X_3(\omega) = 4\}$ -- we would have $Z(\omega) = 3$. Similarly, if $X_1 = X_2 = 6$ and $X_3 = 1$, we would have $Z = 1$.
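As a concrete illustration, here is a minimal Python sketch (not part of the original answer; the function names are made up for this example). Each outcome $\omega$ is a triple of rolls, and $Z$ is computed pointwise as the minimum, which equals the infimum since the set is finite:

```python
import random

def roll_die():
    """One roll of a fair six-sided die."""
    return random.randint(1, 6)

def sample_outcome():
    """Draw one outcome omega, i.e. a realization of (X1, X2, X3)."""
    return (roll_die(), roll_die(), roll_die())

def Z(omega):
    """Z(omega) = inf{X1(omega), X2(omega), X3(omega)}."""
    return min(omega)

# The outcomes from the answer: (3, 5, 4) gives Z = 3, and (6, 6, 1) gives Z = 1.
print(Z((3, 5, 4)))  # 3
print(Z((6, 6, 1)))  # 1

# Z is itself a random variable: sampling omega and applying Z samples its distribution.
samples = [Z(sample_outcome()) for _ in range(10_000)]
print(sum(samples) / len(samples))  # Monte Carlo estimate of E[Z]
```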

The main reason it's not typically meaningful to take $\inf$ over the domain $\Omega$ (or, equivalently, over the range of the random variable itself) is that $\Omega$ is rarely defined explicitly at all. Let's use a die-roll example: our sample space will be $\Omega = \{A, B, C, D, E, F, G\}$, our probability measure will be $\mathbb P(A) = \mathbb P(B) = \dots = \mathbb P(F) = 1/6$ and $\mathbb P(G) = 0$, and the random variable $X$ will map $A \to 1, B \to 2, C \to 3, D \to 4, E \to 5, F \to 6,$ and $G \to -100$. I claim that this perfectly models a fair 6-sided die roll, because the numbers 1 through 6 all have the appropriate probabilities.

If you try to use the infimum in the way that you're describing, the only reasonable approach will be something like the infimum of the range of $X$ -- but in this case, that would apparently be $-100$, unless you go out of your way to exclude measure-$0$ sets from $\Omega$. This definition would become immediately problematic when you include things like continuous random variables, where the domain $\Omega$ could be (for instance) the interval $(0, 1)$ with an appropriate mapping $X$, and every single point in the interval has measure 0. We almost never think too hard about a random variable as actually being a function in this way.
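To see the $-100$ issue concretely, here is a hedged Python sketch of the exact toy model above (the dictionary names are my own): the naive minimum over the range of $X$ picks up the measure-zero outcome $G$, while restricting to outcomes of positive probability recovers $1$, which is what an "essential infimum" would give.

```python
# Toy model from the answer: a fair die with one extra zero-probability outcome G.
Omega = ["A", "B", "C", "D", "E", "F", "G"]
P = {"A": 1/6, "B": 1/6, "C": 1/6, "D": 1/6, "E": 1/6, "F": 1/6, "G": 0.0}
X = {"A": 1, "B": 2, "C": 3, "D": 4, "E": 5, "F": 6, "G": -100}

# Naive infimum over the whole range of X: picks up the measure-zero outcome G.
naive_inf = min(X[w] for w in Omega)
print(naive_inf)  # -100

# Infimum over outcomes with positive probability (an "essential" infimum here):
ess_inf = min(X[w] for w in Omega if P[w] > 0)
print(ess_inf)  # 1
```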
