Why is each component of a random vector a random variable

probability, random variables

I am reading A Concise Introduction to Mathematical Statistics by Dragi Anevski, and in chapter 5 on random vectors he proves that each component of a random vector is a random variable. Unfortunately, I need some help comprehending the author's proof:

"Suppose that $(\Omega, F, P)$ is a probability space and $X : \Omega \to \mathbb{R}^n$ is a random vector. We assume $(X_1,…,X_n)$ is a random vector and want to show that for any $1 \leq i \leq n$, $X_i$is a random variable. This follows since for every $x \in \mathbb{R}$, $$\{\omega : X_i(\omega) \leq x\} = \{\omega : X_i(\omega) \leq x\} \cap \Omega = \{\omega : X_i(\omega) \leq x\} \cap (\cap_{j \neq i}\{\omega : X_j(\omega) \leq \infty\}) = \{\omega : X_i(\omega) \leq x,\cap_{j \neq i}\{\omega : X_j(\omega) \leq \infty\}\}$$. And since this is in the sigma algebra, $F$, $X_i$ is a random variable.".

What I don't understand is the reformulation $\Omega = \bigcap_{j \neq i}\{\omega : X_j(\omega) \leq \infty\}$. Are you supposed to read it as "the outcome space is equal to the intersection of all the events that each $X_j$, for $j \in \{1,\dots,n\}$ with $j \neq i$, gives rise to as $x \to \infty$"? Why is it an intersection and not a union?

I have not yet read about measure theory, and looking at other answers it seems measure theory is a prerequisite.

I also don't understand the conclusion: why is $$\{\omega : X_i(\omega) \leq x,\ X_j(\omega) \leq \infty \text{ for all } j \neq i\} \in F?$$

I tried looking for an answer on the internet and on this website, but I didn't find any other proof of this, and I have been searching for over two hours now.

Thank you in advance,
Isak

Best Answer

A correct proof is as follows: $$ X_i(\omega) \le x_i \Leftrightarrow (X_1(\omega), \dots, X_n(\omega)) \in \{(t_1, \dots, t_n) : t_i \le x_i\} =: B. $$ Therefore, $$ \{\omega : X_i(\omega) \le x_i\} = \{\omega : (X_1(\omega), \dots, X_n(\omega)) \in B\}. $$ Since $B$ is a Borel set, the set above belongs to $F$. Contrary to what the book says, one has to be familiar with measure theory to understand this.

To be precise, the Borel $\sigma$-algebra is the smallest $\sigma$-algebra that contains all open sets. It is also the smallest one containing all closed sets. Since $B$ is closed, it is a Borel set.
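To see concretely why $B$ is closed, one can write it out as a product (a sketch, using the same $i$, $x_i$, and $n$ as above):

```latex
% B fixes only the i-th coordinate, so it factors as a product:
B = \mathbb{R}^{i-1} \times (-\infty, x_i] \times \mathbb{R}^{n-i}.
% Its complement is
\mathbb{R}^n \setminus B = \mathbb{R}^{i-1} \times (x_i, \infty) \times \mathbb{R}^{n-i},
% a product of open sets, hence open; therefore B is closed.
```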

Also, on Jul 25 at 12:19 you asked whether $A \subseteq \Omega$ and $\Omega \in F$ imply $A \in F$. The answer is no: $F$ need not contain every subset of $\Omega$, only those subsets to which probability is assigned. In this case $A$ happened to belong to $F$, but that had to be proved.
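A small made-up example illustrates this:

```latex
% Toy example: take \Omega = \{1,2,3,4\} and the sigma-algebra
F = \{\emptyset,\ \{1,2\},\ \{3,4\},\ \Omega\}.
% Then A = \{1\} satisfies A \subseteq \Omega and \Omega \in F,
% but A \notin F.
```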

In my opinion, this book is not suitable for a first course in probability; your university library might help you to find some complement.

DonĀ“t despair

Bengt

Sorry, the above pertains to an alternative definition mentioned in a footnote. The one used in the book is:

$(X_1, \dots, X_n)$ is a random vector if $\{\omega : X_j(\omega) \le x_j,\, j = 1, \dots, n \} \in F$ for all real numbers $x_1, \dots, x_n$. Now, the proof is $$ \{\omega : X_1(\omega) \le x_1\} \\ =\{\omega : X_1(\omega) \le x_1,\ X_j(\omega) < \infty, \,j = 2, \dots, n\}\\ = \bigcup_{k=1}^\infty \{\omega : X_1(\omega) \le x_1,\ X_j(\omega) \le k, \,j = 2, \dots,n\} \in F, $$ since all sets in the union belong to $F$, and $F$ is a sigma-algebra.
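The middle identity (replacing the condition $X_j < \infty$ by a countable union) can be checked by double inclusion; here is a sketch of both directions:

```latex
% Forward inclusion: if X_1(\omega) \le x_1 and X_j(\omega) < \infty for
% j = 2, \dots, n, pick an integer
k \ge \max_{2 \le j \le n} X_j(\omega);
% then \omega belongs to the k-th set of the union.
%
% Reverse inclusion: X_j(\omega) \le k implies X_j(\omega) < \infty, so
% every set in the union is contained in the left-hand side.
```

Countability matters here: a sigma-algebra is closed under countable unions, which is exactly why the union over integers $k$ stays in $F$.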

Bengt