Properties of sigma-algebras generated by random variables

measure-theory, probability, random-variables

Let $X_1,\dots,X_n$ be random variables. The sigma-algebra generated by $X_1$ is $\sigma(X_1):=\{X_1^{-1}(A) \mid A \in \mathcal B(\Bbb R)\}$.

I would like to know what properties this notion has; I can't find many results on the internet. I know that

  • $\sigma(X_i)\subseteq \sigma(X_1,\dots,X_n)$
  • $\sigma(X_i+X_j)\subseteq \sigma(X_i,X_j)$ (I have never found a proof of this)
  • $\sigma(X_i)$ is not necessarily included in $\sigma(X_i+X_j)$
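Since the question notes that no proof of the second bullet was found, here is a short one, together with a counterexample for the third bullet. The key point is that the sum map $s(x,y)=x+y$ is continuous, hence Borel measurable:

```latex
% Proof of \sigma(X_i+X_j) \subseteq \sigma(X_i,X_j):
% since s(x,y) = x + y is continuous, s^{-1}(A) \in \mathcal B(\Bbb R^2)
% for every A \in \mathcal B(\Bbb R), so
(X_i+X_j)^{-1}(A) \;=\; (X_i,X_j)^{-1}\bigl(s^{-1}(A)\bigr) \;\in\; \sigma(X_i,X_j).

% Counterexample for the third bullet: take X_j = -X_i with X_i
% non-degenerate. Then X_i + X_j = 0, so
\sigma(X_i+X_j) = \{\varnothing,\Omega\} \not\supseteq \sigma(X_i).
```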

What other properties do we have? Is $\sigma\left(\sum_{i=1}^n X_i\right)\subseteq \sigma(X_1,\dots,X_n)$? If $X,Y$ are independent, do we have any nice properties? I cannot find any documentation about this.

Best Answer

A very useful (if not the most useful) characterization of the $\sigma$-algebra generated by random variables is the following theorem, paraphrased from Theorem 20.1 in Billingsley's Probability and Measure:

Theorem 20.1 A random variable $Y$ is measurable w.r.t. $\sigma(X_1,\dots,X_n)$ iff there exists a measurable function $f:\mathbb{R}^n\to\mathbb{R}$ such that $Y(\omega)=f(X_1(\omega),\dots,X_n(\omega))$ for all $\omega$.
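As a quick illustration (my own example, not from Billingsley), the theorem immediately shows that the maximum of the $X_i$ is measurable with respect to the generated $\sigma$-algebra:

```latex
% Y = \max(X_1,\dots,X_n) is \sigma(X_1,\dots,X_n)-measurable, since
% f(x_1,\dots,x_n) = \max_{1\le i\le n} x_i is continuous, hence Borel:
Y(\omega) = f\bigl(X_1(\omega),\dots,X_n(\omega)\bigr),
\qquad f(x_1,\dots,x_n) = \max_{1\le i\le n} x_i .
```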

So, to answer your question, $\sigma\left(\sum_{i=1}^nX_i\right)\subseteq \sigma(X_1,\dots,X_n)$, since $(x_1,\dots,x_n)\mapsto\sum_{i=1}^n x_i$ is continuous, hence Borel measurable. In fact, the above characterization lets you quickly determine, in many cases, whether a random variable is measurable with respect to a generated $\sigma$-algebra.

There are too many different things one can say about $\sigma$-algebras generated by independent random variables. You may take a look at $\S$4, $\S$14, $\S$20 in Billingsley, or you can rephrase your question to be more specific.
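One basic fact worth stating, since it is the standard way independence is phrased at the level of $\sigma$-algebras (see Billingsley $\S$4): if $X$ and $Y$ are independent, then the generated $\sigma$-algebras $\sigma(X)$ and $\sigma(Y)$ are independent, i.e.

```latex
% Independence of X and Y is equivalent to independence of the
% generated \sigma-algebras: every event determined by X is
% independent of every event determined by Y.
P(A \cap B) = P(A)\,P(B)
\qquad \text{for all } A \in \sigma(X),\; B \in \sigma(Y).
```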