Linear combination of a family of independent random variables

measure-theory, probability, probability-theory

(From Achim Klenke – "Probability Theory") The following theorem holds:

Let $K$ be an arbitrary set and let $I_k$, $k \in K$, be arbitrary, mutually disjoint index sets. Define $I=\bigcup_{k\in K} I_k$. If the family $(X_i)_{i\in I}$ is independent, then the family of $\sigma$-algebras $(\sigma(X_j, j \in I_k))_{k\in K}$ is independent.

Now consider the following example (also from the mentioned book):

If $(X_n)_{n\in\mathbb{N}}$ is an independent family of real random variables, then the family $(Y_n)_{n\in\mathbb{N}}=(X_{2n}-X_{2n-1})_{n\in\mathbb{N}}$ is independent as well. Indeed, for every $n\in \mathbb{N}$ the random variable $Y_n$ is $\sigma(X_{2n}, X_{2n-1})$-measurable. (This follows from the fact that a linear combination of measurable maps is again measurable.) It follows from the theorem stated above that $(\sigma(X_{2n}, X_{2n-1}))_{n\in\mathbb{N}}$ is independent.
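
(To spell out how the theorem is applied, as I read it: take $K=\mathbb{N}$ and the mutually disjoint index sets $I_n=\{2n-1,2n\}$, so the theorem yields independence of $(\sigma(X_{2n}, X_{2n-1}))_{n\in\mathbb{N}}$, and since

$$\sigma(Y_n) \subseteq \sigma(X_{2n}, X_{2n-1}) \quad \text{for every } n \in \mathbb{N},$$

the family $(Y_n)_{n\in\mathbb{N}}$ inherits this independence.)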

Now there are two issues: The book does not really define what $\sigma(X_j, j \in I)$ actually is. I assume it is the smallest $\sigma$-algebra with respect to which all the $X_j$, $j \in I$, are measurable; is this correct?

Second issue: If we know that the $Y_n$ are measurable, how can we conclude that each $Y_n$ is $\sigma(X_{2n}, X_{2n-1})$-measurable? It seems intuitive at first, but the more I think about it, the less intuitive it becomes.

And at last: Are there random variables $Z_j$ which are a function of independent random variables $X_i$ with $Z=f(X_1,X_2…)$ where $Z$ is measurable, but not $\sigma(X_1,X_2,…)$ measurable?

Best Answer

First of all: I guess Klenke defines random variables to be maps from some probability space $(\Omega, \mathcal{A}, P)$ into $(\mathbb{R},\mathcal{B}(\mathbb{R}))$, where $\mathcal{B}(\mathbb{R})$ denotes the Borel $\sigma$-algebra on $\mathbb{R}$.

  1. Exactly, $\sigma(X_j : j \in I)$ is the $\sigma$-algebra generated by the vector-valued map $$\Omega \ni \omega \mapsto (X_j(\omega))_{j \in I} \in \mathbb{R}^{I},$$ where $\mathbb{R}^{I}$ is equipped with the product $\sigma$-algebra $\mathcal{B}(\mathbb{R})^{\otimes I}$. This is precisely the smallest $\sigma$-algebra on $\Omega$ for which every $X_j$, $j \in I$, is measurable, so your guess is correct.
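
     To make this concrete, the generator description (standard, not a quote from the book) reads

     $$\sigma(X_j : j \in I) \;=\; \sigma\bigl(\{X_j^{-1}(B) : j \in I,\ B \in \mathcal{B}(\mathbb{R})\}\bigr),$$

     i.e. it is generated by all preimages of Borel sets under the maps $X_j$.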

  2. $X_{2n}$ and $X_{2n-1}$ are both $\sigma(X_{2n},X_{2n-1})$-$\mathcal{B}(\mathbb{R})$-measurable. Consequently $Y_n := X_{2n} - X_{2n-1}$ is $\sigma(X_{2n},X_{2n-1})$-$\mathcal{B}(\mathbb{R})$-measurable as well: as you said, a linear combination of real-valued functions that are measurable with respect to one and the same $\sigma$-algebra on $\Omega$ is again measurable with respect to that $\sigma$-algebra.
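
     To spell this out as a composition argument (a standard route, assuming real-valued random variables as above): write $Y_n = d \circ (X_{2n}, X_{2n-1})$ with $d(x,y) = x - y$. The pair map $(X_{2n}, X_{2n-1})$ is $\sigma(X_{2n},X_{2n-1})$-$\mathcal{B}(\mathbb{R}^2)$-measurable, $d$ is continuous and hence Borel measurable, and therefore

     $$Y_n^{-1}(B) = (X_{2n}, X_{2n-1})^{-1}\bigl(d^{-1}(B)\bigr) \in \sigma(X_{2n}, X_{2n-1}) \quad \text{for every } B \in \mathcal{B}(\mathbb{R}).$$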

  3. The last question is a bit confusing. I guess it should read:

    And at last: Are there random variables $Z$ which are a function of independent random variables $X_i$ with $Z=f(X_1,X_2,\dots)$ where $Z$ is $\mathcal{A}$-$\mathcal{B}(\mathbb{R})$-measurable, but not $\sigma(X_1,X_2,\dots)$-$\mathcal{B}(\mathbb{R})$-measurable?

That is a somewhat harder question. It is definitely not possible if $f$ is $\mathcal{B}(\mathbb{R})^{\otimes\mathbb{N}}$-$\mathcal{B}(\mathbb{R})$-measurable. The Doob–Dynkin lemma may help you.
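
To sketch the easy direction (a standard composition argument, not a quote from the book): write $X = (X_1, X_2, \dots)$; if $f$ is $\mathcal{B}(\mathbb{R})^{\otimes\mathbb{N}}$-$\mathcal{B}(\mathbb{R})$-measurable, then for every $B \in \mathcal{B}(\mathbb{R})$

$$Z^{-1}(B) = X^{-1}\bigl(f^{-1}(B)\bigr) \in \sigma(X_1, X_2, \dots),$$

so $Z$ is automatically $\sigma(X_1, X_2, \dots)$-measurable. The Doob–Dynkin lemma gives the converse: every $\sigma(X_1, X_2, \dots)$-$\mathcal{B}(\mathbb{R})$-measurable random variable $Z$ can be written as $Z = f(X_1, X_2, \dots)$ for some product-measurable $f$.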