[Math] Random variable independent of $\sigma$-algebra and conditional expectation

measure-theory, probability-theory

What does it mean to say that a random variable is independent of a $\sigma$-algebra, and why does this then imply that $E(RV \mid \sigma) = RV$? I have no clue what this independence business is about (I couldn't find it using Google either), and surely $E(RV \mid \sigma) = RV$ seems like the logical definition of it?

Best Answer

A random variable $X$ is independent of a $\sigma$-algebra $\mathcal{G}$ if the $\sigma$-algebras $\sigma(X)$ and $\mathcal{G}$ are independent. Recall that $\sigma(X)$ consists of all sets of the form $\{X\in B\}$, where $B$ is a Borel subset of $\mathbb{R}$.
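Concretely, independence of two $\sigma$-algebras means that every pair of events, one drawn from each, is independent in the elementary sense:

```latex
% Independence of sigma-algebras: sigma(X) and G are independent iff
% every event determined by X is independent of every event in G.
\mathbb{P}(A \cap B) = \mathbb{P}(A)\,\mathbb{P}(B)
\qquad \text{for all } A \in \sigma(X),\ B \in \mathcal{G}.
```

Equivalently, $\mathbb{P}(\{X \in C\} \cap B) = \mathbb{P}(X \in C)\,\mathbb{P}(B)$ for every Borel set $C$ and every $B \in \mathcal{G}$.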

If $X$ has finite expectation and is independent of $\mathcal{G}$, then in fact $$ \mathbb{E}[X|\mathcal{G}]=\mathbb{E}[X]. $$ On the other hand, if $X$ is $\mathcal{G}$-measurable (and has finite expectation), then $$ \mathbb{E}[X|\mathcal{G}]=X. $$
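The first identity can be checked directly against the defining property of conditional expectation: the constant $\mathbb{E}[X]$ is trivially $\mathcal{G}$-measurable, and for every $G \in \mathcal{G}$ its integral over $G$ matches that of $X$:

```latex
% Verification that E[X | G] = E[X] when X is independent of G.
% X is independent of the indicator 1_G (since {1_G = 1} = G lies in G), so:
\int_G X \, d\mathbb{P}
  = \mathbb{E}\bigl[ X \, \mathbf{1}_G \bigr]
  = \mathbb{E}[X] \, \mathbb{E}[\mathbf{1}_G]   % factorization by independence
  = \int_G \mathbb{E}[X] \, d\mathbb{P}.
% Hence the constant E[X] satisfies both conditions defining E[X | G].
```

By the (a.s.) uniqueness of conditional expectation, $\mathbb{E}[X|\mathcal{G}] = \mathbb{E}[X]$.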

This fits the intuitive description of conditional expectation as a best estimate of $X$ given the information contained in $\mathcal{G}$: if $\mathcal{G}$ carries no information about $X$ (independence), the best estimate is the plain average $\mathbb{E}[X]$, whereas if $\mathcal{G}$ already determines $X$ (measurability), the best estimate is $X$ itself.