[Math] Conditional expectation on more than one sigma-algebra

conditional-probability, measure-theory, probability-theory, random-variables

I'm facing the following issue. Let $X$ be an integrable random variable on the probability space $(\Omega,\mathcal{F},\mathbb{P})$ and $\mathcal{G},\mathcal{H} \subseteq \mathcal{F}$ be two sigma-algebras. We assume that $X$ is independent of $\mathcal{G}$, i.e. $\sigma(X)$ is independent of $\mathcal{G}$. Can I say (and can I prove) that $$ E(X \mid \sigma(\mathcal{G} \cup \mathcal{H})) = E(X\mid \mathcal{H}) ?$$

Thank you very much for your help!

Best Answer

If you're willing to assume the stronger condition that $\sigma(\sigma(X)\cup \mathcal{H})$ is independent of $\mathcal{G}$ (recall that $X$ is integrable), then the assertion is indeed true. We need to show that $$ E[X\mid \sigma(\mathcal{G}\cup\mathcal{H})]=E[X\mid\mathcal{H}], $$ i.e. that $E[X\mid\mathcal{H}]$ can serve as a version of the conditional expectation of $X$ given $\sigma(\mathcal{G}\cup\mathcal{H})$. This amounts to checking that

  • $E[X\mid\mathcal{H}]$ is $\sigma(\mathcal{G}\cup\mathcal{H})$-measurable,
  • $E[X\mid\mathcal{H}]$ is integrable,
  • for all $A\in\sigma(\mathcal{G}\cup\mathcal{H})$: $$\int_A E[X\mid\mathcal{H}]\,\mathrm dP=\int_A X\,\mathrm dP.$$

The first two are clear: $E[X\mid\mathcal{H}]$ is $\mathcal{H}$-measurable and $\mathcal{H}\subseteq\sigma(\mathcal{G}\cup\mathcal{H})$, and integrability follows from $E\big[\,|E[X\mid\mathcal{H}]|\,\big]\le E[|X|]$. For the third, note that by linearity we may assume $X$ to be non-negative (split $X=X^+-X^-$). Then $$ \sigma(\mathcal{G}\cup\mathcal{H})\ni A\mapsto \int_AE[X\mid\mathcal{H}]\,\mathrm dP $$ and $$ \sigma(\mathcal{G}\cup\mathcal{H})\ni A\mapsto \int_A X\,\mathrm dP $$ are two finite measures on $\sigma(\mathcal{G}\cup\mathcal{H})$ with the same total mass $E[X]$. By the uniqueness theorem for finite measures, it is therefore enough to show that the two measures agree on some $\cap$-stable generator of $\sigma(\mathcal{G}\cup\mathcal{H})$. Here we use that $$ \{A\cap B\mid A\in\mathcal{G},\,B\in\mathcal{H}\} $$ is indeed a $\cap$-stable generator of $\sigma(\mathcal{G}\cup\mathcal{H})$ (why?), and hence it suffices to show that $$ \int_{A\cap B} E[X\mid \mathcal{H}]\,\mathrm dP=\int_{A\cap B} X\,\mathrm dP,\quad A\in\mathcal{G},\,B\in\mathcal{H}. $$ Try to show this using the independence assumption; it should become clear that we in fact need the stronger independence assumption.
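For reference, here is one way the verification can go; it also makes visible exactly where the stronger assumption enters. For $A\in\mathcal{G}$ and $B\in\mathcal{H}$, both $X\mathbf{1}_B$ and $E[X\mid\mathcal{H}]\mathbf{1}_B$ are $\sigma(\sigma(X)\cup\mathcal{H})$-measurable, hence independent of $\mathbf{1}_A$ by the stronger assumption. Therefore $$ \int_{A\cap B}X\,\mathrm dP=E[\mathbf{1}_A\,X\mathbf{1}_B]=P(A)\,E[X\mathbf{1}_B] $$ and $$ \int_{A\cap B}E[X\mid\mathcal{H}]\,\mathrm dP=E\big[\mathbf{1}_A\,E[X\mid\mathcal{H}]\mathbf{1}_B\big]=P(A)\,E\big[E[X\mid\mathcal{H}]\mathbf{1}_B\big]=P(A)\,E[X\mathbf{1}_B], $$ where the last equality is the defining property of $E[X\mid\mathcal{H}]$ applied to $B\in\mathcal{H}$. Note that factorizing $E[\mathbf{1}_A\,X\mathbf{1}_B]$ requires $\mathbf{1}_A$ to be independent of the pair $(X,\mathbf{1}_B)$, not just of $X$; this is why independence of $\sigma(X)$ and $\mathcal{G}$ alone does not suffice.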

A counterexample showing that we indeed need the stronger assumption is the following: Let $U$ and $V$ be i.i.d. symmetric Bernoulli variables (i.e. $P(U=-1)=P(U=1)=\tfrac12$), let $\mathcal{G}=\sigma(U)$, $\mathcal{H}=\sigma(V)$ and $X=UV$. One can show that $X$ and $U$ are independent by checking that $$ P(X=a,U=b)=P(X=a)P(U=b) $$ for every combination of $a,b\in \{-1,1\}$. But $\sigma(\sigma(X)\cup\sigma(V))$ is not independent of $\sigma(U)$, since for example $$ P(X=1,V=1,U=1)\neq P(X=1,V=1)P(U=1), $$ so the stronger independence assumption fails. In this case, $$ E[X\mid \sigma(\mathcal{G}\cup\mathcal{H})]=UV\neq 0=E[U]\,V=E[X\mid\mathcal{H}]. $$
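For completeness, the computations can be spelled out as follows. The independence check reduces to four equalities of the form $$ P(X=1,U=1)=P(U=1,V=1)=\tfrac14=P(X=1)P(U=1), $$ using that $X=UV$ is itself a symmetric Bernoulli variable. The final identity follows from $$ E[X\mid\mathcal{H}]=E[UV\mid\sigma(V)]=V\,E[U]=0, $$ since $V$ is $\sigma(V)$-measurable and $U$ is independent of $V$, while $E[X\mid\sigma(\mathcal{G}\cup\mathcal{H})]=UV$ because $UV$ is $\sigma(\mathcal{G}\cup\mathcal{H})$-measurable.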
