Prove $\mathbb{E}(X\mid Y,Z) = \mathbb{E}(X\mid Y)$ a.s. given that $Z$ is independent of $Y$ and of $X$.

conditional-expectation, probability-theory

I am stuck trying to prove the following statement involving conditional expectation of random variables:

Suppose $Z$ is independent of $Y$ and of $X$, and $\mathbb{E}|X|<\infty$. Then $\mathbb{E}(X\mid Y,Z) = \mathbb{E}(X\mid Y)$ a.s.

Here's my attempt: by definition of conditional expectation, it suffices to show the following two conditions:

  1. $\int_A \mathbb{E}(X\mid Y,Z)\,dP=\int_A X\,dP$ for all $A\in \sigma(Y)$.
  2. $\mathbb{E}(X\mid Y,Z)$ is $\sigma(Y)$-measurable.

The first condition is easy: every $A\in \sigma(Y)$ also belongs to $\sigma(Y,Z)$, so the defining property of $\mathbb{E}(X\mid Y,Z)$ immediately gives $\int_A \mathbb{E}(X\mid Y,Z)\,dP=\int_A X\,dP$.
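
As a sanity check of this step, here is a minimal sketch on a finite sample space (three independent fair coins; `cond_exp` is an ad-hoc helper I am defining here, not a library routine). In this toy model the claim itself happens to hold, so the point is only to watch condition 1 hold for every $A \in \sigma(Y)$:

```python
from itertools import product
from fractions import Fraction

# Finite model: three independent fair coin tosses, uniform measure.
outcomes = list(product([0, 1], repeat=3))
prob = Fraction(1, 8)

X = lambda w: w[0]
Y = lambda w: w[1]
Z = lambda w: w[2]

def cond_exp(X, gens):
    """E(X | sigma(gens)) under the uniform measure: average X over
    the atom of outcomes agreeing with w0 on every generator."""
    def h(w0):
        atom = [w for w in outcomes if all(g(w) == g(w0) for g in gens)]
        return Fraction(sum(X(w) for w in atom), len(atom))
    return h

EXYZ = cond_exp(X, [Y, Z])

# sigma(Y) has four events: {}, {Y=0}, {Y=1}, and the whole space.
for y_vals in [set(), {0}, {1}, {0, 1}]:
    A = [w for w in outcomes if Y(w) in y_vals]
    assert sum(EXYZ(w) * prob for w in A) == sum(X(w) * prob for w in A)
print("condition 1 holds for every A in sigma(Y)")
```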

However, for the second condition, I have little clue as to what to do. In particular, I haven't figured out a way to use the independence assumption to prove measurability. I would appreciate thoughts on whether this is the right approach.

Best Answer

This is false. Let $A,B,C$ be pairwise independent but not jointly independent, and take $X=I_A$, $Y=I_B$, and $Z=I_C$. The integral of the LHS over $Y^{-1}(\{1\}) \cap Z^{-1}(\{1\})$ is $P(A\cap B \cap C)$, by the defining property of $\mathbb{E}(X\mid Y,Z)$. On the other hand, since $A$ and $B$ are independent, the RHS is $\mathbb{E}(I_A\mid I_B)=P(A)$ a.s., so the integral of the RHS over $Y^{-1}(\{1\}) \cap Z^{-1}(\{1\})$ is $P(A)\,P(B \cap C)=P(A)P(B)P(C)$. Since the pairwise product identities hold but joint independence fails, it must be that $P(A\cap B\cap C)\neq P(A)P(B)P(C)$, so the two integrals disagree. A concrete instance is Bernstein's example: toss two fair coins and let $A$ = "first toss is heads", $B$ = "second toss is heads", $C$ = "the two tosses agree"; then $P(A\cap B\cap C)=1/4$ while $P(A)P(B)P(C)=1/8$.
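
For concreteness, here is a short numerical check of Bernstein's example (a sketch; the predicate-based helper `P` and the variable names are ad hoc, not from any library):

```python
from itertools import product
from fractions import Fraction

# Bernstein's example: two independent fair coin tosses,
# four equally likely outcomes (first toss, second toss).
outcomes = list(product([0, 1], repeat=2))
prob = Fraction(1, 4)

# A = first toss heads, B = second toss heads, C = tosses agree.
A = lambda w: w[0] == 1
B = lambda w: w[1] == 1
C = lambda w: w[0] == w[1]

def P(event):
    """Probability of an event given as a predicate on outcomes."""
    return sum(prob for w in outcomes if event(w))

# Pairwise independence holds ...
assert P(lambda w: A(w) and B(w)) == P(A) * P(B)
assert P(lambda w: A(w) and C(w)) == P(A) * P(C)
assert P(lambda w: B(w) and C(w)) == P(B) * P(C)

# ... but with X = I_A, Y = I_B, Z = I_C, the two sides integrate
# differently over {Y = 1, Z = 1} = B n C:
lhs = P(lambda w: A(w) and B(w) and C(w))    # integral of LHS = P(A n B n C)
rhs = P(A) * P(lambda w: B(w) and C(w))      # integral of RHS = P(A) P(B n C)
print(lhs, rhs)  # 1/4 1/8
```

The two printed values are the integrals of the LHS and RHS over $\{Y=1, Z=1\}$; since they differ, $\mathbb{E}(X\mid Y,Z)\neq\mathbb{E}(X\mid Y)$ on a set of positive probability.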