[Math] conditional expectation given 2 random variables

conditional-expectation, measure-theory, probability, probability-theory

Suppose $A$ and $B$ are independent random variables, and let $X$ be another random variable.

Are there any properties that simplify $E(X | (A,B))$, the expectation of $X$ given $A$ and $B$?

In particular, do we have a "simplification" such as $E(X | (A,B)) = E( E(X|A) | B)$?

I think this is incorrect (take $X = AB$), but that seems strange, since intuitively it looks like it should hold. So I feel there must be some formula linking $E(X | (A,B))$ to the conditional expectations given a single variable.

Thanks!

Best Answer

Notice that the random variable $\mathbb{E}(X|A)$ is $\sigma(A)$-measurable, so, since $\sigma(A)$ and $\sigma(B)$ are independent, $$\mathbb{E}\left(\mathbb{E}\left(X|A\right)|B\right)=\mathbb{E}\left(\mathbb{E}\left(X|A\right)\right)=\mathbb{E}(X),$$ and hence the iterated conditional expectation is $\mathbb{P}$-a.e. constant.
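For a concrete check, take the example from the question, $X = AB$, with $A$ and $B$ independent and integrable: $$\mathbb{E}\left(\mathbb{E}(AB|A)\,|\,B\right)=\mathbb{E}\left(A\,\mathbb{E}(B)\,|\,B\right)=\mathbb{E}(B)\,\mathbb{E}(A|B)=\mathbb{E}(A)\,\mathbb{E}(B)=\mathbb{E}(AB),$$ where we used $\mathbb{E}(B|A)=\mathbb{E}(B)$ and $\mathbb{E}(A|B)=\mathbb{E}(A)$, both by independence. The iterated conditional expectation is indeed the constant $\mathbb{E}(X)$.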

On the other hand, for example, if $X$ is $\sigma(A,B)$-measurable and it is not $\mathbb{P}$-a.e. constant, then: $$\mathbb{E}\left(X|\sigma(A,B)\right)=X\neq\mathbb{E}(X).$$
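With the same example, $X = AB$ is $\sigma(A,B)$-measurable, so $$\mathbb{E}\left(AB\,|\,\sigma(A,B)\right)=AB,$$ which is not $\mathbb{P}$-a.e. constant whenever neither $A$ nor $B$ is, whereas the iterated expectation above collapsed to the constant $\mathbb{E}(A)\mathbb{E}(B)$.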

So, where did your intuition fail? When you compute $\mathbb{E}(X|A)$, you get the best prediction of $X$ knowing $A$. When you then compute $\mathbb{E}(\mathbb{E}(X|A)|B)$, you get the best prediction, knowing $B$, of [the best prediction of $X$ knowing $A$], and not the best prediction of $X$ knowing both $A$ and $B$. The first prediction has to be constant: it is a prediction based on information independent of the quantity being estimated, so that information is useless, and the best you can do is the prediction you would make knowing nothing at all, i.e. the expectation of the quantity. The second, $\mathbb{E}(X|(A,B))$, can very well be non-constant.
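If a numerical sanity check helps, here is a minimal simulation sketch, assuming (as an illustrative choice not fixed by the post) that $A$ and $B$ are independent $\mathrm{Uniform}(0,1)$ and $X = AB$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Illustrative choice: A, B independent Uniform(0, 1), X = A * B.
A = rng.uniform(size=n)
B = rng.uniform(size=n)
X = A * B

# Since A and B are independent, E(X | A) = A * E(B); E(B) is estimated here.
E_X_given_A = A * B.mean()

# Estimate E( E(X|A) | B ) and E(X | B) by averaging within bins of B.
# The former should be flat in B (the constant E(A)E(B) = 0.25), while the
# latter equals B * E(A) and varies with B, reflecting that E(X | (A,B)) = X
# genuinely depends on the pair (A, B).
edges = np.linspace(0.0, 1.0, 11)
idx = np.digitize(B, edges) - 1
for k in range(10):
    mask = idx == k
    print(f"B in [{edges[k]:.1f}, {edges[k+1]:.1f}): "
          f"E(E(X|A)|B) ~ {E_X_given_A[mask].mean():.3f}   "
          f"E(X|B) ~ {X[mask].mean():.3f}")
```

The first column stays near $0.25 = \mathbb{E}(A)\mathbb{E}(B)$ in every bin, while the second grows roughly linearly with $B$.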