Prove that $\mathbb{E}[ \varphi(X, Y) | \mathcal{G}] = \psi(Y)$ where $\psi(y) := \mathbb{E}[ \varphi(X, y)]$

conditional-expectation, measure-theory, probability-theory

$\newcommand{\diff}{\mathrm d}$

I'm trying to prove Proposition 12.4 (given without proof) in this note.

Let $\mathcal{G}$ be a sub-$\sigma$-field of $\mathcal{F}$ and $X, Y$ two random variables such that $X$ is independent of $\mathcal{G}$ and $Y$ is $\mathcal{G}$-measurable. Let $\varphi: \mathbb{R}^2 \rightarrow \mathbb{R}$ be Borel-measurable such that $\mathbb{E}[|\varphi(X, Y)|] < \infty$. Then
$$
\mathbb{E}[ \varphi(X, Y) | \mathcal{G}] = \psi(Y) \quad \text { a.s.} \quad \text{where} \quad \psi(y) := \mathbb{E}[ \varphi(X, y)].
$$

In the attempt below, I'm stuck at showing
$$
\int_\mathbb R \left[ \int_\mathbb R \varphi (x,y) \diff \color{blue}{\mu'}(x) \right ] \diff \nu'(y) = \int_\mathbb R \left[ \int_\mathbb R \varphi (x,y) \diff \color{blue}{\mu}(x) \right ] \diff \nu'(y).
$$

Could you elaborate on how to finish the proof?


Proof. Let $\mu, \nu$ be the distributions under $\mathbb P$ of $X, Y$ respectively. Since $X$ is independent of $\mathcal G$ and $Y$ is $\mathcal G$-measurable, $X$ and $Y$ are independent, and thus the distribution $\lambda$ of $(X, Y)$ is the product measure of $\mu$ and $\nu$, i.e., $\lambda = \mu \otimes \nu$. By Fubini's theorem, $\varphi(X, y)$ is integrable for $\nu$-a.e. $y\in \mathbb R$ and the map $y \mapsto \mathbb{E}[ \varphi(X, y)]$ is Borel. Clearly, $\psi(Y)$ is $\mathcal G$-measurable.

Fix $A \in \mathcal G$. Let's prove that
$$
\int_A \varphi (X, Y) \diff \mathbb P = \int_A \psi(Y) \diff \mathbb P.
$$

Let $\mathcal G'$ be the trace $\sigma$-algebra of $\mathcal G$ on $A$, i.e., $\mathcal G' = \{B \cap A : B \in \mathcal G\}$, and let $\mathbb P'$ be the restriction of $\mathbb P$ to $\mathcal G'$. We now consider integration w.r.t. $(A, \mathcal G', \mathbb P')$. Clearly,
$$
\int_A \varphi(X, Y) \diff \mathbb P = \int_A \varphi(X, Y) \diff \mathbb P'
\quad \text{and} \quad
\int_A \psi(Y) \diff \mathbb P = \int_A \psi(Y) \diff \mathbb P'.
$$

Notice that $X,Y$ are still independent under $(A, \mathcal G', \mathbb P')$. Let $\mu', \nu'$ be the distributions under $\mathbb P'$ of $X, Y$ respectively. Then the distribution of $(X, Y)$ under $\mathbb P'$ is $\lambda' :=\mu' \otimes \nu'$. By the change of variables formula and Fubini's theorem,
$$
\int_A \varphi (X, Y) \diff \mathbb P' = \int_{\mathbb R^2} \varphi (x,y) \diff \lambda'(x, y) = \int_\mathbb R \int_\mathbb R \varphi (x,y) \diff \mu'(x) \diff \nu'(y).
$$

By the change of variables formula,
$$
\int_A \psi(Y) \diff \mathbb P' = \int_\mathbb R \psi(y) \diff \nu'(y) = \int_\mathbb R \left[ \int_\Omega \varphi (X,y) \diff \mathbb P \right ] \diff \nu'(y) = \int_\mathbb R \left[ \int_\mathbb R \varphi (x,y) \diff \mu(x) \right ] \diff \nu'(y).
$$

Best Answer

For any $A\in \mathcal G$ we show that $E[1_A \varphi(X,Y)]=E[1_A \psi(Y)]$. This will establish that $E[\varphi(X,Y)|\mathcal G]=\psi(Y)$ a.s.
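Before the formal argument, here is a quick Monte Carlo sanity check of this reduction (an illustration only, with hypothetical choices $\varphi(x,y) = xy + \sin y$, $X \sim N(0,1)$, $Y \sim U(0,1)$ independent, $\mathcal G = \sigma(Y)$, and the $\mathcal G$-measurable event $A = \{Y \le 1/2\}$):

```python
import numpy as np

# Hypothetical sanity check (not a proof): take phi(x, y) = x*y + sin(y),
# X ~ N(0, 1) independent of G = sigma(Y), Y ~ Uniform(0, 1).
# Then psi(y) = E[phi(X, y)] = y*E[X] + sin(y) = sin(y), since E[X] = 0.
rng = np.random.default_rng(0)
n = 10**6
X = rng.standard_normal(n)
Y = rng.uniform(0.0, 1.0, n)

phi_XY = X * Y + np.sin(Y)
psi_Y = np.sin(Y)

# The identity to verify: E[1_A phi(X, Y)] = E[1_A psi(Y)]
# for the G-measurable event A = {Y <= 1/2}.
A = Y <= 0.5
lhs = (A * phi_XY).mean()
rhs = (A * psi_Y).mean()
print(abs(lhs - rhs))  # small, of the order of the Monte Carlo error
```

Of course this checks the defining identity only for one event $A$ and one $\varphi$; the proof below handles the general case.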

Let $Z:(\Omega,\mathcal F)\to (\Omega,\mathcal G)$, $\omega\mapsto \omega$ and note that $\mathcal G = \sigma(Z)$. Since $Y$ is $\mathcal G$-measurable, by the Doob–Dynkin lemma, there is some measurable $g:(\Omega,\mathcal G)\to (\mathbb R, \mathcal B(\mathbb R))$ such that $Y = g(Z)$.

Note that $A=Z^{-1}(A)$ thus

$$\begin{align}
E[1_A \psi(Y)] &= E[1_{A}(Z) \psi(g(Z))] \\
&= \int_\Omega 1_A(z) \Big(\int_{\mathbb R} \varphi\big(x,g(z)\big) \,dP_X(x) \Big)\,dP_Z(z) \tag{1}\\
&= \int_{\mathbb R \times \Omega} 1_A(z) \varphi\big(x,g(z)\big) \,d(P_{X}\otimes P_{Z})(x,z) \tag{2}\\
&= \int_{\mathbb R \times \Omega} 1_A(z) \varphi\big(x,g(z)\big) \,dP_{(X,Z)}(x,z) \tag{3}\\
&= E[1_A(Z)\varphi\big(X,g(Z) \big)] \tag{4}\\
&= E[1_A\varphi\big(X,Y \big)]
\end{align}$$

$(1)$: Law of the unconscious statistician (i.e., integration w.r.t. the pushforward measure) and the definition $\psi:y\mapsto \int_{\mathbb R} \varphi(x,y) \,dP_X(x)$.
$(2)$: Fubini's theorem.
$(3)$: $X$ is independent of $\mathcal G$, hence $X$ and $Z$ are independent, so $P_{(X,Z)} = P_X \otimes P_Z$.
$(4)$: Law of the unconscious statistician (integration w.r.t. the pushforward measure).
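As a sanity check on the result (an added illustration, not part of the original answer), specializing to $\varphi(x,y) = xy$ with $\mathbb E[|XY|] < \infty$ recovers the familiar "taking out what is known" rule. Here
$$
\psi(y) = \mathbb E[\varphi(X,y)] = \mathbb E[Xy] = y\,\mathbb E[X],
$$
so the proposition gives
$$
\mathbb E[XY \mid \mathcal G] = \psi(Y) = Y\,\mathbb E[X] \quad \text{a.s.},
$$
which agrees with the standard computation $\mathbb E[XY \mid \mathcal G] = Y\,\mathbb E[X \mid \mathcal G] = Y\,\mathbb E[X]$, obtained by pulling out the $\mathcal G$-measurable factor $Y$ and then using independence of $X$ from $\mathcal G$.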
