Joint Distribution and Regular Conditional Probability Distribution (Durrett 4.1.12)

Tags: conditional-probability, conditional-expectation, probability-distributions, probability-theory

This is Exercise 4.1.12 of Durrett ($3^{\text{rd}}$ edition), which states the following:

Suppose $X$ and $Y$ have a joint density $f(x,y)>0$. Let $$\mu(y, A)=\dfrac{\int_{A}f(x,y)\,dx}{\int f(x,y)\,dx}.$$ Show that $\mu(Y(\omega), A)$ is a regular conditional distribution for $X$ given $\sigma(Y)$.
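As a concrete illustration (my own example, not part of the exercise), the formula can be tried out numerically. The sketch below assumes the specific joint density $f(x,y)=(1+y)e^{-x(1+y)}e^{-y}$ on $(0,\infty)^2$, i.e. $Y\sim\text{Exp}(1)$ and $X\mid Y=y\sim\text{Exp}(1+y)$, so that $\mu(y,(a,b))$ has the closed form $e^{-(1+y)a}-e^{-(1+y)b}$ to compare against (this density is positive only on a quadrant rather than on all of $\mathbb{R}^2$, but the formula is unaffected):

```python
# Numerical sketch of mu(y, A) = (int_A f(x, y) dx) / (int f(x, y) dx).
# Assumed example density (my choice, not from Durrett):
# f(x, y) = (1+y) exp(-x(1+y)) exp(-y) for x, y > 0,
# i.e. Y ~ Exp(1) and X | Y = y ~ Exp(1+y).
import numpy as np
from scipy.integrate import quad

def f(x, y):
    return (1 + y) * np.exp(-x * (1 + y)) * np.exp(-y)

def mu(y, a, b):
    """mu(y, A) for the interval A = (a, b), by numerical integration."""
    num, _ = quad(lambda x: f(x, y), a, b)
    den, _ = quad(lambda x: f(x, y), 0, np.inf)
    return num / den

y, a, b = 2.0, 0.5, 1.5
exact = np.exp(-(1 + y) * a) - np.exp(-(1 + y) * b)  # closed-form mu(y, (a, b))
print(mu(y, a, b), exact)  # the two values should agree to quad's tolerance
```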

Durrett gives the definition of a regular conditional distribution as follows:

Let $(\Omega,\mathcal{F},\mathbb{P})$ be a probability space, $X:(\Omega,\mathcal{F})\longrightarrow (S,\mathcal{S})$ a measurable map, and $\mathcal{G}\subset\mathcal{F}$ a $\sigma$-algebra. Then $\mu:\Omega\times\mathcal{S}\longrightarrow [0,1]$ is said to be a regular conditional distribution for $X$ given $\mathcal{G}$ if

(1) For each $A\in\mathcal{S}$, $\omega\mapsto\mu(\omega, A)$ is a version of $\mathbb{P}(X\in A\mid\mathcal{G})$.

(2) For a.e. $\omega$, $A\mapsto\mu(\omega, A)$ is a probability measure on $(S,\mathcal{S})$.

I made an attempt at showing the first point but got stuck.

Below is my attempt:

For each fixed $A$, write $g(y):=\mu(y, A)$, so that $\mu(Y(\omega), A)=g(Y(\omega))$. We need to show that $g(Y)$ is $\sigma(Y)$-measurable and that $$\int_{B}g(Y)\,d\mathbb{P}=\int_{B}\mathbb{1}_{A}(X)\,d\mathbb{P}$$ for all $B\in\sigma(Y)$.

The first point is clear: for any Borel set $C$, $g^{-1}(C)\in\mathcal{S}$ (measurability of $y\mapsto\mu(y,A)$ follows from Fubini's theorem), and thus $(g\circ Y)^{-1}(C)=Y^{-1}(g^{-1}(C))\in\sigma(Y)$.

But I have a problem with the second one. Let $B\in\sigma(Y)$; then $$B=\{\omega\in\Omega:Y(\omega)\in C\}\ \text{for some}\ C\in\mathcal{S},$$ so
\begin{align*}
\mathbb{E}(g(Y);B)=\mathbb{E}(g(Y)\mathbb{1}_{C}(Y))&=\int_{C}g(y)\,\mu_{Y}(dy)\\
&=\int_{C}g(y)\Big(\int f(x,y)\,dx\Big)\,dy\\
&=\int_{C}\mu(y, A)\Big(\int f(x,y)\,dx\Big)\,dy\\
&=\int_{C}\int_{A}f(x,y)\,dx\,dy\\
&=\int_{C}\int \mathbb{1}_{A}(x)f(x,y)\,dx\,dy,
\end{align*}

where $\mu_{Y}$ denotes the law of $Y$, whose density is the marginal $\int f(x,y)\,dx$.

The problem is that I don't know how to show that this last integral equals $\int_{B}\mathbb{1}_{A}(X)\,d\mathbb{P}$, because of the set $C$. Without the restriction to $C$, we could easily compute $$\mathbb{E}(\mathbb{1}_{A}(X))=\int\mathbb{1}_{A}(x)\,\mu_{X}(dx)=\int\mathbb{1}_{A}(x)\Big(\int f(x,y)\,dy\Big)\,dx,$$ where $\mu_{X}$ is the law of $X$, and then by Fubini's theorem $$\mathbb{E}(\mathbb{1}_{A}(X))=\int\int\mathbb{1}_{A}(x)f(x,y)\,dy\,dx=\int\int\mathbb{1}_{A}(x)f(x,y)\,dx\,dy.$$ But how can I incorporate the restriction $Y\in C$?
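For what it's worth, the identity I am trying to prove, $\mathbb{E}(\mu(Y,A)\mathbb{1}_{C}(Y))=\mathbb{P}(X\in A,\, Y\in C)$, can at least be spot-checked by Monte Carlo; a minimal sketch, assuming the same example density as above:

```python
# Monte Carlo spot-check of E[mu(Y, A) 1_C(Y)] = P(X in A, Y in C), using the
# assumed density f(x, y) = (1+y) exp(-x(1+y)) exp(-y), A = (0.5, 1.5), C = (1, 3).
import numpy as np

rng = np.random.default_rng(0)
Y = rng.exponential(1.0, size=10**6)  # Y ~ Exp(1)
X = rng.exponential(1.0 / (1.0 + Y))  # X | Y = y ~ Exp(1+y); numpy takes scale = 1/rate

a, b = 0.5, 1.5
in_C = (Y > 1.0) & (Y < 3.0)
mu_YA = np.exp(-(1 + Y) * a) - np.exp(-(1 + Y) * b)  # closed form of mu(Y, A)

print(np.mean(mu_YA * in_C))              # E[ mu(Y, A) 1_C(Y) ]
print(np.mean((X > a) & (X < b) & in_C))  # P(X in A, Y in C); agrees up to MC error
```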

Also, it would be much appreciated if someone could show me how to verify the second criterion of a regular conditional distribution.

Thank you!

Edit 1:

Okay, I think I figured it out. I will post my solution as an answer and leave it up for a few days in case there are mistakes or further discussion, and then accept it.

Best Answer

Okay, I got confused at the beginning and was overthinking it. Here is the proof:

Recall that if $X$ and $Y$ have joint density $f(x,y)>0$ and $\mathbb{E}|g(X)|<\infty$, then $\mathbb{E}(g(X)\mid Y)=h(Y)$, where $$h(y)=\dfrac{\int g(x)f(x,y)\,dx}{\int f(x,y)\,dx}.$$
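This recalled fact can itself be spot-checked through its defining property $\mathbb{E}(g(X)\mathbb{1}_{C}(Y))=\mathbb{E}(h(Y)\mathbb{1}_{C}(Y))$; a quick sketch with $g(x)=x$ and the same example density as in the question, for which $h(y)=\mathbb{E}(X\mid Y=y)=1/(1+y)$:

```python
# Monte Carlo spot-check of E(g(X) | Y) = h(Y) for g(x) = x, with the assumed
# density f(x, y) = (1+y) exp(-x(1+y)) exp(-y), where h(y) = 1/(1+y).
# We test the defining property E[X 1_C(Y)] = E[h(Y) 1_C(Y)] for C = (1, 3).
import numpy as np

rng = np.random.default_rng(1)
Y = rng.exponential(1.0, size=10**6)  # Y ~ Exp(1)
X = rng.exponential(1.0 / (1.0 + Y))  # X | Y = y ~ Exp(1+y)

in_C = (Y > 1.0) & (Y < 3.0)
print(np.mean(X * in_C), np.mean(in_C / (1.0 + Y)))  # should agree up to MC error
```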

So if $g(x)=\mathbb{1}_{A}(x)$, then with $\mathcal{G}=\sigma(Y)$ we get $\mathbb{P}(X\in A\mid\mathcal{G})=\mathbb{E}(\mathbb{1}_{A}(X)\mid\mathcal{G})=h(Y),$ where $$h(y):=\dfrac{\int_{A}f(x,y)\,dx}{\int f(x,y)\,dx}.$$

So for each fixed $A$, identifying $\mu(y, A)=h(y)$, so that $\mu(Y(\omega),A)=h(Y(\omega))$, verifies the first criterion of a regular conditional distribution.

To check the second one: countable additivity is immediate, since the integral in the numerator is countably additive over disjoint unions (by monotone convergence). Nonnegativity is also immediate, since $f>0$ gives $\mu(y, A)\geq 0$ directly. Finally, $\mu(y, S)=1$ is clear, since then the numerator and denominator coincide. Note that the denominator $\int f(x,y)\,dx$ is the marginal density of $Y$, so it is positive and finite for a.e. $y$; for such $y$, $A\mapsto\mu(y, A)$ is therefore a probability measure on $(S,\mathcal{S})$, which is exactly the "for a.e. $\omega$" requirement.

Thus the second criterion is verified, and $\mu(Y(\omega), A)$ is a regular conditional distribution for $X$ given $\sigma(Y)$.
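Not a proof, but as a final sanity check, the measure properties of $A\mapsto\mu(y, A)$ for a fixed $y$ can be illustrated numerically, again with the example density from the question:

```python
# Numerical spot-check of criterion (2) for a fixed y: total mass 1 and
# additivity over disjoint intervals, for the assumed density
# f(x, y) = (1+y) exp(-x(1+y)) exp(-y) on x, y > 0.
import numpy as np
from scipy.integrate import quad

def mu(y, a, b):
    """mu(y, A) for the interval A = (a, b)."""
    f = lambda x: (1 + y) * np.exp(-x * (1 + y)) * np.exp(-y)
    num, _ = quad(f, a, b)
    den, _ = quad(f, 0, np.inf)
    return num / den

y = 2.0
print(mu(y, 0, np.inf))                                    # total mass: ~1.0
print(mu(y, 0.2, 0.7), mu(y, 0.2, 0.5) + mu(y, 0.5, 0.7))  # additivity: equal values
```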
