$X,Y$ are independent iff the regular conditional distribution of $X$ given $Y$ is almost surely the same as the distribution of $X$.

measure-theory

I want to prove that two scalar random variables $X,Y$ are independent iff the regular conditional distribution of $X$ given $Y$ is almost surely the same as the distribution of $X$. A simple result to prove, except for the… "regular" conditional distribution.

The definition employed for the regular conditional distribution is the following.
Let $(\Omega,\mathcal F,P)$ be a probability space, $(\Omega',\mathcal F')$ a measurable space, and $X:\Omega\to\Omega'$ a measurable map. Then $\mu:\Omega\times\mathcal F'\rightarrow \mathbb R$ is a regular conditional distribution of $X$ given $\mathcal G$ if:

1) For every $A\in \mathcal F'$, $\mu(\cdot,A)=E(I_{\{X\in A\}}\mid\mathcal G)$ almost surely, where $\mathcal G\subseteq \mathcal F$ is a sub-$\sigma$-algebra; in this case $\mathcal G=\sigma(Y)$, the $\sigma$-algebra of preimages under $Y$ of the Borel sets.

2) For almost every $\omega\in \Omega$, $\mu(\omega,\cdot)$ is a probability measure.

I find this definition very hard to use to prove almost anything.
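For concreteness, in the simplest case (assuming $Y$ is purely discrete, taking values $y_1,y_2,\dots$ with $P\{Y=y_i\}>0$), a version of $\mu$ can be written down explicitly:

$$\mu(\omega,A)=\sum_{i}\frac{P\{X\in A,\,Y=y_i\}}{P\{Y=y_i\}}\,I_{\{Y(\omega)=y_i\}},\qquad A\in\mathcal F'.$$

For each fixed $A$ this function of $\omega$ is $\sigma(Y)$-measurable and satisfies $\int_G\mu(\cdot,A)\,dP=P(\{X\in A\}\cap G)$ for every $G\in\sigma(Y)$, which is condition 1), and for each fixed $\omega$ it is a probability measure in $A$, which is condition 2).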

Best Answer

Take $A=(-\infty , x]$. If $X$ and $Y$ are independent, then $\mu(\omega, A)=E(I_{\{X \in A\}} \mid Y)=P(X^{-1}(A))=P\{X\leq x\}$ almost surely, so the conditional distribution coincides with the distribution of $X$. Conversely, if this condition holds, integrate both sides over $\{Y\leq y\}$ to get $P\{X\leq x,\,Y\leq y\} =P\{X\leq x\}P\{Y\leq y\}$.
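To spell out the two steps of this sketch with the definition above: independence makes $I_{\{X\le x\}}$ independent of $\sigma(Y)$, so

$$E\!\left(I_{\{X\le x\}}\mid Y\right)=E\!\left(I_{\{X\le x\}}\right)=P\{X\le x\}\quad\text{a.s.}$$

For the converse, $\{Y\le y\}\in\sigma(Y)$, so integrating the assumed identity $\mu(\cdot,A)=P\{X\le x\}$ a.s. over this set and using the defining property of conditional expectation gives

$$P\{X\le x,\,Y\le y\}=\int_{\{Y\le y\}}E\!\left(I_{\{X\le x\}}\mid Y\right)dP=\int_{\{Y\le y\}}P\{X\le x\}\,dP=P\{X\le x\}\,P\{Y\le y\}.$$

Since the joint distribution function factors for all $x,y$, the random variables $X$ and $Y$ are independent.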