Is “conditional independence” of $\sigma$-algebras implied by “set-wise conditional independence” of $\sigma$-algebras

conditional-expectation, probability-theory

Let $(\Omega,\mathcal{F},\mathbb{P})$ be a probability space, let $\mathcal{G}_1,\mathcal{G}_2,\mathcal{H}$ be sub-$\sigma$-algebras of $\mathcal{F}$, and suppose that for all $G_1 \in \mathcal{G}_1$, $G_2 \in \mathcal{G}_2$ and $H \in \mathcal{H}$ with $\mathbb{P}(H)>0$, $G_1$ and $G_2$ are conditionally independent given $H$. Does it follow that
$$ \mathbb{P}(G_1 \cap G_2 | \mathcal{H}) \overset{\mathbb{P}\textrm{-a.s.}}{=} \mathbb{P}(G_1|\mathcal{H})\mathbb{P}(G_2|\mathcal{H}) $$
for all $G_1 \in \mathcal{G}_1$ and $G_2 \in \mathcal{G}_2$?

A more extensive discussion of the problem, and of my attempts so far, follows.



Fix a probability space $(\Omega,\mathcal{F},\mathbb{P})$ and sub-$\sigma$-algebras $\mathcal{G}_1,\mathcal{G}_2,\mathcal{H}$ of $\mathcal{F}$.

Definition of conditional independence:

We say that $\mathcal{G}_1$ and $\mathcal{G}_2$ are conditionally $\mathbb{P}$-independent given $\mathcal{H}$ if the following equivalent statements [1, Sec. 3.2, p131] hold:

  • for all $G_1 \in \mathcal{G}_1$ and $G_2 \in \mathcal{G}_2$, $\ \mathbb{P}(G_1 \cap G_2 | \mathcal{H}) \overset{\mathbb{P}\textrm{-a.s.}}{=} \mathbb{P}(G_1|\mathcal{H})\mathbb{P}(G_2|\mathcal{H}) \,$;
  • for all $G_1 \in \mathcal{G}_1$, there is an $\mathcal{H}$-measurable version of $\mathbb{P}(G_1|\sigma(\mathcal{G}_2 \cup \mathcal{H}))$;
  • for all $G_2 \in \mathcal{G}_2$, there is an $\mathcal{H}$-measurable version of $\mathbb{P}(G_2|\sigma(\mathcal{G}_1 \cup \mathcal{H}))$.

[1] M. M. Rao, Probability Theory with Applications, Academic Press, Inc., 1984.

The heuristic interpretation is: "If I have the knowledge of $\mathcal{H}$, the whole of $\mathcal{H}$ and nothing but $\mathcal{H}$, then from this standpoint $\mathcal{G}_1$ and $\mathcal{G}_2$ are independent."

Note that conditional independence of $\mathcal{G}_1$ and $\mathcal{G}_2$ given $\mathcal{H}$ neither implies nor is implied by independence of $\mathcal{G}_1$ and $\mathcal{G}_2$.


Definition of "set-wise conditional independence":

I will say that $\mathcal{G}_1$ and $\mathcal{G}_2$ are set-wise conditionally $\mathbb{P}$-independent given $\mathcal{H}$ if
$$ \hspace{23mm} \mathbb{P}(G_1 \cap H)\mathbb{P}(G_2 \cap H) \ = \ \mathbb{P}(G_1 \cap G_2 \cap H)\mathbb{P}(H) \hspace{20mm} (\ast) $$
for all $G_1 \in \mathcal{G}_1$, $G_2 \in \mathcal{G}_2$ and $H \in \mathcal{H}$.

For any $E \in \mathcal{F}$ with $\mathbb{P}(E)>0$, define the probability measure $\mathbb{P}_E$ by $\mathbb{P}_E(A)=\frac{\mathbb{P}(A \cap E)}{\mathbb{P}(E)}$. Then the formula $(\ast)$ is equivalent to the statement, "if $\mathbb{P}(H)>0$ then $G_1$ and $G_2$ are $\mathbb{P}_H$-independent".
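Since everything here is finitary, this equivalence is easy to sanity-check numerically. The following sketch (the finite space and the particular events $G_1$, $G_2$, $H$ are illustrative choices) confirms that dividing each side of $(\ast)$ by $\mathbb{P}(H)^2$ yields exactly the two sides of the $\mathbb{P}_H$-independence identity:

```python
import random

random.seed(0)

# A small finite probability space with random weights (illustrative only).
n = 8
weights = [random.random() for _ in range(n)]
total = sum(weights)
p = [w / total for w in weights]

G1, G2, H = {0, 1, 2, 3}, {0, 1, 4, 5}, {0, 2, 4, 6}

def prob(E):
    return sum(p[i] for i in E)

def prob_cond(A, E):
    # The conditioned measure P_E(A) = P(A ∩ E) / P(E).
    return prob(A & E) / prob(E)

# The two sides of (*):
lhs = prob(G1 & H) * prob(G2 & H)
rhs = prob(G1 & G2 & H) * prob(H)

# Dividing by P(H)^2 turns (*) into the P_H-independence identity
# P_H(G1 ∩ G2) = P_H(G1) P_H(G2):
assert abs(lhs / prob(H) ** 2 - prob_cond(G1, H) * prob_cond(G2, H)) < 1e-12
assert abs(rhs / prob(H) ** 2 - prob_cond(G1 & G2, H)) < 1e-12
```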

Note that if $\mathcal{G}_1$ and $\mathcal{G}_2$ are set-wise conditionally $\mathbb{P}$-independent given $\mathcal{H}$, then in particular $\mathcal{G}_1$ and $\mathcal{G}_2$ are $\mathbb{P}$-independent (take $H:=\Omega$ in $(\ast)$). The heuristic interpretation of set-wise conditional independence is meant to be: "$\mathcal{G}_1$ and $\mathcal{G}_2$ are independent, and no amount of knowledge from $\mathcal{H}$ will change this".

Remark. If $\mathcal{G}_1$ and $\mathcal{G}_2$ are set-wise conditionally $\mathbb{P}$-independent given $\mathcal{H}$, then for any $H_0 \in \mathcal{H}$ with $\mathbb{P}(H_0)>0$, $\mathcal{G}_1$ and $\mathcal{G}_2$ are set-wise conditionally $\mathbb{P}_{H_0}$-independent given $\mathcal{H}$.


THE QUESTION.

Given what the intuition behind set-wise conditional independence is meant to be, a natural question is:

Are the following statements equivalent?

  • $\mathcal{G}_1$ and $\mathcal{G}_2$ are set-wise conditionally $\mathbb{P}$-independent given $\mathcal{H}$.
  • For every sub-$\sigma$-algebra $\tilde{\mathcal{H}}$ of $\mathcal{H}$, $\mathcal{G}_1$ and $\mathcal{G}_2$ are conditionally $\mathbb{P}$-independent given $\tilde{\mathcal{H}}$.

This question easily simplifies to:

Does set-wise conditional independence given $\mathcal{H}$ imply conditional independence given $\mathcal{H}$?

[To see this: If set-wise conditional independence given $\mathcal{H}$ is equivalent to conditional independence given every sub-$\sigma$-algebra of $\mathcal{H}$, then in particular set-wise conditional independence given $\mathcal{H}$ implies conditional independence given $\mathcal{H}$ itself. Conversely, suppose set-wise conditional independence always implies conditional independence. In one direction, set-wise conditional independence given $\mathcal{H}$ obviously implies set-wise conditional independence — and hence, by the supposition, conditional independence — given every sub-$\sigma$-algebra of $\mathcal{H}$. In the other direction, conditional independence given $\sigma(\{H\})$ for each $H \in \mathcal{H}$ yields, upon integrating over $H$, the formula $(\ast)$ for that $H$ (the case $\mathbb{P}(H)=0$ being trivial), so set-wise conditional independence given $\mathcal{H}$ follows.]

Intuitively, I think the answer ought to be yes. And yet, I'm having trouble proving it.

(One special case where the answer should clearly be yes is when $\mathcal{H}$ has a generator consisting of countably many mutually disjoint sets; but I am interested in the general case.)


A further characterisation of the question:

Let us say that $\mathcal{G}_1$ and $\mathcal{G}_2$ have uncorrelated conditional probabilities under $\mathbb{P}$ given $\mathcal{H}$ if for all $G_1 \in \mathcal{G}_1$ and $G_2 \in \mathcal{G}_2$ the random variables $\mathbb{P}(G_1|\mathcal{H})$ and $\mathbb{P}(G_2|\mathcal{H})$ are uncorrelated random variables over $(\Omega,\mathcal{F},\mathbb{P})$.

Then yet another equivalent formulation of the question (as a general question about general probability spaces and sub-$\sigma$-algebras) turns out to be the following:

Does set-wise conditional independence given $\mathcal{H}$ imply uncorrelated conditional probabilities given $\mathcal{H}$?

To see the equivalence, we give the following two lemmas:

Lemma 1. If $\mathcal{G}_1$ and $\mathcal{G}_2$ are conditionally $\mathbb{P}$-independent given $\mathcal{H}$, then the following statements are equivalent:

  • $\mathcal{G}_1$ and $\mathcal{G}_2$ are $\mathbb{P}$-independent;
  • $\mathcal{G}_1$ and $\mathcal{G}_2$ have uncorrelated conditional probabilities under $\mathbb{P}$ given $\mathcal{H}$.

Lemma 2. If $\mathcal{G}_1$ and $\mathcal{G}_2$ are set-wise conditionally $\mathbb{P}$-independent given $\mathcal{H}$, then the following statements are equivalent:

  • $\mathcal{G}_1$ and $\mathcal{G}_2$ are conditionally $\mathbb{P}$-independent given $\mathcal{H}$;
  • for every $H \in \mathcal{H}$ with $\mathbb{P}(H)>0$, $\mathcal{G}_1$ and $\mathcal{G}_2$ have uncorrelated conditional probabilities under $\mathbb{P}_H$ given $\mathcal{H}$.

If set-wise conditional independence implies conditional independence, then by Lemma 1, set-wise conditional independence also implies uncorrelated conditional probabilities. And if set-wise conditional independence implies uncorrelated conditional probabilities, then applying this to $\mathbb{P}_H$ for all $H \in \mathcal{H}$ with $\mathbb{P}(H)>0$ (which we can do by virtue of the Remark further above), we have by Lemma 2 that set-wise conditional independence implies conditional independence.

Proof of Lemma 1. Assume $\mathcal{G}_1$ and $\mathcal{G}_2$ are conditionally $\mathbb{P}$-independent given $\mathcal{H}$. For any $G_1 \in \mathcal{G}_1$ and $G_2 \in \mathcal{G}_2$, we have
$$ \mathbb{P}(G_1)\mathbb{P}(G_2) \ = \ \mathbb{E}_\mathbb{P}[\mathbb{P}(G_1|\mathcal{H})]\mathbb{E}_\mathbb{P}[\mathbb{P}(G_2|\mathcal{H})] $$
and
$$ \mathbb{P}(G_1 \cap G_2) \ = \ \mathbb{E}_\mathbb{P}[\mathbb{P}(G_1|\mathcal{H})\mathbb{P}(G_2|\mathcal{H})], $$

the first by the tower property and the second by the tower property together with conditional independence. Comparing the two displays, $\mathbb{P}(G_1 \cap G_2)=\mathbb{P}(G_1)\mathbb{P}(G_2)$ holds for all $G_1,G_2$ if and only if $\mathbb{P}(G_1|\mathcal{H})$ and $\mathbb{P}(G_2|\mathcal{H})$ are uncorrelated for all $G_1,G_2$. $\square$

Proof of Lemma 2. Assume $\mathcal{G}_1$ and $\mathcal{G}_2$ are set-wise conditionally $\mathbb{P}$-independent given $\mathcal{H}$. Clearly the following statements are equivalent:

  • $\mathcal{G}_1$ and $\mathcal{G}_2$ are conditionally $\mathbb{P}$-independent given $\mathcal{H}$;
  • for all $G_1 \in \mathcal{G}_1$ and $G_2 \in \mathcal{G}_2$, and every $H \in \mathcal{H}$ with $\mathbb{P}(H)>0$,
    $$ \frac{1}{\mathbb{P}(H)}\int_H \mathbb{P}(G_1 \cap G_2|\mathcal{H}) \, d\mathbb{P} \ = \ \frac{1}{\mathbb{P}(H)}\int_H \mathbb{P}(G_1|\mathcal{H})\mathbb{P}(G_2|\mathcal{H}) \, d\mathbb{P}. $$

Now in this last equality,
$$ \mathrm{RHS} \ = \ \mathbb{E}_{\mathbb{P}_H\!}[\mathbb{P}_H(G_1|\mathcal{H})\mathbb{P}_H(G_2|\mathcal{H})] $$
and using set-wise conditional independence,
$$ \mathrm{LHS} \ = \ \mathbb{P}_H(G_1 \cap G_2) \ = \ \mathbb{P}_H(G_1)\mathbb{P}_H(G_2) \ = \ \mathbb{E}_{\mathbb{P}_H\!}[\mathbb{P}_H(G_1|\mathcal{H})]\,\mathbb{E}_{\mathbb{P}_H\!}[\mathbb{P}_H(G_2|\mathcal{H})]. $$

So the equality holds for all such $G_1$, $G_2$ and $H$ precisely when, for every $H \in \mathcal{H}$ with $\mathbb{P}(H)>0$, the random variables $\mathbb{P}_H(G_1|\mathcal{H})$ and $\mathbb{P}_H(G_2|\mathcal{H})$ are uncorrelated under $\mathbb{P}_H$. $\square$


Further thoughts towards solving the problem (and more generally towards understanding set-wise conditional independence).

1. One question we could ask is whether, in general, conditional probabilities of independent events are uncorrelated, or perhaps even independent, random variables. The answer is no.

(The independence version of this question has already been addressed on Math.SE.)

Take the probability space $\{1,2\} \times \{1,2\}$ with the uniform distribution. The events $A:=\{\textrm{1st coordinate is }1\}$ and $B:=\{\textrm{2nd coordinate is }1\}$ are obviously independent; but letting $C=\{(1,1)\}$, we have
\begin{align*}
\mathrm{Prob}(A|C) = \mathrm{Prob}(B|C) &= 1 \\
\mathrm{Prob}(A|C^c) = \mathrm{Prob}(B|C^c) &= \tfrac{1}{3}.
\end{align*}

The events $\{\mathrm{Prob}(A|\sigma(\{C\}))=1\}$ and $\{\mathrm{Prob}(B|\sigma(\{C\}))=1\}$ are certainly not independent events, but are in fact the same non-trivial event $C$ itself. Moreover, in this example, the random variables $\mathrm{Prob}(A|\sigma(\{C\}))$ and $\mathrm{Prob}(B|\sigma(\{C\}))$ are strictly positively correlated:
\begin{align*}
\mathrm{Exp}[\mathrm{Prob}(A|\sigma(\{C\})).\mathrm{Prob}(B|\sigma(\{C\}))] \ &> \ \mathrm{Prob}(C).\mathrm{Prob}(A|C).\mathrm{Prob}(B|C) \\
&= \ \tfrac{1}{4} \ = \ \mathrm{Prob}(A).\mathrm{Prob}(B).
\end{align*}
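For concreteness, this computation can be checked by brute force (a sketch; the space, events and names are exactly those of the example above):

```python
from itertools import product

# The uniform probability space {1,2} x {1,2} from the example.
omega = [w for w in product([1, 2], [1, 2])]
p = {w: 0.25 for w in omega}

A = {w for w in omega if w[0] == 1}   # 1st coordinate is 1
B = {w for w in omega if w[1] == 1}   # 2nd coordinate is 1
C = {(1, 1)}
Cc = set(omega) - C

def prob(E):
    return sum(p[w] for w in E)

def cond(E, F):
    return prob(E & F) / prob(F)

# A and B are independent:
assert abs(prob(A & B) - prob(A) * prob(B)) < 1e-12

# E[ Prob(A|sigma({C})) . Prob(B|sigma({C})) ], computed atom by atom:
expectation = prob(C) * cond(A, C) * cond(B, C) + prob(Cc) * cond(A, Cc) * cond(B, Cc)

# 1/4 . 1 . 1  +  3/4 . (1/3) . (1/3)  =  1/3  >  1/4  =  Prob(A) . Prob(B)
assert abs(expectation - 1 / 3) < 1e-12
assert expectation > prob(A) * prob(B)
```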

2. We have the following:

Lemma 3. Let $G_1,G_2 \in \mathcal{F}$ be $\mathbb{P}$-independent events. Then the set $\Lambda$ of all $H \in \mathcal{F}$ that are $\mathbb{P}$-independent of $G_1$ and fulfil $(\ast)$ forms a $\lambda$-system on $\Omega$.

This follows from the continuity of $\mathbb{P}$ (which gives closure under increasing unions), the assumed independence of $G_1$ and $G_2$ (which gives $\Omega \in \Lambda$), and the following:

Lemma 4. For any $G_1,G_2,H_1,H_2 \in \mathcal{F}$ with $H_1 \subset H_2$, if

  • at least one of $G_1$ and $G_2$ is both $\mathbb{P}$-independent of $H_1$ and $\mathbb{P}$-independent of $H_2$, and
  • $(\ast)$ is fulfilled with $H:=H_1$ and with $H:=H_2$

then $(\ast)$ is fulfilled with $H:=H_2 \setminus H_1$.

Proof. First note that for any $G_1,G_2,H \in \mathcal{F}$ with $G_1$ and $H$ being $\mathbb{P}$-independent, $(\ast)$ is equivalent to the statement that $G_1$ and $G_2 \cap H$ are $\mathbb{P}$-independent. So now assume the conditions of the Lemma, with $H_1$ and $H_2$ each being $\mathbb{P}$-independent of $G_1$, from which it follows that $H_2 \setminus H_1$ is $\mathbb{P}$-independent of $G_1$. We know that $G_2 \cap H_1$ and $G_2 \cap H_2$ are each $\mathbb{P}$-independent of $G_1$, from which it follows that $G_2 \cap (H_2 \setminus H_1)$ is $\mathbb{P}$-independent of $G_1$. Hence the result.

3. Let us now consider a relatively basic case of set-wise conditional independence:

Proposition 5. Let $\mathcal{G}_1,\mathcal{G}_2,\mathcal{G}_3$ be mutually $\mathbb{P}$-independent sub-$\sigma$-algebras of $\mathcal{F}$.

  • For any $\,G_1,\tilde{G}_1 \in \mathcal{G}_1$, $\ G_2,\tilde{G}_2 \in \mathcal{G}_2\,$ and $\,\tilde{G}_3 \in \mathcal{G}_3$, $(\ast)$ is fulfilled with $H:=\tilde{G}_1 \cap \tilde{G}_2 \cap \tilde{G}_3$.
  • Let $\mathcal{H}$ be either $\sigma(\mathcal{G}_1 \cup \mathcal{G}_3)$ or $\sigma(\mathcal{G}_2 \cup \mathcal{G}_3)$. Then $\mathcal{G}_1$ and $\mathcal{G}_2$ are set-wise conditionally $\mathbb{P}$-independent given $\mathcal{H}$, and are also conditionally $\mathbb{P}$-independent given any sub-$\sigma$-algebra $\tilde{\mathcal{H}}$ of $\mathcal{H}$.

Proof. The first part is a straightforward verification: both sides of the equation $(\ast)$ simplify to
$$ \mathbb{P}(G_1 \cap \tilde{G}_1)\mathbb{P}(G_2 \cap \tilde{G}_2)\mathbb{P}(\tilde{G}_1)\mathbb{P}(\tilde{G}_2)\mathbb{P}(\tilde{G}_3)^2. $$
Now let $\mathcal{H}=\sigma(\mathcal{G}_2 \cup \mathcal{G}_3)$ (the case $\mathcal{H}=\sigma(\mathcal{G}_1 \cup \mathcal{G}_3)$ is symmetric). The set-wise conditional independence follows from the first part of the proposition by Lemma 3 (using the Dynkin $\pi$-$\lambda$ theorem). As for the conditional independence given a sub-$\sigma$-algebra $\tilde{\mathcal{H}}$ of $\mathcal{H}$: for any $G_1 \in \mathcal{G}_1$ and $G_2 \in \mathcal{G}_2$, since $G_1$ is independent of $\sigma(\{G_2\} \cup \tilde{\mathcal{H}})$ we have
$$ \mathbb{P}(G_1 \cap G_2|\tilde{\mathcal{H}}) \ = \ \mathbb{P}(G_1)\mathbb{P}(G_2|\tilde{\mathcal{H}}) \ = \ \mathbb{P}(G_1|\tilde{\mathcal{H}})\mathbb{P}(G_2|\tilde{\mathcal{H}}). $$
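The first part of Proposition 5, and the resulting set-wise conditional independence, can be verified exhaustively on the smallest non-trivial instance: three independent fair coins, with $\mathcal{G}_i$ generated by the $i$-th coordinate and $\mathcal{H}=\sigma(\mathcal{G}_2 \cup \mathcal{G}_3)$ (a sketch; the fair-coin model is an illustrative choice):

```python
from itertools import combinations, product

# Three independent fair coins: omega = {0,1}^3 with the uniform measure.
omega = [w for w in product([0, 1], repeat=3)]
p = 1 / 8

def prob(E):
    return len(E) * p

def sigma_from_partition(atoms):
    # The sigma-algebra generated by a finite partition: all unions of atoms.
    sets = []
    for r in range(len(atoms) + 1):
        for combo in combinations(atoms, r):
            sets.append(frozenset().union(*combo) if combo else frozenset())
    return sets

# G_i is generated by the i-th coordinate; H = sigma(G_2 ∪ G_3) is generated
# by the partition of omega into (x_2, x_3)-atoms.
G1_sets = sigma_from_partition([frozenset(w for w in omega if w[0] == b) for b in (0, 1)])
G2_sets = sigma_from_partition([frozenset(w for w in omega if w[1] == b) for b in (0, 1)])
H_atoms = [frozenset(w for w in omega if (w[1], w[2]) == pair)
           for pair in product([0, 1], repeat=2)]
H_sets = sigma_from_partition(H_atoms)

# Exhaustively verify (*): P(G1 ∩ H) P(G2 ∩ H) = P(G1 ∩ G2 ∩ H) P(H).
for A in G1_sets:
    for B in G2_sets:
        for H in H_sets:
            assert abs(prob(A & H) * prob(B & H) - prob(A & B & H) * prob(H)) < 1e-12
```

With $|\mathcal{G}_1|=|\mathcal{G}_2|=4$ and $|\mathcal{H}|=16$ this checks all $256$ instances of $(\ast)$.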

Definition. I will say that $\mathcal{G}_1$ and $\mathcal{G}_2$ are trivially set-wise conditionally $\mathbb{P}$-independent given $\mathcal{H}$ if $\mathcal{G}_1$ and $\mathcal{G}_2$ are $\mathbb{P}$-independent and there exists a sub-$\sigma$-algebra $\mathcal{G}_3$ of $\mathcal{F}$ that is $\mathbb{P}$-independent of $\sigma(\mathcal{G}_1 \cup \mathcal{G}_2)$, such that $\mathcal{H}$ is a sub-$\sigma$-algebra of either $\sigma(\mathcal{G}_1 \cup \mathcal{G}_3)$ or $\sigma(\mathcal{G}_2 \cup \mathcal{G}_3)$.

So Proposition 5 gives an affirmative answer to my question in the case of "trivial" set-wise conditional independence; but I don't know if it's possible to have "non-trivial" cases of set-wise conditional independence.

Best Answer

The answer is yes.

Proof. Suppose $\mathcal{G}_1$ and $\mathcal{G}_2$ are set-wise conditionally independent given $\mathcal{H}$. First note that for any finite sub-$\sigma$-algebra $\tilde{\mathcal{H}}$ of $\mathcal{H}$, if we let $H_1,\ldots,H_n$ be a partition of $\Omega$ that generates $\tilde{\mathcal{H}}$ then for each $i$ with $\mathbb{P}(H_i) > 0$, for each $G_1 \in \mathcal{G}_1$ and $G_2 \in \mathcal{G}_2$ we have $$ \mathbb{P}(G_1 \cap G_2 | \tilde{\mathcal{H}})(\omega) = \mathbb{P}_{H_i}(G_1 \cap G_2) = \mathbb{P}_{H_i}(G_1)\mathbb{P}_{H_i}(G_2) = \mathbb{P}(G_1| \tilde{\mathcal{H}})(\omega)\mathbb{P}(G_2 | \tilde{\mathcal{H}})(\omega) $$ for $\mathbb{P}$-a.a. $\omega \in H_i$.
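As a sanity check on this finite-partition step, on a finite space one can compute the conditional probability given a finite partition atom by atom and confirm the defining property of conditional probability (a sketch; the four-point space and the events are illustrative):

```python
from itertools import combinations, product

# Conditional probability given a finite partition: on each atom H_i with
# positive probability, P(G | partition)(w) = P(G ∩ H_i) / P(H_i) for w in H_i.
omega = [w for w in product([1, 2], [1, 2])]
p = {w: 0.25 for w in omega}

def prob(E):
    return sum(p[w] for w in E)

G = {w for w in omega if w[0] == 1}
C = frozenset({(1, 1)})
partition = [C, frozenset(set(omega) - C)]

cond_prob = {}
for atom in partition:
    value = prob(G & atom) / prob(atom)
    for w in atom:
        cond_prob[w] = value

# Defining property: integrating P(G | partition) over any H in the generated
# sigma-algebra recovers P(G ∩ H).  Check it on every union of atoms.
for r in range(len(partition) + 1):
    for combo in combinations(partition, r):
        H = frozenset().union(*combo) if combo else frozenset()
        integral = sum(cond_prob[w] * p[w] for w in H)
        assert abs(integral - prob(G & H)) < 1e-12
```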

Now fix $G_1 \in \mathcal{G}_1$ and $G_2 \in \mathcal{G}_2$, and fix versions of the conditional probabilities $\mathbb{P}(G_1|\mathcal{H})$, $\mathbb{P}(G_2|\mathcal{H})$ and $\mathbb{P}(G_1 \cap G_2|\mathcal{H})$. Let $\{B_n\}_{n \geq 1}$ be a countable generator of $\mathcal{B}([0,1])$, and for each $n$ let $\mathcal{B}_n=\sigma(B_1,\ldots,B_n)$ and let $$ \tilde{\mathcal{H}}_n \ = \ \sigma(\,\mathbb{P}(G_1|\mathcal{H})^{-1}(\mathcal{B}_n) \,\cup\, \mathbb{P}(G_2|\mathcal{H})^{-1}(\mathcal{B}_n) \,\cup\, \mathbb{P}(G_1 \cap G_2|\mathcal{H})^{-1}(\mathcal{B}_n)\,). $$ Let $\tilde{\mathcal{H}}_\infty=\sigma(\tilde{\mathcal{H}}_n:n \geq 1) \subset \mathcal{H}$. Note that $\mathbb{P}(G_1|\mathcal{H})$, $\mathbb{P}(G_2|\mathcal{H})$ and $\mathbb{P}(G_1 \cap G_2|\mathcal{H})$ are all $\tilde{\mathcal{H}}_\infty$-measurable, and hence are $\mathbb{P}$-almost surely equal to $\mathbb{P}(G_1|\tilde{\mathcal{H}}_\infty)$, $\mathbb{P}(G_2|\tilde{\mathcal{H}}_\infty)$ and $\mathbb{P}(G_1 \cap G_2|\tilde{\mathcal{H}}_\infty)$ respectively. Now for each $n$, since $\tilde{\mathcal{H}}_n$ is finite we have $$ \mathbb{P}(G_1 \cap G_2 | \tilde{\mathcal{H}}_n) \ \overset{\mathbb{P}\textrm{-a.s.}}{=} \ \mathbb{P}(G_1 | \tilde{\mathcal{H}}_n)\mathbb{P}(G_2 | \tilde{\mathcal{H}}_n), $$ so Lévy's upward theorem gives the same identity with $\tilde{\mathcal{H}}_\infty$ in place of $\tilde{\mathcal{H}}_n$, and hence with $\mathcal{H}$ in place of $\tilde{\mathcal{H}}_n$. $\square$
