Solved – Mutual information equals conditional mutual information

conditional probability, information theory, interaction, mutual information

Consider three random variables $X,Y,Z$. It is standard that $I(X,Y|Z)=0$ if and only if $X,Y$ are conditionally independent given $Z$.
If instead we require $I(X,Y|Z)=I(X,Y)$, what do we get? What properties do the probability distributions that satisfy such a constraint have?

Best Answer

You are in luck: this is a well-studied quantity under a different name. Your condition is equivalent to the interaction information (a measure of synergy) being equal to zero. Consider the quantity $S(X,Y,Z) = I(X,Y|Z) - I(X,Y)$. In [2] this is identical to Eq. (10) once you rewrite $I(X,Y|Z) = I(X, (Y,Z)) - I(X,Z)$, so that $S(X,Y,Z) = I(X, (Y,Z)) - I(X,Z) - I(X,Y)$. This quantity can be positive, negative, or zero. If it is zero, the information that $Y$ and $Z$ have about $X$ is additive. If it is negative, the information that $Y$ and $Z$ have about $X$ is redundant, and if it is positive the information is synergistic (i.e., $Y$ and $Z$ together have information about $X$ that cannot be gleaned from either one individually).
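
To make the sign interpretation concrete, here is a minimal sketch (not part of the original answer; the function names are ad hoc) that computes $S(X,Y,Z)$ directly from a joint probability table for two textbook cases: $Z = X \oplus Y$ with $X,Y$ independent fair bits gives $S = +1$ bit (synergy), while $Y = Z = X$ gives $S = -1$ bit (redundancy).

```python
from collections import defaultdict
from math import log2

def mutual_information(pxy):
    """I(X,Y) in bits from a dict {(x, y): p}."""
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), p in pxy.items():
        px[x] += p
        py[y] += p
    return sum(p * log2(p / (px[x] * py[y]))
               for (x, y), p in pxy.items() if p > 0)

def conditional_mutual_information(pxyz):
    """I(X,Y|Z) in bits from a dict {(x, y, z): p}."""
    pz = defaultdict(float)
    pxz, pyz = defaultdict(float), defaultdict(float)
    for (x, y, z), p in pxyz.items():
        pz[z] += p
        pxz[(x, z)] += p
        pyz[(y, z)] += p
    # I(X,Y|Z) = sum p(x,y,z) log [ p(z) p(x,y,z) / (p(x,z) p(y,z)) ]
    return sum(p * log2(pz[z] * p / (pxz[(x, z)] * pyz[(y, z)]))
               for (x, y, z), p in pxyz.items() if p > 0)

def synergy_score(pxyz):
    """S(X,Y,Z) = I(X,Y|Z) - I(X,Y)."""
    pxy = defaultdict(float)
    for (x, y, z), p in pxyz.items():
        pxy[(x, y)] += p
    return conditional_mutual_information(pxyz) - mutual_information(pxy)

# XOR: X, Y independent fair bits, Z = X xor Y  ->  S = +1 bit (synergy)
xor = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}
# Copy: X a fair bit, Y = Z = X  ->  S = -1 bit (redundancy)
copy = {(x, x, x): 0.5 for x in (0, 1)}

print(synergy_score(xor))   # 1.0
print(synergy_score(copy))  # -1.0
```

In the XOR case neither $Y$ nor $Z$ alone tells you anything about $X$, yet together they determine it; in the copy case either one alone already tells you everything, so conditioning on $Z$ destroys the mutual information between $X$ and $Y$.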

The implications for the probability distributions that satisfy this constraint are a little more complicated. In this paper, they relate the synergy score to maximum-entropy distributions satisfying certain constraints. If this quantity is negative, that implies something about the existence of common ancestors according to this paper, but I don't think that result has implications in your case.

Another way to look at this condition is via the Venn diagram here. Your condition is equivalent to the central gray region being zero (that is just the triple information/interaction information again).
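
For completeness, the identity behind that picture follows from the chain rule for mutual information (note that sign conventions for the interaction information differ between authors, so the sign attached to the central region is convention-dependent):
$$I(X,(Y,Z)) = I(X,Y) + I(X,Z|Y) = I(X,Z) + I(X,Y|Z),$$
so
$$S(X,Y,Z) = I(X,Y|Z) - I(X,Y) = I(X,Z|Y) - I(X,Z),$$
and the constraint $I(X,Y|Z)=I(X,Y)$ is exactly the statement that this symmetric quantity, the central region of the diagram, vanishes.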
