Mutual information with two independent variables

Tags: entropy, information-theory

Let us say we have three random variables, $A$, $B$, and $C$, where $A$ and $B$ are independent. I know that $$I(A;B) = 0.$$ Also, my intuition is that $$I(A;B,C) = I(A;C).$$
However, I can neither prove nor disprove this. Is my intuition correct?

UPD: the question has been answered. See also a follow-up question.

Best Answer

This is not true. Take $A,B$ to be two independent Rademacher random variables, i.e., uniform on $\{-1,1\}$; and let $C=AB$.

$A,B$ are independent, so $I(A;B) = 0$. Similarly, $A,C$ are independent (conditioned on $A = a$, the variable $C = aB$ is still uniform on $\{-1,1\}$), so $I(A;C) = 0$. However, since $B^2 = 1$ we have $BC = AB^2 = A$, i.e., $A$ is a deterministic function of the pair $(B,C)$; hence $I(A;(B,C)) = H(A) = 1$ bit $\neq 0$.
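One way to see where the intuition breaks is the standard chain rule for mutual information (an addition for intuition, not part of the original answer):
$$I(A;B,C) = I(A;C) + I(A;B \mid C).$$
Independence of $A$ and $B$ only forces the unconditional term $I(A;B)$ to vanish; it says nothing about $I(A;B \mid C)$. In this counterexample $I(A;C) = 0$ while $I(A;B \mid C) = H(A \mid C) - H(A \mid B,C) = 1 - 0 = 1$ bit.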

(Instead of Rademacher variables, you can equivalently take $A,B$ to be two independent uniform random bits and let $C = A \oplus B$ be their XOR; under the identification of $\{-1,1\}$ with $\{0,1\}$, multiplication of signs becomes XOR of bits, so this is the same example.)
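As a quick numerical sanity check (my addition, not from the answer above), the sketch below enumerates the four equally likely outcomes of $(A,B)$ with $C = AB$ and computes the three mutual informations exactly from the resulting joint distribution; the helper name `mutual_information` is just illustrative.

```python
# Verify the counterexample: A, B independent Rademacher, C = A*B.
from collections import Counter
from itertools import product
from math import log2

def mutual_information(pairs):
    """Exact mutual information (in bits) between X and Y, given a list of
    equally likely (x, y) outcomes."""
    n = len(pairs)
    joint = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum(
        (c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
        for (x, y), c in joint.items()
    )

# The four equally likely outcomes of the independent pair (A, B), with C = A*B.
outcomes = [(a, b, a * b) for a, b in product([-1, 1], repeat=2)]

print(mutual_information([(a, b) for a, b, c in outcomes]))       # I(A;B)   = 0.0
print(mutual_information([(a, c) for a, b, c in outcomes]))       # I(A;C)   = 0.0
print(mutual_information([(a, (b, c)) for a, b, c in outcomes]))  # I(A;B,C) = 1.0
```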
