Mutual information between dependent and independent variables

entropy, information theory

This question is a follow-up to a previously asked question. Assume we have three random variables $A$, $B$, and $C$, where $A$ and $B$ are independent (i.e. $I(A \,; B)=0$), but the relation of $C$ to $A$ and $B$ is arbitrary. The question is how to simplify the following:
$$
I(A \,; B,C).
$$

Best Answer

Independence of $A$ and $B$ means that $H(A \mid B) = H(A)$. Then, using the definition of mutual information and substituting this equality: $$ I(A \,; B, C) = H(A) - H(A \mid B, C) = H(A \mid B) - H(A \mid B, C) = I(A \,; C \mid B), $$ where the last step is the definition of conditional mutual information.
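
As a sanity check, here is a small numerical sketch (not part of the original answer) that builds a hypothetical joint distribution $p(a,b,c) = p(a)\,p(b)\,p(c \mid a,b)$, so that $A$ and $B$ are independent by construction, and verifies that $I(A \,; B, C) = I(A \,; C \mid B)$. The marginals and the conditional for $C$ are arbitrary choices made only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Marginals for independent A and B (binary, chosen arbitrarily).
p_a = np.array([0.3, 0.7])
p_b = np.array([0.6, 0.4])

# C depends arbitrarily on (A, B): a random conditional p(c | a, b).
p_c_given_ab = rng.random((2, 2, 3))
p_c_given_ab /= p_c_given_ab.sum(axis=2, keepdims=True)

# Joint p(a, b, c) = p(a) p(b) p(c | a, b), so I(A; B) = 0 by construction.
p = p_a[:, None, None] * p_b[None, :, None] * p_c_given_ab

def entropy(q):
    """Shannon entropy in bits of a (possibly multi-dimensional) distribution."""
    q = q[q > 0]
    return -np.sum(q * np.log2(q))

# I(A; B, C) = H(A) + H(B, C) - H(A, B, C)
H_A = entropy(p.sum(axis=(1, 2)))
H_BC = entropy(p.sum(axis=0))
H_ABC = entropy(p)
I_A_BC = H_A + H_BC - H_ABC

# I(A; C | B) = H(A, B) + H(B, C) - H(B) - H(A, B, C)
H_AB = entropy(p.sum(axis=2))
H_B = entropy(p.sum(axis=(0, 2)))
I_A_C_given_B = H_AB + H_BC - H_B - H_ABC

print(f"I(A; B, C)  = {I_A_BC:.6f} bits")
print(f"I(A; C | B) = {I_A_C_given_B:.6f} bits")
assert np.isclose(I_A_BC, I_A_C_given_B)
```

The two printed quantities agree, as expected from the identity above; with a different choice of $p(c \mid a, b)$ the common value changes, but the equality continues to hold as long as $A$ and $B$ are independent.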
