Information Theory – Mutual Information and Bivariate Function of Independent Variables

it.information-theory

Let $X, Y, Z$ be discrete random variables such that the pair $(X, Y)$ is independent of $Z$, while $X$ and $Y$ may be dependent on each other. For the mutual information, we have $I(X; Y,Z) = I(X;Y)$. Now consider $I(X; f(Y,Z))$ for some deterministic function $f$. Does $I(X; f(Y,Z))$ depend on $Z$? If not, is there a way to express $I(X; f(Y,Z))$ in terms of $X$ and $Y$ only?
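(A minimal numerical sketch of the premise $I(X; Y,Z) = I(X;Y)$ when $(X,Y)$ is independent of $Z$; the particular joint distribution below is an arbitrary illustration, not part of the question.)

```python
# Hypothetical check: I(X; Y, Z) = I(X; Y) when the pair (X, Y) is independent of Z.
import itertools
import math

# Arbitrary joint distribution p(x, y) with X, Y dependent, and an independent Z ~ q(z).
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}
q_z = {0: 0.7, 1: 0.3}

# Joint p(x, y, z) = p(x, y) * q(z) because (X, Y) is independent of Z.
p_xyz = {(x, y, z): p_xy[(x, y)] * q_z[z]
         for (x, y), z in itertools.product(p_xy, q_z)}

def mutual_information(joint):
    """I(A; B) for a joint distribution given as {(a, b): prob}."""
    p_a, p_b = {}, {}
    for (a, b), pr in joint.items():
        p_a[a] = p_a.get(a, 0.0) + pr
        p_b[b] = p_b.get(b, 0.0) + pr
    return sum(pr * math.log(pr / (p_a[a] * p_b[b]))
               for (a, b), pr in joint.items() if pr > 0)

i_xy = mutual_information(p_xy)                                        # I(X; Y)
i_x_yz = mutual_information({(x, (y, z)): pr                           # I(X; (Y, Z)),
                             for (x, y, z), pr in p_xyz.items()})      # pair (Y, Z) as one variable
print(i_xy, i_x_yz)  # the two values agree up to floating-point error
```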

Best Answer

$I(X;f(Y, Z))$ can depend on $Z$ (or, more specifically, on the distribution of $Z$). Consider the following example: $Z \sim B(p)$, $X \sim B(0.5)$, and $Y=X$, so that $(X,Y) \perp \!\!\! \perp Z$. Let $F=\max(Y, Z)$, the output of a deterministic function of $Y$ and $Z$. Then $(X, F)$ has the following joint distribution:

  1. $P(X=0,F=0) = (1-p)/2$.
  2. $P(X=0,F=1) = p/2$.
  3. $P(X=1,F=0) = 0$.
  4. $P(X=1,F=1) = 1/2$.

From the table, $P(F=1) = (1+p)/2$; conditioned on $X=0$ we have $F=Z$, while conditioned on $X=1$ we have $F=1$, so $H(F \mid X) = \tfrac{1}{2}H(Z)$. Hence $I(X;F) = H(F) - H(F \mid X) = \log(2) + \frac{p}{2}\log(p) - \frac{1+p}{2}\log(1+p)$, which depends on $p$.
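(A small sketch, not part of the original answer, that computes $I(X;F)$ directly from the joint table above and compares it with the closed form; the grid of $p$ values is an arbitrary choice for illustration.)

```python
# Compare I(X; F) computed from the joint table with the closed-form expression.
import math

def mi_from_joint(joint):
    """I(X; F) for a joint distribution given as {(x, f): prob}."""
    p_x, p_f = {}, {}
    for (x, f), pr in joint.items():
        p_x[x] = p_x.get(x, 0.0) + pr
        p_f[f] = p_f.get(f, 0.0) + pr
    return sum(pr * math.log(pr / (p_x[x] * p_f[f]))
               for (x, f), pr in joint.items() if pr > 0)

for p in [0.1, 0.25, 0.5, 0.9]:
    # Joint distribution of (X, F) with X ~ B(0.5), Y = X, Z ~ B(p), F = max(Y, Z).
    joint = {(0, 0): (1 - p) / 2, (0, 1): p / 2, (1, 0): 0.0, (1, 1): 0.5}
    closed_form = math.log(2) + (p / 2) * math.log(p) - ((1 + p) / 2) * math.log(1 + p)
    print(p, mi_from_joint(joint), closed_form)  # the two columns match, and vary with p
```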
