Solved – Entropy of a function of independent random variables

cross entropy, entropy, information theory, mutual information

Suppose I have an operator (function) $f(\cdot)$ which takes three arguments $x,y,z$, all of which are independent discrete random variables whose probability mass functions I know. Further, let's say I have an equality of the following form:

$$
w = f(x,y,z)
$$
where $w$ is also a discrete random variable (whose PMF I also know). Let us imagine that $f(\cdot)$ is something simple, like

$$
f(x,y,z) = x+y -z
$$

Now, imagine we are interested in the Shannon entropy of this relation, and we apply it thus:

$$
H(w) = H(f(x,y,z)).
$$

Is the following true:

\begin{align}
H(w) &= H(f(x,y,z)) \\
&= H(x,y,z) \\
&= H(x)+ H(y) + H(z). \\
\end{align}
Consequently,

\begin{align}
H(w) &= H(x)+ H(y) + H(z). \\
\end{align}

Reasoning:

  1. We can drop $f(\cdot)$ because the functional form affects neither the distributions nor the independence of the random variables.
  2. We can express the joint entropy as a sum because the random variables are all independent, hence any mutual-information terms are zero.

Is this correct? And could anyone recommend good resources on taking the entropy of functions of random variables?
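As a quick way to test the claimed equality numerically (a sketch using arbitrary toy PMFs, not values from the question), one can build the PMF of $w = x + y - z$ from the PMFs of $x$, $y$, $z$ using independence and compare $H(w)$ with $H(x) + H(y) + H(z)$:

```python
# Numerical check with arbitrary toy PMFs (assumptions, not from the question):
# push independent x, y, z through f(x, y, z) = x + y - z and compare
# H(w) against H(x) + H(y) + H(z).
from itertools import product
import math

def entropy(pmf):
    """Shannon entropy (in bits) of a dict {value: probability}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

# Arbitrary toy distributions.
px = {0: 0.5, 1: 0.5}
py = {0: 0.25, 1: 0.75}
pz = {0: 0.1, 1: 0.9}

# PMF of w = x + y - z, using independence of x, y, z.
pw = {}
for (x, p1), (y, p2), (z, p3) in product(px.items(), py.items(), pz.items()):
    w = x + y - z
    pw[w] = pw.get(w, 0.0) + p1 * p2 * p3

print("H(w)           =", entropy(pw))                              # ~1.58 bits
print("H(x)+H(y)+H(z) =", entropy(px) + entropy(py) + entropy(pz))  # ~2.28 bits
# On these toy PMFs H(w) is strictly smaller, so the claimed equality fails.
```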

Update 1

Consider this example: we have two independent random variables, $X$ and $Y$, and a third, $Z$, defined by the relationship:
$$
Z = Y + X
$$

Let's say we can approximate the entropy of $Z$, $H[Z]$, which is the left-hand side. The right-hand side, however, takes the form:

$$
H[Y+X]
$$

Now, let us suppose that both $Y$ and $X$ are normally distributed, which means we are taking the (differential) entropy of a sum. Hence, if:

$$
\begin{align}
X &\sim N(\mu _{X},\sigma _{X}^{2}) \\
Y &\sim N(\mu _{Y},\sigma _{Y}^{2})
\end{align}
$$
then that means that:

$$
X+Y \sim N(\mu _{X}+\mu _{Y},\sigma _{X}^{2}+\sigma _{Y}^{2}).
$$

Consequently, because we know that the differential entropy of a Gaussian random variable with variance $\sigma^2$ is:

$$
\frac{1}{2}\log(2\pi e\sigma ^{2})
$$
this would imply that the differential entropy of the sum of two independent Gaussian random variables is:
$$
\frac{1}{2}\log \left [2\pi e \left(\sigma _{X}^{2}+\sigma _{Y}^{2} \right) \right].
$$

Best Answer

Regarding the first point in your reasoning: in the general case $f$ cannot be dropped, because $f$ may map different triples $(x,y,z)$ to the same value of $w$, in which case $H(w) < H(x,y,z)$.

Regarding your second point, what can be said with certainty is that $H(w) \leq H(x) + H(y) + H(z)$ when $x$, $y$, and $z$ are independent.
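As a sketch of the standard argument behind this bound (the first step uses the fact that a function of a random variable cannot have more entropy than the variable itself, the second uses independence):

$$
\begin{align}
H(w) = H(f(x,y,z)) &\leq H(x,y,z) \\
&= H(x) + H(y) + H(z),
\end{align}
$$

with equality in the first step only when $f$ is invertible on the support of $(x,y,z)$.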

Please refer to these notes for a proof: https://www2.isye.gatech.edu/~yxie77/ece587/SumRV.pdf

Hope this helps
