Question about when Channel Capacity is zero

coding-theory, computer science, information theory, matrices

There is a channel with input alphabet $A$ and output alphabet $B$. Suppose that the mutual information $I(A,B)|_{p_1,…,p_n} = 0$ for some input frequencies $p_1,…,p_n$. Show that the channel capacity is zero.

I know that when the mutual information is zero, the input and output are statistically independent, i.e., the two systems (the inputs and the outputs of the channel) are independent. In this case, what would the matrix of transition probabilities look like? Do the transition probabilities depend only on the output?

Best Answer

The channel capacity depends on two things:

  1. The channel transition probabilities, $\mathbb{P}(y|x), y \in B, x \in A$;
  2. The input distribution $p(x), x\in A$.

Note that the transition probabilities are a property of the channel, and do not depend on the input distribution. (At least, nothing in the problem suggests otherwise.)

Now, by assumption, there is an input distribution $\{p(x)\}_{x\in A}=\{p_1, p_2, \ldots, p_n\}$ for which the mutual information is zero. This happens if and only if $\mathbb{P}(y|x) = \mathbb{P}(y)$ for every $x$ with $p(x)>0$, i.e., the output is independent of the input. Assuming all the $p_i$ are positive, this forces $\mathbb{P}(y|x)=\mathbb{P}(y)$ for every $x\in A$. But since the transition probabilities do not change with the input distribution, $\mathbb{P}(y|x)=\mathbb{P}(y)$ then holds under *any* input distribution, so the mutual information is zero for every input distribution. Therefore the capacity, which is the maximum of the mutual information over all input distributions, is zero.
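A quick numerical sketch of the argument: a channel with $\mathbb{P}(y|x)=\mathbb{P}(y)$ has identical rows in its transition matrix, and the mutual information then comes out zero no matter which input distribution we try. The channel matrix and distributions below are made up for illustration.

```python
import numpy as np

def mutual_information(p_x, P_y_given_x):
    """I(X;Y) in bits, given input distribution p_x and channel matrix
    P_y_given_x (rows indexed by x, columns by y)."""
    joint = p_x[:, None] * P_y_given_x            # P(x, y) = p(x) P(y|x)
    p_y = joint.sum(axis=0)                       # marginal P(y)
    mask = joint > 0                              # skip zero-probability terms
    ratio = joint[mask] / (p_x[:, None] * p_y[None, :])[mask]
    return float((joint[mask] * np.log2(ratio)).sum())

# Hypothetical channel with identical rows: P(y|x) = P(y) for every input x,
# so the output carries no information about the input.
P = np.array([[0.2, 0.5, 0.3],
              [0.2, 0.5, 0.3]])

# Mutual information is ~0 for every input distribution we try,
# so the maximum over input distributions (the capacity) is 0.
for p_x in ([0.5, 0.5], [0.9, 0.1], [0.3, 0.7]):
    print(mutual_information(np.array(p_x), P))
```

By contrast, a channel whose rows differ (e.g. the identity matrix) gives strictly positive mutual information for some input distribution, and hence positive capacity.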
