Calculate capacity of a channel with given transition probability matrix

information theory · probability · random variables

I have a channel with input $X$, which can take values in $\{1,2,\ldots, n\}$, and output $Y$, which can take values in $\{0,1\}$. The transition probabilities are as shown in the matrix below:

$\left[\begin{array}{cc}
p_{1} & 1-p_{1} \\
p_{2} & 1-p_{2} \\
p_{3} & 1-p_{3} \\
p_{4} & 1-p_{4} \\
\vdots & \vdots \\
p_{n} & 1-p_{n}
\end{array}\right]$

Here, rows correspond to inputs and columns correspond to the two possible outputs.

I am interested in a general value of $n$ for the number of inputs. Is there a way of computing this capacity analytically for general $n$?

Best Answer

Let $a_i = P(X=x_i)$, $b_j = P(Y=j)$ for $j \in \{0,1\}$, $p_i = P(Y=0 \mid X=x_i)$, and $q_i = 1-p_i$.

Then $b_0 = {\bf a} \cdot {\bf p}$, $b_1 = {\bf a} \cdot {\bf q}$, and

$$\begin{align} I(X;Y) &= \sum_{x,y} P_x P_{y|x} \log \frac{P_{y|x}}{P_y} \\ &=\sum_{i} a_i \left(p_i \log \frac{p_i}{b_0} + q_i \log \frac{q_i}{b_1}\right)\\ &= \sum_{i} a_i \, D({\bf p}_i \,\|\, {\bf b}), \end{align} $$ where ${\bf p}_i = (p_i, q_i)$ and ${\bf b} = (b_0, b_1)$. Assume WLOG that the rows are sorted so that $p_1 \le p_2 \le \cdots \le p_n$.
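As a sanity check, the mutual information formula above is easy to evaluate numerically. A minimal sketch in Python (the function name and the handling of zero probabilities are my own choices):

```python
import math

def mutual_information(a, p):
    """I(X;Y) in bits for input distribution a, where p[i] = P(Y=0 | X=i)."""
    b0 = sum(ai * pi for ai, pi in zip(a, p))   # b_0 = a . p
    b1 = 1.0 - b0                               # b_1 = a . q
    def term(weight, num, den):
        # weight * num * log2(num/den), with the convention 0 log 0 = 0
        return weight * num * math.log2(num / den) if weight > 0 and num > 0 else 0.0
    return sum(term(ai, pi, b0) + term(ai, 1.0 - pi, b1)
               for ai, pi in zip(a, p))

# Two noiseless rows used with equal probability give 1 bit:
print(mutual_information([0.5, 0.5], [0.0, 1.0]))  # 1.0
```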

It is left as an exercise to the reader to show, using the convexity properties of the relative entropy $D(\cdot \| \cdot)$, that this is maximized by an input distribution of the form ${\bf a}=(a_1, 0, 0, \cdots, 0, a_n)$: only the two extreme rows receive positive mass. The channel is then equivalent to a binary-input channel.
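Given this reduction, the capacity for general $n$ comes down to optimizing a single weight between the two extreme rows. A hedged sketch (the grid search and function names are mine; since $I(X;Y)$ is concave in $\bf a$, a one-dimensional search over the split is enough):

```python
import math

def mutual_information(a, p):
    """I(X;Y) in bits for input distribution a, where p[i] = P(Y=0 | X=i)."""
    b0 = sum(ai * pi for ai, pi in zip(a, p))
    b1 = 1.0 - b0
    def term(w, num, den):
        return w * num * math.log2(num / den) if w > 0 and num > 0 else 0.0
    return sum(term(ai, pi, b0) + term(ai, 1.0 - pi, b1)
               for ai, pi in zip(a, p))

def capacity(p, grid=10_000):
    """Capacity of the n-input binary-output channel with rows (p_i, 1 - p_i).
    By the argument above, only the rows with minimal and maximal p_i are used,
    so we optimise the weight between them on a grid."""
    lo, hi = min(p), max(p)
    return max(mutual_information([t / grid, 1 - t / grid], [lo, hi])
               for t in range(grid + 1))

print(capacity([0.0, 0.25, 0.5, 1.0]))  # 1.0: the extreme rows form a noiseless channel
```

A closed form then follows from the standard capacity expression for a binary-input, binary-output channel with crossover rows $(p_1, q_1)$ and $(p_n, q_n)$.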
