[Math] Calculate channel capacity and maximum mutual information

Tags: conditional-probability, entropy, probability, probability-theory

I want to know when the mutual information $I(X,Y)$ is maximal, that is, when it equals the channel capacity

$C=\max_{P(X)} I(X,Y)$

where

$I(X,Y)=H(X)-H(X\mid Y)=H(Y)-H(Y\mid X)$

Now, if we have two random variables with some specific distributions, we can calculate their mutual information easily, right? But to get the maximum, we may change the probability distributions, and of course the question arises: how many times should we change them? There should be some limit. For example, we have the following table

[table of the joint distribution $P(X,Y)$]

With the given probability distributions we can calculate, for instance, $H(Y\mid X)$; carrying out that computation gives $13/8$, so we have

$I(X,Y)=H(Y)-H(Y\mid X)=2-13/8=3/8$
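For what it's worth, the quoted values ($H(Y)=2$, $H(Y\mid X)=13/8$) match the well-known joint distribution from Cover & Thomas, Example 2.2.1. Since the original table image is not shown, here is a minimal Python sketch that assumes that table and reproduces $I(X,Y)=3/8$:

```python
import numpy as np

def H(p):
    """Shannon entropy in bits; zero entries contribute nothing."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Assumed joint distribution P(X, Y): rows = values of X, columns = values of Y.
# The original table is not shown; this one (Cover & Thomas, Example 2.2.1)
# reproduces the quoted values H(Y) = 2 and H(Y|X) = 13/8.
P = np.array([
    [1/8,  1/16, 1/16, 1/4],
    [1/16, 1/8,  1/16, 0.0],
    [1/32, 1/32, 1/16, 0.0],
    [1/32, 1/32, 1/16, 0.0],
])

p_x = P.sum(axis=1)   # marginal P(X) = (1/2, 1/4, 1/8, 1/8)
p_y = P.sum(axis=0)   # marginal P(Y) = (1/4, 1/4, 1/4, 1/4), so H(Y) = 2

H_Y = H(p_y)                                                    # 2.0
H_Y_given_X = sum(px * H(row / px) for px, row in zip(p_x, P))  # 1.625 = 13/8
print(H_Y - H_Y_given_X)                                        # 0.375 = 3/8
```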

But is it the maximum? How can I calculate the maximum mutual information? Should I assign different probabilities, or what? Thanks in advance.

Best Answer

The capacity of a channel depends not on the input or the output but on the channel itself. In the standard probabilistic model with the memoryless property, the channel determines the conditional (transition) probabilities $P(Y\mid X)$; furthermore, if both the input $X$ and the output $Y$ have finite alphabets, the conditional probabilities can be represented as a matrix (the transition matrix).
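As a concrete illustration, here is a hypothetical transition matrix for a binary symmetric channel with crossover probability $0.1$ (not from the question; each row is $P(Y\mid X=x)$ and must sum to $1$):

```python
import numpy as np

# Hypothetical transition matrix W[x, y] = P(Y = y | X = x) for a binary
# symmetric channel with crossover probability 0.1; each row sums to 1.
W = np.array([
    [0.9, 0.1],
    [0.1, 0.9],
])
assert np.allclose(W.sum(axis=1), 1.0)
```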

Hence, given (fixed) the transition probability matrix $P(Y\mid X)$, for each possible input distribution $P(X)$ you can obtain the joint $P(X,Y)$ as well as the output marginal $P(Y)$, and from these you can compute the mutual information $I(X,Y)$. The task is to find, over all possible $P(X)$, the one that maximizes $I(X,Y)$; that maximum is the capacity.
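Numerically, a standard way to carry out this maximization is the Blahut–Arimoto algorithm. Below is a minimal sketch (one common approach, not necessarily the answerer's), reusing the hypothetical binary symmetric channel from above:

```python
import numpy as np

def mutual_information(p_x, W):
    """I(X,Y) in bits for input distribution p_x and channel matrix W[x, y] = P(y|x)."""
    P = p_x[:, None] * W                 # joint P(X, Y)
    p_y = P.sum(axis=0)                  # output marginal P(Y)
    mask = P > 0
    indep = (p_x[:, None] * p_y[None, :])[mask]
    return float((P[mask] * np.log2(P[mask] / indep)).sum())

def blahut_arimoto(W, tol=1e-12, max_iter=100_000):
    """Approximate C = max over P(X) of I(X,Y) for a fixed transition matrix W."""
    p_x = np.full(W.shape[0], 1.0 / W.shape[0])    # start from the uniform input
    for _ in range(max_iter):
        p_y = p_x @ W                              # current output marginal
        q = (p_x[:, None] * W) / p_y[None, :]      # posterior q(x|y)
        # Multiplicative update: p(x) proportional to exp(sum_y W[x,y] * ln q(x|y))
        log_q = np.where(W > 0, np.log(np.where(q > 0, q, 1.0)), 0.0)
        r = np.exp((W * log_q).sum(axis=1))
        p_new = r / r.sum()
        if np.max(np.abs(p_new - p_x)) < tol:
            return p_new, mutual_information(p_new, W)
        p_x = p_new
    return p_x, mutual_information(p_x, W)

W = np.array([[0.9, 0.1],
              [0.1, 0.9]])                         # hypothetical BSC from above
p_star, C = blahut_arimoto(W)
print(p_star, C)   # ~[0.5, 0.5], C = 1 - H(0.1) ~ 0.531 bits
```

For the binary symmetric channel the maximizing input is uniform and the capacity has the closed form $1-H(0.1)\approx 0.531$ bits, which is a useful sanity check on the iteration.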
