Proof that $H(X) \leq \log(|A|)$ (Shannon entropy)


The full question states:

"Show that $$H(X) \leq \log(|A|)$$
with equality if and only if $P_X$ is uniform. Hint: use the Gibbs or log-sum inequality "

I used "$A$" for the alphabet here. My lecturer uses a curly "X" ($\mathcal{X}$), which I didn't know how to type in MathJax.

The equality part is rather straightforward.

Suppose $A=\{1,\dots,m\}$ and $P_X(x)=\frac{1}{m}$ for all $x$. Then $|A|=m$, and
$$H(X)=-\sum_{x=1}^m \frac{1}{m}\log\left(\frac{1}{m}\right)=-\log\left(\frac{1}{m}\right)=\log(m)=\log(|A|).$$
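As a quick sanity check of the equality case, here is a short numeric sketch (the alphabet size $m = 8$ is an arbitrary choice; natural log is used, but any base works as long as it is consistent):

```python
import math

# Uniform distribution on an alphabet of size m: entropy should equal log(m).
m = 8
p = [1.0 / m] * m
H = -sum(px * math.log(px) for px in p)
print(H, math.log(m))  # the two values agree
```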

It is the inequality part that I am not sure how to show. Thanks for any help!

Best Answer

For any probability distributions $p, q$ on $A$, Gibbs' inequality states that
$$ H(p)=-\sum_{k\in A} p(k)\log p(k)\leq -\sum_{k\in A} p(k)\log q(k), $$
with equality iff $p=q$. Take $q$ to be the uniform distribution on $A$, so $q(k)=\frac{1}{|A|}$ for every $k$. Then
$$ H(p)\leq -\sum_{k\in A} p(k)\log \frac{1}{|A|}=\log|A|\sum_{k\in A} p(k)=\log|A|, $$
with equality iff $p$ is the uniform distribution, as desired.
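The bound can also be checked numerically: sampling many random distributions on an alphabet of size $m$ and confirming $H(p) \leq \log m$ for each (a sketch; the choice $m = 6$ and the number of trials are arbitrary):

```python
import math
import random

def entropy(p):
    """Shannon entropy in nats, with the convention 0 * log 0 = 0."""
    return -sum(px * math.log(px) for px in p if px > 0)

random.seed(0)
m = 6
for _ in range(1000):
    # Draw random positive weights and normalize to get a distribution on m symbols.
    w = [random.random() for _ in range(m)]
    total = sum(w)
    p = [x / total for x in w]
    # The bound H(p) <= log|A|, with a tiny tolerance for floating point.
    assert entropy(p) <= math.log(m) + 1e-12
```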
