Maximum value of Kullback–Leibler divergence


Shannon entropy is maximal when all outcomes are equally likely (uncertainty is highest when all possible events are equiprobable). That is, as stated on Wikipedia:
$$\mathrm {H} _{n}(p_{1},\ldots ,p_{n})\leq \mathrm {H} _{n}\left({\frac {1}{n}},\ldots ,{\frac {1}{n}}\right)=\log _{b}(n).$$
The Kullback–Leibler divergence from $Q$ to $P$ is defined to be
$$D_{\text{KL}}(P\parallel Q)=\sum _{x\in {\mathcal {X}}}P(x)\log \left({\frac {P(x)}{Q(x)}}\right).$$
Let $\sum_x P(x)=\sum_x Q(x)=1$. Is there a known expression for the maximum value of $D_{\text{KL}}$, like the one that exists for $H_n$?
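
As a quick numeric sanity check of the entropy bound above (a minimal sketch; the `entropy` helper below is just an illustrative name):

```python
# Minimal numeric sketch: the base-2 Shannon entropy of random distributions
# never exceeds log2(n), which the uniform distribution attains.
import numpy as np

def entropy(p):
    """H(P) in bits; terms with p(x) = 0 contribute 0 by convention."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

n = 4
print(entropy(np.full(n, 1 / n)))      # log2(4) = 2.0 bits, the maximum
rng = np.random.default_rng(0)
for _ in range(5):
    assert entropy(rng.dirichlet(np.ones(n))) <= np.log2(n) + 1e-9
```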

Best Answer

You should be more specific. If you consider the maximum with respect to $Q(x)$, then no maximum exists: pick any $x \in \mathcal{X}$ with $P(x) > 0$ and consider the term $P(x)\log \left( \frac{P(x)}{Q(x)}\right)$; you can choose $Q(x)$ arbitrarily small, and hence make that term, and the whole sum, arbitrarily large.
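
Here is a minimal numeric sketch of that blow-up (the `kl_divergence` helper is just an illustrative name):

```python
# Minimal numeric sketch: with P fixed, shrinking Q(x0) toward zero makes
# D_KL(P || Q) grow without bound (roughly like 0.5 * log2(1/eps) here).
import numpy as np

def kl_divergence(p, q):
    """D_KL(P || Q) in bits; assumes q(x) > 0 wherever p(x) > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0                        # terms with p(x) = 0 contribute 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

p = [0.5, 0.5]
for eps in [1e-1, 1e-3, 1e-6, 1e-9]:
    print(eps, kl_divergence(p, [eps, 1 - eps]))  # diverges as eps -> 0
```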

If you want to maximize with respect to $P(x)$, it is rather straightforward to check that the maximum is $$\log_2\left(\frac{1}{\min \limits_{x \in \mathcal{X}} Q(x)}\right).$$ Indeed, since $P(x) \le 1$, each term satisfies $P(x)\log_2\left(\frac{P(x)}{Q(x)}\right) \le P(x)\log_2\left(\frac{1}{\min_{x' \in \mathcal{X}} Q(x')}\right)$, and summing over $x$ gives the bound. The maximizing $P(x)$ is the distribution that places probability one on the symbol minimizing $Q(x)$ and zero elsewhere, which attains the bound.
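
And a matching numeric sketch (same caveats as above) showing that the point mass on the minimizer of $Q$ attains this value, while randomly drawn $P$ stay below it:

```python
# Numeric check: a point mass on argmin_x Q(x) attains log2(1 / min_x Q(x)),
# and randomly drawn P never exceed that bound.
import numpy as np

def kl_divergence(p, q):                # same helper as in the sketch above
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

q = np.array([0.1, 0.3, 0.6])
bound = np.log2(1 / q.min())            # log2(10), about 3.32 bits

p_star = np.zeros_like(q)
p_star[np.argmin(q)] = 1.0              # probability one on Q's minimizer
print(kl_divergence(p_star, q), bound)  # both about 3.32

rng = np.random.default_rng(0)
for _ in range(5):
    assert kl_divergence(rng.dirichlet(np.ones_like(q)), q) <= bound + 1e-9
```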