When is a convergent sum absolutely convergent?

entropy, information theory, summation

Let $P$ and $Q$ be two probability mass functions on a countably infinite alphabet $A$. Suppose that the relative entropy $D(P \Vert Q) = \sum_{x \in A} P(x)\log \frac{P(x)}{Q(x)}$ is finite. I am asked to show that $\sum_{x \in A} P(x) \left\vert \log \frac{P(x)}{Q(x)} \right\vert$ is finite as well. Thus far, I have considered writing the sum as the difference of a positive sum and a negative sum,
$$\sum_{x \in A} P(x) \left\vert \log \frac{P(x)}{Q(x)} \right\vert = \sum_{x : P(x) > Q(x)}P(x) \log \frac{P(x)}{Q(x)} - \sum_{x : Q(x) > P(x)} P(x)\log \frac{P(x)}{Q(x)} = A - B,$$
and noting that
$$D(P \Vert Q) = A + B < \infty,$$
so either $A$ and $B$ are both convergent or both divergent. I have not been able to find a reason why they must both be convergent so that $A - B$ is finite. I figure it may have to do with the fact that all the terms in each sum have the same sign, but can't see how that helps.

Best Answer

Let $f_x := P_x/Q_x$. Since you know $0 \le D < \infty,$ the question reduces to arguing that $$ \infty > \sum_x Q_x f_x |\log f_x| - D = -2\sum_{x : f_x \le 1} Q_x f_x \log f_x. $$
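To see the identity, subtract term by term: the summands with $f_x > 1$ cancel, and for $f_x \le 1$ one has $|\log f_x| - \log f_x = -2\log f_x$, so
$$\sum_x Q_x f_x|\log f_x| - D = \sum_x Q_x f_x\bigl(|\log f_x| - \log f_x\bigr) = -2\sum_{x : f_x \le 1} Q_x f_x \log f_x.$$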

But $z \mapsto -z\log z$ lies in $[0,1/e]$ on $[0,1]$ (and of course $f_x \ge 0$), so $$ \sum_{x : f_x \le 1} Q_x \cdot (-f_x \log f_x) \le Q(f_x \le 1)/e \le 1/e.$$
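As a quick numerical sanity check (not part of the argument), here is a minimal Python sketch on a made-up finite alphabet; the distributions `P` and `Q` below are arbitrary illustrative choices. It computes $D(P\Vert Q)$, the absolute sum, and the negative part, and compares the negative part against the $1/e$ bound.

```python
import numpy as np

# Hypothetical example distributions on a small alphabet (illustration only).
P = np.array([0.5, 0.25, 0.125, 0.0625, 0.0625])
Q = np.array([0.25, 0.25, 0.25, 0.125, 0.125])

log_ratio = np.log(P / Q)                 # log f_x with f_x = P(x)/Q(x)

D = np.sum(P * log_ratio)                 # relative entropy D(P||Q)
abs_sum = np.sum(P * np.abs(log_ratio))   # sum of P(x) |log P(x)/Q(x)|

# Negative part: terms with f_x <= 1, i.e. P(x) <= Q(x); each Q_x(-f_x log f_x) <= Q_x / e.
neg_part = -np.sum((P * log_ratio)[P <= Q])

print(f"D(P||Q)       = {D:.6f}")
print(f"absolute sum  = {abs_sum:.6f}")
print(f"negative part = {neg_part:.6f}  (bound: 1/e = {1/np.e:.6f})")
print(f"identity check: abs_sum - D = {abs_sum - D:.6f}, 2 * negative part = {2*neg_part:.6f}")
```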
