Kullback-Leibler divergence implies AM-GM inequality

Tags: a.m.-g.m.-inequality, inequality, probability

How do I show that the arithmetic mean is greater than or equal to the geometric mean, given the non-negativity of the KL divergence?

That is, given $D(p\|q)=\sum_i p_i \log\left(\frac{p_i}{q_i}\right)\geq 0$, show that $\prod_{i=1}^n x_i^{a_i}\leq\sum_i a_ix_i$ for all $x_i>0$ and $a_i\geq 0$ such that $\sum_i a_i=1$.

I tried substituting $p_i=a_i$ and $q_i=x_i$, but I am only able to get the G.M. on one side, not the A.M. on the other.
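The naive substitution $q_i = x_i$ is not even admissible in general, because the KL inequality requires $\sum_i q_i = 1$ while the $x_i$ need not sum to $1$. A quick numerical check (with made-up values for `x` and `a`) shows the resulting sum can go negative:

```python
import numpy as np

x = np.array([2.0, 5.0])   # positive, but sum(x) != 1, so x is not a distribution
a = np.array([0.3, 0.7])   # valid weights: positive, sum to 1

# Naive substitution p = a, q = x: since sum(q) != 1,
# D(p||q) >= 0 is not guaranteed, and indeed the sum is negative here.
print(np.sum(a * np.log(a / x)))
```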

Best Answer

Problem: Assume that we know the following:

$\sum p_i\log \frac{p_i}{q_i} \ge 0$ for $p_i>0 , q_i>0, \forall i$ and $\sum p_i = 1, \sum q_i = 1 $.

Prove that $\prod x_i^{a_i} \le \sum a_i x_i$ for $x_i> 0, a_i > 0, \forall i$ and $\sum a_i = 1$.

Solution: Taking the logarithm of both sides, the inequality to be proved becomes $$\sum\nolimits_i a_i\log x_i \le \log \sum\nolimits_i a_i x_i = \Big(\sum\nolimits_i a_i\Big)\log \sum\nolimits_i a_i x_i = \sum\nolimits_i a_i\log \sum\nolimits_j a_j x_j, $$ or equivalently $$\sum\nolimits_i a_i \Big(\log \sum\nolimits_j a_j x_j - \log x_i\Big)\ge 0,$$ or $$\sum\nolimits_i a_i \Big(\log a_i - \log \frac{a_ix_i}{\sum\nolimits_j a_j x_j}\Big)\ge 0.\tag{1}$$ Let $$p_i = a_i, \quad q_i = \frac{a_ix_i}{\sum_j a_j x_j}, \quad i=1, 2, \cdots, n.$$ Then $p_i>0$, $q_i>0$, $\sum_i p_i = 1$, and $\sum_i q_i = \frac{\sum_i a_i x_i}{\sum_j a_j x_j} = 1$, so the assumed inequality applies and states exactly that (1) holds.

We are done.
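The substitution above can be checked numerically. A minimal sketch (random positive `x` and normalized weights `a` are made-up test data): it verifies that the chosen $q$ is a valid distribution, that $D(p\|q)\ge 0$, that the weighted AM-GM inequality holds, and the identity at the heart of the proof, $D(p\|q) = \log(\text{AM}) - \log(\text{GM})$.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 5
x = rng.uniform(0.1, 10.0, n)     # x_i > 0
a = rng.uniform(0.1, 1.0, n)
a /= a.sum()                       # weights a_i > 0 with sum 1

# The substitution from the answer: p_i = a_i, q_i = a_i x_i / sum_j a_j x_j
p = a
q = a * x / np.dot(a, x)

kl = np.sum(p * np.log(p / q))     # D(p||q)
am = np.dot(a, x)                  # weighted arithmetic mean
gm = np.prod(x ** a)               # weighted geometric mean

assert abs(q.sum() - 1.0) < 1e-12  # q is a valid distribution
assert kl >= -1e-12                # KL non-negativity
assert gm <= am + 1e-12            # weighted AM-GM
# D(p||q) collapses to log(AM) - log(GM), so KL >= 0 is exactly AM >= GM
assert abs(kl - (np.log(am) - np.log(gm))) < 1e-9
```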
