[Math] Any significance to the logarithm of a sum

logarithms real-analysis

Many years ago, while working as a computer programmer, I tracked down a subtle bug in the software that we were using. Management had despaired of finding the bug, but I pursued it in odd moments over a period of a few days, and finally found the problem: in computing the geometric mean, the program was taking the log of the sum instead of the sum of the logs. Thinking back on that, I always wonder whether there is any situation in which taking the log of a sum would be of interest. The only case I can think of is that it is often convenient to shift the logarithm to the left by 1 unit, which is done by adding 1 to the argument; that is, one often wishes to deal with log(1 + x) instead of log(x), so that one has the convenient situation of f(0) = 0. Let's call this the trivial scenario.
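The bug described above is easy to reproduce. Here is a sketch in Python (the function names are mine, not from the original program) contrasting the correct geometric mean with the buggy one:

```python
import math

def geometric_mean_correct(xs):
    # Geometric mean via logs: exp of the mean of the logs,
    # i.e. exp((log x_1 + ... + log x_n) / n).
    return math.exp(sum(math.log(x) for x in xs) / len(xs))

def geometric_mean_buggy(xs):
    # The bug: log of the sum instead of the sum of the logs.
    return math.exp(math.log(sum(xs)) / len(xs))

xs = [1.0, 2.0, 4.0, 8.0]
# Correct value is (1 * 2 * 4 * 8)^(1/4) = 64^(1/4) ≈ 2.828.
print(geometric_mean_correct(xs))
# The buggy version silently computes (1 + 2 + 4 + 8)^(1/4) instead.
print(geometric_mean_buggy(xs))
```

Because both versions return plausible positive numbers, the error is subtle: nothing crashes, and the result is only wrong quantitatively.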

So, can anyone think of any non-trivial scenario in which taking the log of a sum is the thing to do?

Best Answer

The logarithm is a concave function. This means that Jensen's inequality can be applied to it, giving the log sum inequality, a useful lemma in information theory.

Lemma (Log sum inequality) Let $a_1\dots a_n$ and $b_1\dots b_n$ be nonnegative reals. Then we have $$\sum_{i=1}^n a_i\log{\frac{a_i}{b_i}} \ge \left(\sum_{i=1}^n a_i\right)\log{\frac{\sum_{i=1}^n a_i}{\sum_{i=1}^n b_i}}$$

On the right-hand side we recognize two logarithms of sums, since $\log\frac{\sum_i a_i}{\sum_i b_i} = \log\sum_i a_i - \log\sum_i b_i$.

Remark By convention, $0\log{0} = 0\log{\frac{0}{0}}=0$ and $a\log{\frac{a}{0}}=\infty$ for $a>0$. All these are justified by continuity.
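A quick numerical sanity check of the lemma, including the conventions in the remark, can be sketched in Python (the function names are mine):

```python
import math

def log_sum_lhs(a, b):
    # Left-hand side: sum of a_i * log(a_i / b_i),
    # using the conventions 0 log 0 = 0 and a log(a/0) = inf for a > 0.
    total = 0.0
    for ai, bi in zip(a, b):
        if ai == 0:
            continue          # 0 log(0/b) = 0 by convention
        if bi == 0:
            return math.inf   # a log(a/0) = inf for a > 0
        total += ai * math.log(ai / bi)
    return total

def log_sum_rhs(a, b):
    # Right-hand side: (sum a_i) * log((sum a_i) / (sum b_i)),
    # with the same conventions applied to the aggregated sums.
    sa, sb = sum(a), sum(b)
    if sa == 0:
        return 0.0
    if sb == 0:
        return math.inf
    return sa * math.log(sa / sb)

a = [1.0, 2.0, 3.0]
b = [2.0, 1.0, 4.0]
print(log_sum_lhs(a, b) >= log_sum_rhs(a, b))  # the inequality holds
```

Checking a few random nonnegative vectors this way is a cheap way to catch a sign or index error when implementing the lemma.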
