Unlike Jensen's Inequality, can we upper bound $\log \sum_{i}{u_i \exp(x_i)}$?

analysis, calculus, convex-analysis

According to Jensen's inequality, it is not hard to derive a lower bound for $\log \sum_{i}{u_i \exp(x_i)}$ (where $u_i \geq 0$ and $\sum_i u_i = 1$) from the concavity of $\log(\cdot)$:

$\log \sum_{i}{u_i \exp(x_i)} \geq \sum_{i}{u_i x_i}$.
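As a sanity check, this is easy to verify numerically; here is a sketch in Python/NumPy (the weights are random numbers normalized to sum to one, which is an assumption of the setup above):

```python
import numpy as np

rng = np.random.default_rng(0)

# Check the Jensen lower bound on random instances.
for _ in range(10_000):
    n = int(rng.integers(2, 10))
    u = rng.random(n)
    u /= u.sum()                         # weights u_i >= 0, sum_i u_i = 1
    x = rng.normal(scale=3.0, size=n)
    lhs = np.log(np.sum(u * np.exp(x)))  # log sum_i u_i exp(x_i)
    rhs = np.sum(u * x)                  # sum_i u_i x_i
    assert lhs >= rhs - 1e-12
```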

However, my question is: can we somehow upper bound this quantity? The idea is that the set $\{x_i\}$ lies in the interval $[\min_i x_i, \max_i x_i]$, and the second derivative of $\log(\cdot)$ is bounded on the corresponding compact interval, so an upper bound should exist; the question is how simple such a bound can be.

I have tried some combinations in numerical tests and found that this bound might work:

$\log \sum_{i}{u_i \exp(x_i)} \leq \sum_{i}{u_i x_i}+\frac{1}{8}\Big(\max x_i\Big)^2$.
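Concretely, the test I ran looks roughly like this (a Python/NumPy sketch that draws random instances and counts violations of the conjectured bound):

```python
import numpy as np

rng = np.random.default_rng(1)

violations = 0
for _ in range(10_000):
    n = int(rng.integers(2, 10))
    u = rng.random(n)
    u /= u.sum()
    x = rng.normal(scale=2.0, size=n)
    lhs = np.log(np.sum(u * np.exp(x)))
    rhs = np.sum(u * x) + (np.max(x) ** 2) / 8.0  # conjectured upper bound
    if lhs > rhs + 1e-12:
        violations += 1
print("violations:", violations)
```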

But I do not have a way to prove it. Any suggestions? Thanks.

Best Answer

Fortunately, there do exist results in the mathematical literature that give an upper bound. After some digging on the internet, I found the following. Slavko Simić [1] shows that for a convex function $f$ on an interval $[a,b]$, with $x_i \in [a,b]$, $p_i \geq 0$, and $\sum_i p_i = 1$, the following bound holds:

$\sum\limits_i{p_i f(x_i)} \leq f\Big(\sum\limits_i{p_i x_i}\Big)+f(a)+f(b)-2f\Big(\frac{a+b}{2}\Big)$.
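To see the inequality in action, here is a quick numerical check (a Python/NumPy sketch, using the convex function $f(t)=e^t$ on a random interval $[a,b]$; the test function is my choice, not part of Simić's statement):

```python
import numpy as np

rng = np.random.default_rng(2)
f = np.exp  # a convex test function

for _ in range(10_000):
    a, b = np.sort(rng.uniform(-2.0, 2.0, size=2))
    n = int(rng.integers(2, 10))
    p = rng.random(n)
    p /= p.sum()
    x = rng.uniform(a, b, size=n)
    lhs = np.sum(p * f(x))
    rhs = f(np.sum(p * x)) + f(a) + f(b) - 2.0 * f((a + b) / 2.0)
    assert lhs <= rhs + 1e-12
```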

Since $\log(\cdot)$ is concave, we can apply this result to the convex function $-\log(\cdot)$, which for $x_i \in [a,b]$ with $0 < a \leq b$ gives

$\log\Big(\sum\limits_i{p_i x_i}\Big) \leq \sum\limits_i{p_i \log(x_i)}+2\log\Big(\frac{a+b}{2}\Big)-\log(a)-\log(b)$.
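The same check, adapted to this concave version (a sketch; note the $x_i$ must now be drawn positive so that $\log$ applies):

```python
import numpy as np

rng = np.random.default_rng(3)

for _ in range(10_000):
    a, b = np.sort(rng.uniform(0.1, 5.0, size=2))
    n = int(rng.integers(2, 10))
    p = rng.random(n)
    p /= p.sum()
    x = rng.uniform(a, b, size=n)
    lhs = np.log(np.sum(p * x))
    rhs = np.sum(p * np.log(x)) + 2.0 * np.log((a + b) / 2.0) - np.log(a) - np.log(b)
    assert lhs <= rhs + 1e-12
```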

Some other upper bounds have also been proposed. For example, in machine learning, Jebara et al. reversed Jensen's inequality for exponential families [2].

In fact, the guess above has the correct form. Applying Simić's result with $x_i$ replaced by $\exp(x_i)$, so that $a = e^{-\|x\|_{\infty}}$ and $b = e^{\|x\|_{\infty}}$, and then using $\log\cosh(t) \leq t^2/2$, it is not hard to show that

$\log\Big(\sum\limits_{i}{p_i \exp(x_i)}\Big) \leq \sum\limits_{i}{p_i x_i}+2\log\cosh\big(\|x\|_{\infty}\big) \leq \sum\limits_{i}{p_i x_i}+\|x\|_{\infty}^2$.
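And a final numerical check of the whole chain of inequalities (again a Python/NumPy sketch):

```python
import numpy as np

rng = np.random.default_rng(4)

for _ in range(10_000):
    n = int(rng.integers(2, 10))
    p = rng.random(n)
    p /= p.sum()
    x = rng.normal(scale=2.0, size=n)
    m = np.max(np.abs(x))                        # ||x||_inf
    lhs = np.log(np.sum(p * np.exp(x)))
    mid = np.sum(p * x) + 2.0 * np.log(np.cosh(m))
    rhs = np.sum(p * x) + m ** 2
    assert lhs <= mid + 1e-12 and mid <= rhs + 1e-12
```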
