Inequalities – Convexity of the Exponential of the Negative Renyi Entropy

Tags: convexity, entropy, inequalities

I would like to try my luck here with the following question, after failing to elicit an answer to it on math.stackexchange.com.


For $r\ge -1$, the exponential of the negative Renyi entropy is defined as
$$M(p):=\Big(\sum_i p_i^{1+r}\Big)^{\frac1r},$$
where $p:=(p_i)_i$ is a probability measure written as a tuple.
I would like to prove the convexity of $M(\cdot)$, i.e.,
$$M(ap+bq)\le aM(p)+bM(q)$$
for all $a,b\ge0$ with $a+b=1$ and any two probability measures $p$ and $q$ on the same finite index set.
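As a quick sanity check of the claimed inequality, take $r=-\tfrac12$ (so that $M(p)=\big(\sum_i\sqrt{p_i}\big)^{-2}$), $n=2$, $p=(1,0)$, $q=(0,1)$ and $a=b=\tfrac12$; then
$$M\big(\tfrac12p+\tfrac12q\big)=\Big(\tfrac1{\sqrt2}+\tfrac1{\sqrt2}\Big)^{-2}=\tfrac12\;\le\;1=\tfrac12M(p)+\tfrac12M(q).$$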

For $r>0$, I can show the convexity via the Minkowski inequality, which gives the convexity of $x\mapsto\big(\sum_i x_i^{1+r}\big)^{\frac1{1+r}}$, followed by the convexity and monotonicity of $f(x):=x^{1+\frac1r}$ on $[0,\infty)$.
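Spelled out, writing $N(x):=\big(\sum_i x_i^{1+r}\big)^{\frac1{1+r}}$ so that $M=f\circ N$, the two steps read
$$M(ap+bq)=f\big(N(ap+bq)\big)\le f\big(aN(p)+bN(q)\big)\le a\,f\big(N(p)\big)+b\,f\big(N(q)\big)=aM(p)+bM(q),$$
where the first inequality uses Minkowski together with the fact that $f$ is increasing for $r>0$, and the second uses the convexity of $f$ (its exponent $1+\tfrac1r$ exceeds $1$).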

But how would one show the convexity for $-1<r<0$? The above technique does not work there, since the inequality signs from the two steps point in opposite directions.

Best Answer

Take any $r\in(-1,0)$, any vector $(p_i)_{i=1}^n$ with $p_i>0$ for all $i$, and any vector $(h_i)_{i=1}^n\in\mathbb R^n$. For all real $t$ close enough to $0$, let $$g(t):=M(p+th).$$ Then $$g''(0)=(1+r)\Big(\sum_{i=1}^n p_i^{r-1}h_i^2\,\sum_{i=1}^n p_i^{1+r} +\frac{1-r^2}{r^2}\,\Big(\sum_{i=1}^n p_i^r h_i\Big)^2\Big) \Big(\sum_{i=1}^n p_i^{1+r}\Big)^{1/r-2},$$ which is manifestly $\ge0$.
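One way to arrive at this expression: write $S(t):=\sum_{i=1}^n(p_i+th_i)^{1+r}$, so that $g=S^{1/r}$ and
$$S'(t)=(1+r)\sum_{i=1}^n(p_i+th_i)^r h_i,\qquad S''(t)=r(1+r)\sum_{i=1}^n(p_i+th_i)^{r-1}h_i^2,$$
$$g''=\frac1r\Big(\frac1r-1\Big)S^{\frac1r-2}(S')^2+\frac1r S^{\frac1r-1}S''.$$
Evaluating at $t=0$ and factoring out $(1+r)\big(\sum_i p_i^{1+r}\big)^{1/r-2}$ gives the displayed formula; for $r\in(-1,0)$ both $1+r$ and $\tfrac{1-r^2}{r^2}$ are positive, so every term is indeed nonnegative.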

It follows that $M$ is indeed convex (actually, convex on the entire positive orthant of $\mathbb R^n$).
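For those who want a numerical sanity check, here is a small Python/NumPy sketch (the names and tolerances are illustrative, not part of the argument) that spot-checks both the convexity inequality for $r=-\tfrac12$ and the closed-form expression for $g''(0)$ against a finite-difference approximation:

```python
import numpy as np

rng = np.random.default_rng(0)
r = -0.5  # any r in (-1, 0)

def M(p, r):
    # exponential of the negative Renyi entropy: (sum_i p_i^(1+r))^(1/r)
    return np.sum(p ** (1.0 + r)) ** (1.0 / r)

# spot-check convexity: M(a p + b q) <= a M(p) + b M(q)
for _ in range(1000):
    n = rng.integers(2, 6)
    p = rng.random(n) + 1e-3; p /= p.sum()   # strictly positive probability vectors
    q = rng.random(n) + 1e-3; q /= q.sum()
    a = rng.random(); b = 1.0 - a
    assert M(a * p + b * q, r) <= a * M(p, r) + b * M(q, r) + 1e-12

# compare the closed form for g''(0) with a central finite difference
p = rng.random(4) + 1e-3; p /= p.sum()
h = rng.standard_normal(4)
S = np.sum(p ** (1 + r))
closed = (1 + r) * (np.sum(p ** (r - 1) * h ** 2) * S
                    + (1 - r ** 2) / r ** 2 * np.sum(p ** r * h) ** 2) * S ** (1 / r - 2)
eps = 1e-5
fd = (M(p + eps * h, r) - 2 * M(p, r) + M(p - eps * h, r)) / eps ** 2
print(closed, fd)  # the two values should agree to several digits
```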