Maximum value of Rényi entropy

Tags: entropy, information-theory, probability, renyi-entropy

Given a discrete random variable $X$ that takes values in the alphabet $\mathcal{X}$ and is distributed according to $p:\mathcal{X}\to [0,1]$, the Shannon entropy is defined as:
$$\mathrm {H} (X):=-\sum _{x\in {\mathcal {X}}}p(x)\log p(x)$$
As we know (see e.g. Prove the maximum value of entropy function), the maximum value of the Shannon entropy is $\log N$, where $N=\operatorname{card}(\mathcal X)$, attained by the uniform distribution.
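For concreteness, here is a minimal numerical sketch (Python with NumPy, using natural logarithms, so the maximum reads $\ln N$; the helper name `shannon_entropy` is just illustrative):

```python
import numpy as np

def shannon_entropy(p):
    """H(X) = -sum_x p(x) log p(x) in nats; terms with p(x) = 0 are dropped."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

N = 4
print(shannon_entropy(np.full(N, 1.0 / N)))   # ~1.3863, equal to log(4)
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))  # ~0.9404, strictly below log(4)
print(np.log(N))                              # 1.3863...
```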

The Rényi entropy of order $\alpha$, where $\alpha \geq 0$ and $\alpha \neq 1$, is defined as
$$\mathrm {H} _{\alpha }(X)={\frac {1}{1-\alpha }}\log {\Bigg (}\sum _{i=1}^{N}p_{i}^{\alpha }{\Bigg )}$$
where $p_i := p(x_i)$ for an enumeration $x_1,\dots,x_N$ of $\mathcal{X}$.
My question is: is it possible to find an upper bound also for the Rényi entropy?
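Numerically (a minimal sketch under the same conventions as above, with an illustrative helper `renyi_entropy`), the uniform distribution does seem to give the same value $\log N$ at every order $\alpha$:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """H_alpha(X) = log(sum_i p_i^alpha) / (1 - alpha), for alpha >= 0, alpha != 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # zero-probability symbols do not contribute
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

N = 4
uniform = np.full(N, 1.0 / N)
skewed = [0.7, 0.1, 0.1, 0.1]
for alpha in (0.5, 2.0, 5.0):
    print(alpha, renyi_entropy(uniform, alpha), renyi_entropy(skewed, alpha))
# for the uniform distribution the value is log(4) ~ 1.3863 at every order alpha
```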

Best Answer

From the Wikipedia page:

The Rényi entropy for any $\alpha \geq 0$ is Schur concave.

This implies that the maximum is still achieved by the uniform distribution, i.e., $$ H_\alpha(X) \leq \frac{1}{1-\alpha} \log \!\left(\sum_{i=1}^N \frac{1}{N^{\alpha}}\right) = \frac{1}{1-\alpha} \log\!\left(N^{1-\alpha}\right) = \log N $$ for any $\alpha \geq 0$ and any $X$ supported on a domain of size $N$.
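As a sanity check, one can sample random distributions and verify the bound numerically (a minimal Python sketch; the Dirichlet sampler is just one convenient way to draw distributions on $N$ symbols):

```python
import numpy as np

def renyi_entropy(p, alpha):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

rng = np.random.default_rng(0)
N, alpha = 8, 2.0
bound = np.log(N)

for _ in range(10_000):
    p = rng.dirichlet(np.ones(N))     # a random distribution on N symbols
    assert renyi_entropy(p, alpha) <= bound + 1e-12

print(renyi_entropy(np.full(N, 1.0 / N), alpha), bound)   # both equal log(8) ~ 2.0794
```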

Another way to obtain this upper bound (which one can then check is attained by the uniform distribution on the domain) is to use the fact that $H_\alpha(X)$ is a non-increasing function of $\alpha \geq 0$, so $H_\alpha(X) \leq H_0(X) = \log \lvert\{i : p_i > 0\}\rvert \leq \log N$.
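The monotonicity in $\alpha$ can likewise be checked numerically (again a minimal sketch; the test distribution below is arbitrary, and the $\alpha = 1$ case is handled via the Shannon-entropy limit):

```python
import numpy as np

def renyi_entropy(p, alpha):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):        # alpha -> 1 limit is the Shannon entropy
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

p = [0.5, 0.25, 0.15, 0.1]
alphas = [0.0, 0.5, 1.0, 2.0, 5.0, 50.0]
values = [renyi_entropy(p, a) for a in alphas]

assert all(x >= y - 1e-12 for x, y in zip(values, values[1:]))   # non-increasing in alpha
print(values[0], np.log(len(p)))   # H_0 = log(4) here, since all four probabilities are positive
```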