Is the Rényi entropy a continuous function with respect to the parameter $\alpha$?

entropy · information-theory · probability · renyi-entropy

The Rényi entropy of order $\alpha$, where $\alpha > 0$ and $\alpha \neq 1$, is defined as
$$
\mathrm{H}_\alpha(X)=\frac{1}{1-\alpha} \log \left(\sum_{i=1}^n p_i^\alpha\right)
$$

Here, $X$ is a discrete random variable with possible outcomes in the set $\mathcal{A}=\left\{x_1, x_2, \ldots, x_n\right\}$ and corresponding probabilities $p_i \doteq \operatorname{Pr}\left(X=x_i\right)$. I would like to know whether $\mathrm{H}_\alpha(X)$, considered as a function of $\alpha$, is continuous.

I think that if $p_i \neq 0$ for all $i$, then $\mathrm{H}_\alpha(X)$ is a continuous function of $\alpha$: the sum and the logarithm in the entropy formula are continuous, and a composition of continuous functions is continuous. Is this still true if $p_i = 0$ for some $i$?
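For concreteness, here is a minimal numerical sketch in Python (the distribution and the grid of $\alpha$ values are arbitrary illustrative choices of mine) suggesting that $\mathrm{H}_\alpha$ varies smoothly even when one $p_i = 0$:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (alpha > 0, alpha != 1), in nats.
    Terms with p_i = 0 are dropped, since 0**alpha = 0 for alpha > 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # zero-probability outcomes contribute nothing to the sum
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

# A distribution with a zero-probability outcome.
p = [0.5, 0.3, 0.2, 0.0]

# H_alpha over a grid of alpha values (avoiding alpha = 1).
for alpha in [0.25, 0.5, 0.9, 0.99, 1.01, 1.1, 2.0, 4.0]:
    print(f"alpha = {alpha:5.2f}  H_alpha = {renyi_entropy(p, alpha):.6f}")
```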

Best Answer

Yes, it is continuous in the case $p_i \neq 0$, as you pointed out.

In information theory, including for the Shannon entropy, quantities such as
$$
p \log(1/p)
$$
are continuous on $p \in (0,\infty)$ and by convention are taken to obey
$$
p \log(1/p)\rightarrow 0, \quad \textrm{as}~p\rightarrow 0.
$$
For the Rényi entropy the situation with zero probabilities is even simpler: since $\alpha > 0$, any outcome with $p_i = 0$ contributes $p_i^\alpha = 0$ to the sum and simply drops out, so $p_i = 0$ causes no discontinuity in $\alpha$.

The Rényi entropy $\mathrm{H}_\alpha(X)$ is defined and continuous for $\alpha \in [0,1)\cup (1,\infty)$. The problematic point is therefore not $p_i=0$ but the value $\alpha=1$. Rewriting $\mathrm{H}_\alpha(X)$ with the constant $\alpha/(1-\alpha)$ in front makes this clear, since the quantity inside the logarithm becomes the $\alpha$-norm of the probability distribution:
$$
\mathrm{H}_\alpha(X) = \frac{\alpha}{1-\alpha} \log \left(\sum_{i=1}^n p_i^\alpha\right)^{1/\alpha},
$$
which is finite for $\alpha$ restricted to the domain above.
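As a sketch of this rewriting (the helper names here are my own, not standard), the following evaluates both forms and checks that they agree:

```python
import numpy as np

def renyi_from_definition(p, alpha):
    # H_alpha = log(sum p_i^alpha) / (1 - alpha)
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def renyi_from_alpha_norm(p, alpha):
    # H_alpha = (alpha / (1 - alpha)) * log( (sum p_i^alpha)^(1/alpha) )
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    alpha_norm = np.sum(p ** alpha) ** (1.0 / alpha)
    return (alpha / (1.0 - alpha)) * np.log(alpha_norm)

p = [0.5, 0.3, 0.2]
for alpha in [0.5, 2.0, 10.0]:
    a = renyi_from_definition(p, alpha)
    b = renyi_from_alpha_norm(p, alpha)
    assert np.isclose(a, b), (a, b)
    print(f"alpha = {alpha:4.1f}  H_alpha = {a:.6f}")
```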

Also, the Rényi entropy actually converges to the Shannon entropy as $\alpha\rightarrow 1$. The limit is a $0/0$ form, and L'Hôpital's rule gives
$$
\lim_{\alpha \rightarrow 1} \frac{\log \left(\sum_i p_i^\alpha\right)}{1-\alpha} = \lim_{\alpha \rightarrow 1} \frac{\sum_i p_i^\alpha \log p_i}{-\sum_i p_i^\alpha} = -\sum_i p_i \log p_i = \mathrm{H}(X),
$$
so setting $\mathrm{H}_1(X) \doteq \mathrm{H}(X)$ extends $\mathrm{H}_\alpha(X)$ to a function of $\alpha$ that is continuous on all of $[0,\infty)$.
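A minimal numerical check of this convergence (again with an arbitrary example distribution, including a zero-probability outcome):

```python
import numpy as np

def renyi_entropy(p, alpha):
    # Rényi entropy of order alpha (alpha > 0, alpha != 1), in nats.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def shannon_entropy(p):
    # H(X) = -sum p_i log p_i, with the convention 0 log 0 = 0.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

p = [0.5, 0.3, 0.2, 0.0]
print(f"Shannon entropy:   H = {shannon_entropy(p):.8f}")
for alpha in [1.1, 1.01, 1.001, 1.0001]:
    print(f"alpha = {alpha:7.4f}  H_alpha = {renyi_entropy(p, alpha):.8f}")
```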