Prove that for a random variable, Rényi entropy for $\alpha = \infty$ converges to min-entropy

entropy, logarithms, probability

According to Wikipedia, the Rényi entropy of a random variable $X$ is defined (for $\alpha \ge 0$, $\alpha \ne 1$) as:

$$
H_\alpha(X) = \frac{1}{1 - \alpha} \log\left(\sum_{i=1}^n p_i^\alpha\right).
$$

Shannon entropy is the special case of this entropy obtained in the limit $\alpha \to 1$; a proof of this is present in this math.stackexchange link. Now I am interested in another special case of Rényi entropy, the limit $\alpha \to \infty$. Wikipedia states that, in this case, the Rényi entropy converges to the min-entropy $H_\infty(X)$, which is:

$$
H_\infty(X) = -\log \underset{i}{\max} p_i .
$$

Now, here is my question. How do I prove that the following result holds?
$$
\lim_{\alpha \to \infty} H_\alpha(X) = \lim_{\alpha \to \infty} \frac{1}{1 - \alpha} \log\left(\sum_{i=1}^n p_i^\alpha\right) = -\log \underset{i}{\max} p_i = H_\infty(X)
$$
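Before any proof, here is a minimal numerical sanity check (not a proof) of the claim, assuming NumPy; the distribution $p = (0.5, 0.3, 0.2)$ is just an illustrative choice:

```python
import numpy as np

# Illustrative (hypothetical) distribution; any finite pmf works.
p = np.array([0.5, 0.3, 0.2])

def renyi_entropy(p, alpha):
    # H_alpha(X) = 1/(1 - alpha) * log(sum_i p_i^alpha), for alpha != 1
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

min_entropy = -np.log(p.max())  # H_inf(X) = -log max_i p_i

for alpha in [2, 5, 10, 50, 100, 500]:
    print(f"alpha={alpha:4d}  H_alpha={renyi_entropy(p, alpha):.6f}  "
          f"H_inf={min_entropy:.6f}")
```

The printed values of $H_\alpha$ decrease towards $-\log 0.5 \approx 0.693$ as $\alpha$ grows, which is consistent with the stated limit.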

Thanks in advance.

Best Answer

This is a sketch of an argument; I will leave a formal proof to a more capable contributor (though a squeeze-bound version of the same idea is sketched after the list).

  1. For $\alpha \gg 1$, you have that

$$ \frac{1}{1 - \alpha} \approx -\frac{1}{\alpha} \tag{1} $$

  2. Since $0 \le p_i \le 1$, if $p_i < p_j$ then $(p_i/p_j)^\alpha \to 0$ as $\alpha \to \infty$. That means that, of all the $p_i$'s, only the largest one ($\max_i p_i$) survives inside the sum; all other terms become negligible relative to it as $\alpha$ increases. That is, for $\alpha \gg 1$,

$$ \sum_i p_i^\alpha \approx \left(\max_i p_i\right)^\alpha \tag{2} $$

  3. Now, putting together (1) and (2), for $\alpha \gg 1$ you get

$$ H_\alpha(X) \approx -\frac{1}{\alpha} \log \left(\max_i p_i\right)^\alpha = -\frac{\alpha}{\alpha} \log \max_i p_i = -\log \max_i p_i . $$
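For completeness, here is a sketch of how the same idea can be made rigorous with a squeeze bound, assuming the distribution has $n$ outcomes and $\alpha > 1$. Every term of the sum is at most $\left(\max_i p_i\right)^\alpha$, and at least one term equals it, so

$$
\left(\max_i p_i\right)^\alpha \;\le\; \sum_{i=1}^n p_i^\alpha \;\le\; n \left(\max_i p_i\right)^\alpha .
$$

Taking logarithms and dividing by $1 - \alpha < 0$ (which reverses the inequalities) gives

$$
\frac{\alpha \log \max_i p_i + \log n}{1 - \alpha} \;\le\; H_\alpha(X) \;\le\; \frac{\alpha \log \max_i p_i}{1 - \alpha} .
$$

Since $\frac{\alpha}{1 - \alpha} \to -1$ and $\frac{\log n}{1 - \alpha} \to 0$ as $\alpha \to \infty$, both bounds converge to $-\log \max_i p_i$, and therefore $H_\alpha(X) \to H_\infty(X)$ by the squeeze theorem.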
