According to Wikipedia, the Rényi entropy of a random variable $X$ is defined as (for $\alpha \ge 0$, $\alpha \ne 1$):
$$
H_\alpha(X) = \frac{1}{1 - \alpha} \log\left(\sum_{i=1}^n p_i^\alpha\right).
$$
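For concreteness, here is a minimal numerical sketch of this formula (the helper name `renyi_entropy` and the example distribution are my own illustrative choices, not from the question):

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy H_alpha of a discrete distribution p, for alpha >= 0, alpha != 1."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

p = [0.5, 0.3, 0.2]           # a biased three-outcome distribution
print(renyi_entropy(p, 2))    # collision entropy H_2 = -log(0.38) ~ 0.968
print(renyi_entropy(p, 0.5))  # H_{1/2}
```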
Shannon entropy is a special case of this entropy in the limit $\alpha \to 1$; a proof is given in this math.stackexchange link. Now, I am interested in another special case of Rényi entropy, the limit $\alpha \to \infty$. Wikipedia states that, in this case, the Rényi entropy converges to the min-entropy $H_\infty(X)$, which is:
$$
H_\infty(X) = -\log \max_i p_i.
$$
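As a quick sanity check of this convergence (again a sketch, with a distribution of my own choosing), one can watch $H_\alpha(X)$ approach $-\log \max_i p_i$ as $\alpha$ grows:

```python
import numpy as np

p = np.array([0.5, 0.3, 0.2])
h_min = -np.log(p.max())     # min-entropy H_inf ~ 0.693
for alpha in [2, 5, 10, 50, 100]:
    h_alpha = np.log(np.sum(p ** alpha)) / (1.0 - alpha)
    print(alpha, h_alpha)    # tends to h_min as alpha grows
```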
Now, here is my question. How do I prove that the following result holds?
$$
\lim_{\alpha \to \infty} H_\alpha(X) = \lim_{\alpha \to \infty} \frac{1}{1 - \alpha} \log\left(\sum_{i=1}^n p_i^\alpha\right) = -\log \max_i p_i = H_\infty(X)
$$
Thanks in advance.
Best Answer
This is a sketch of an argument; I will leave a formal proof to a more capable contributor. For large $\alpha$,
$$ \frac{1}{1 - \alpha} \approx -\frac{1}{\alpha} \tag{1} $$
$$ \sum_i p_i^\alpha \approx \max_i p_i^\alpha \tag{2} $$
Combining (1) and (2),
$$ H_\alpha(X) \approx -\frac{1}{\alpha} \log \left(\max_i p_i\right)^\alpha = -\log \left(\max_i p_i\right)^{\alpha/\alpha} = -\log \max_i p_i. $$
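One standard way to make step (2) rigorous is the squeeze bound $p_{\max}^\alpha \le \sum_{i=1}^n p_i^\alpha \le n\, p_{\max}^\alpha$, where $p_{\max} = \max_i p_i$. Taking logarithms and multiplying by $\frac{1}{1-\alpha} < 0$ for $\alpha > 1$ (which reverses the inequalities) gives
$$ \frac{\alpha}{1-\alpha} \log p_{\max} + \frac{\log n}{1-\alpha} \;\le\; H_\alpha(X) \;\le\; \frac{\alpha}{1-\alpha} \log p_{\max}, $$
and since $\frac{\alpha}{1-\alpha} \to -1$ and $\frac{\log n}{1-\alpha} \to 0$ as $\alpha \to \infty$, both bounds converge to $-\log p_{\max}$. The squeeze theorem then yields $\lim_{\alpha \to \infty} H_\alpha(X) = -\log \max_i p_i$.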