Does every probability distribution have a well-defined entropy?

convergence-divergence, entropy, statistics

There are a lot of probability distributions with infinite (or undefined) moments; for example, the Cauchy distribution does not even have a finite first moment. As a result, we often cannot compute the second moments used in measures of dispersion or uncertainty (variance and standard deviation). In some applications, for example in finance, we could replace such measures with entropy. However, I would first like to know whether every probability distribution has a well-defined entropy (in particular, Shannon entropy).
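As a concrete illustration of this motivation (my own sketch in Python with SciPy, not part of the original question): the standard Cauchy distribution has a finite differential entropy, with the known closed form $\log(4\pi)$, even though its variance does not exist.

```python
import numpy as np
from scipy import stats

# The standard Cauchy has no finite mean or variance, yet its
# differential entropy is finite; the known closed form is log(4*pi).
cauchy = stats.cauchy()
print(cauchy.entropy())    # ~2.5310 nats
print(np.log(4 * np.pi))   # closed form, ~2.5310 nats

# By contrast, the sample variance never stabilizes, reflecting the
# nonexistent second moment (values are typically huge and erratic).
samples = cauchy.rvs(size=1_000_000, random_state=0)
print(samples.var())
```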

In the case of a discrete distribution with a finite number of possible outcomes, the entropy is
$$
H = -\sum_{i=1}^{n} p_i \log p_i,
$$

where $p_i$ is the probability of the $i$-th outcome. Since each $p_i > 0$ and the number of terms in the sum is finite, the sum is well defined and finite.
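(For concreteness, here is a minimal Python sketch, not part of the original question, that evaluates this finite sum directly.)

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H = -sum_i p_i log p_i (in nats) of a finite distribution."""
    p = np.asarray(p, dtype=float)
    assert np.all(p > 0) and np.isclose(p.sum(), 1.0)
    return -np.sum(p * np.log(p))

print(shannon_entropy([0.5, 0.25, 0.25]))  # ~1.0397 nats
print(shannon_entropy([0.25] * 4))         # uniform case: log(4) ~ 1.3863 nats
```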

But I get stuck in the case $n \rightarrow +\infty$. Here I would need to prove that the sum converges under the condition $\sum_{i=1}^{+\infty} p_i = 1$.

Similarly, in the case of a continuous distribution, I would need to prove that the integral
$$
H = -\int_{\mathbb{R}} f(x) \log f(x) \mathrm{d}x
$$

exists and is finite for any real density $f$ satisfying $f(x) > 0 \;\; \forall x \in \mathbb{R}$ and $\int_\mathbb{R} f(x)\,\mathrm{d}x = 1$.
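(As a sanity check of this definition, here is a small Python sketch, again my own addition, that evaluates the integral numerically for the standard normal density, whose differential entropy has the known closed form $\tfrac{1}{2}\log(2\pi e)$.)

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Differential entropy H = -integral of f(x) log f(x) dx, evaluated
# numerically for the standard normal density.
f = norm(loc=0.0, scale=1.0).pdf
h_numeric, _ = quad(lambda x: -f(x) * np.log(f(x)), -np.inf, np.inf)

print(h_numeric)                       # ~1.4189 nats
print(0.5 * np.log(2 * np.pi * np.e))  # closed form 0.5*log(2*pi*e), ~1.4189
```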

Is it possible to prove these statements?

Best Answer

It is possible to have infinite Shannon entropy when dealing with countably many outcomes. For example, take
$$
p_n = \frac{c}{n (\log n)^{1+\epsilon}}, \quad n \geq 3,
$$
where the constant $c = 1\big/\sum_{n\geq 3}\frac{1}{n(\log n)^{1+\epsilon}}$ is chosen so that $\sum_{n=3}^\infty p_n = 1$. The sum $\sum \frac{1}{n(\log n)^{1+\epsilon}}$ converges only for $\epsilon > 0$, and
$$
-p_n\log p_n=\frac{c}{n(\log n)^{1+\epsilon}}\bigl(-\log c+\log n+(1+\epsilon)\log\log n\bigr) \sim \frac{c}{n(\log n)^\epsilon},
$$
so for $0<\epsilon\leq 1$ the distribution is well defined but the sum $H=-\sum_n p_n\log p_n$ diverges.
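A quick numerical illustration of this counterexample (my own addition, in Python with NumPy; approximating the normalizing constant by truncating the series): for $\epsilon = 1$ the partial sums of $-p_n \log p_n$ keep growing, roughly like $\log\log N$, which is consistent with divergence but makes it extremely slow.

```python
import numpy as np

# Counterexample with epsilon = 1:  p_n = c / (n (log n)^2),  n >= 3.
# The partial sums of -p_n log p_n grow roughly like log(log N),
# i.e. the entropy diverges, but extremely slowly.
N_max = 10**6
n = np.arange(3, N_max + 1, dtype=np.float64)
w = 1.0 / (n * np.log(n) ** 2)

# Approximate normalizing constant: truncated sum plus an integral
# estimate of the tail, sum_{n > N_max} w_n ~ 1 / log(N_max).
c = 1.0 / (w.sum() + 1.0 / np.log(N_max))
p = c * w

H_partial = np.cumsum(-p * np.log(p))
for N in (10**3, 10**4, 10**5, 10**6):
    print(N, H_partial[N - 3])  # partial sum over n = 3..N; keeps increasing
```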