Solved – Entropy of generalized distributions


What's the entropy of the following generalized probability distributions?

$P_1(x) = \delta(x)$

$P_2(x,y) = \delta(x+y)$, for $0\le x\le 1$, and $P_2(x,y)=0$ otherwise.

Integrals of the type $-\int \delta(x) \ln\delta(x) \mathrm{d}x$ seem to diverge to $-\infty$. But entropy is supposed to be positive. What's going on here? How can I compute the entropy of these distributions? Is there a way to define entropy for these distributions?

Best Answer

The usual Shannon entropy, defined on a discrete set of probabilities, cannot be negative, as it is an average of non-negative numbers, i.e.

$$\sum_i p_i \log\left(\tfrac{1}{p_i}\right).$$
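
A minimal numerical sketch of the discrete formula (Python with NumPy; the helper name `shannon_entropy` is mine, chosen for illustration), showing that each term, and hence the sum, is non-negative:

```python
import numpy as np

def shannon_entropy(p):
    """Discrete Shannon entropy sum_i p_i * log(1/p_i) in nats; zero-probability terms contribute 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(np.sum(p * np.log(1.0 / p)))

# Each term p_i * log(1/p_i) is >= 0 because 0 < p_i <= 1.
print(shannon_entropy([0.5, 0.5]))         # log 2 ~ 0.693
print(shannon_entropy([1.0]))              # 0.0, a deterministic outcome
print(shannon_entropy([0.25, 0.25, 0.5]))  # ~ 1.04
```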

Differential entropy need not be positive. It is

$$\int p(x) \log\left(\tfrac{1}{p(x)}\right) dx,$$

which does not need to be positive: $p(x)$ is a probability density, so it can be greater than one, making $\log\left(\tfrac{1}{p(x)}\right)$ negative. In fact, differential entropy can be viewed as the limit of Shannon entropy over boxes of size $\epsilon$, taken as $\epsilon \to 0$, after subtracting $\log(1/\epsilon)$ (the contribution of the box size); without that subtraction the limit diverges:

$$ \lim_{\epsilon\to 0} \sum_i p_{[i\epsilon, (i+1)\epsilon]} \log\left(\tfrac{1}{p_{[i\epsilon, (i+1)\epsilon]}}\right) $$ $$ \approx \lim_{\epsilon\to 0} \sum_{i} p(i \epsilon)\epsilon \log\left(\tfrac{1}{p(i \epsilon)\epsilon}\right) $$ $$ = \lim_{\epsilon\to 0} \left(\sum_{i} p(i \epsilon)\epsilon \log\left(\tfrac{1}{p(i \epsilon)}\right) + \log(1/\epsilon) \right) $$ $$ = \int p(x) \log\left(\tfrac{1}{p(x)}\right) dx + \lim_{\epsilon\to 0}\log(1/\epsilon). $$
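
To illustrate this limit numerically (a sketch of my own, not part of the original answer; the helper `discretized_entropy` and the choice of a uniform density on $[0, 1/2]$ are assumptions for the example), binning a density into boxes of width $\epsilon$ and computing the discrete Shannon entropy gives approximately $\log(1/\epsilon)$ plus the differential entropy, which here is $\log(1/2)\approx -0.69$:

```python
import numpy as np

def discretized_entropy(pdf, a, b, eps):
    """Shannon entropy of the distribution binned into boxes of width eps on [a, b]."""
    n = int(round((b - a) / eps))
    edges = np.linspace(a, b, n + 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    probs = pdf(centers) * eps           # p_[i eps, (i+1) eps] ~ p(i eps) * eps
    probs = probs[probs > 0]
    probs = probs / probs.sum()          # renormalize the Riemann-sum approximation
    return float(np.sum(probs * np.log(1.0 / probs)))

# Uniform density on [0, 1/2]: p(x) = 2, differential entropy = log(1/2) ~ -0.693.
uniform_half = lambda x: np.where((x >= 0) & (x <= 0.5), 2.0, 0.0)

for eps in [0.1, 0.01, 0.001]:
    H = discretized_entropy(uniform_half, 0.0, 0.5, eps)
    # H grows like log(1/eps); subtracting that term recovers the differential entropy.
    print(eps, H, H - np.log(1.0 / eps))
```

The last column stabilizes near $-0.69$, matching $\int p(x)\log\left(\tfrac{1}{p(x)}\right) dx$ for this density.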

For the Dirac delta the differential entropy is $-\infty$, so you are right.
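
One concrete way to see the $-\infty$ (again my own sketch, not from the answer): approximate $\delta(x)$ by a zero-mean Gaussian of standard deviation $\sigma$, whose differential entropy is $\tfrac{1}{2}\log(2\pi e\sigma^2)$, and let $\sigma\to 0$:

```python
import numpy as np

# Differential entropy of a Gaussian with standard deviation sigma:
# 0.5 * log(2 * pi * e * sigma^2), which drops without bound as sigma -> 0,
# i.e. as the Gaussian approaches delta(x).
for sigma in [1.0, 1e-2, 1e-4, 1e-8]:
    print(sigma, 0.5 * np.log(2 * np.pi * np.e * sigma**2))
```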
