Entropy of a measure

Tags: gr.group-theory, it.information-theory, measure-theory, pr.probability

Let $\mu$ be a probability measure on a set of $n$ elements and let $p_i$ be the measure of the $i$-th element. Its Shannon entropy is defined by

$$
E(\mu)=-\sum_{i=1}^np_i\log(p_i)
$$

with the usual convention that $0\cdot(-\infty)=0$.

The following are two fundamental properties:

Property 1: $E(\mu)$ attains its minimum, namely $0$, exactly at the Dirac measures.

Property 2: $E(\mu)$ attains its maximum, namely $\log n$, at the uniform probability measure.
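
To make the two properties concrete, here is a minimal numerical sketch (the helper name `shannon_entropy` is just illustrative, not from any library):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy of a finite probability vector, with 0 * log(0) = 0."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]                     # drop zero atoms: they contribute 0
    return -np.sum(nz * np.log(nz))

n = 4
dirac = np.eye(n)[0]                  # Dirac measure at the first element
uniform = np.full(n, 1.0 / n)         # uniform probability measure

print(shannon_entropy(dirac))         # 0.0            (Property 1: the minimum)
print(shannon_entropy(uniform))       # log(4) ~ 1.386 (Property 2: the maximum)

# Any other probability vector lands strictly in between:
p = np.array([0.7, 0.2, 0.05, 0.05])
assert 0 < shannon_entropy(p) < np.log(n)
```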

Now, for an application, I am interested in a possible generalization to the case where $\mu$ is a finitely additive probability measure on the natural numbers.

Question: Is it possible to define a notion of entropy of a finitely additive probability measure on the natural numbers in such a way that it satisfies the following properties:

  • it takes its minimum on the Dirac measures
  • it takes its maximum on the finitely additive translation invariant probability measures

Any references? Ideas?

Thanks in advance,

Valerio

Best Answer

I don't know whether such an entropy has already been defined, and I am not an expert on these things, but (depending on what is wanted) I would start with something like $$E(\mu)=\sup\left\{-\sum_{i=1}^n\mu(A_i)\log(\mu(A_i)) : n\in\mathbb{N},\ \mathbb{N} = \bigcup_{i=1}^n A_i,\ A_i \text{ pairwise disjoint}\right\}.$$

Some properties this entropy would have are

  1. It equals the Shannon entropy for measures that are concentrated on finitely many numbers (checked numerically in the sketch below).
  2. It gives the value $+\infty$ for finitely additive translation-invariant probability measures: such a measure gives each of the $n$ residue classes modulo $n$ the same mass $1/n$ (singletons must be null, and translation invariance forces the classes to have equal measure), so the partition into these classes already has entropy $\log n$ for every $n$.
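
As a sanity check of point 1, here is a small brute-force sketch for a measure concentrated on four numbers (helper names are mine): it enumerates all partitions of the support, and the supremum is attained at the partition into singletons, recovering the Shannon entropy. Merging two blocks of masses $p$ and $q$ replaces $-p\log p - q\log q$ by $-(p+q)\log(p+q)$, which is never larger, so coarser partitions cannot do better.

```python
import numpy as np

def partitions(elems):
    """Generate all set partitions of a list of elements."""
    if not elems:
        yield []
        return
    first, rest = elems[0], elems[1:]
    for part in partitions(rest):
        for i in range(len(part)):            # put `first` into an existing block
            yield part[:i] + [[first] + part[i]] + part[i + 1:]
        yield [[first]] + part                # or into a new block of its own

def partition_entropy(p, part):
    """Entropy of the partition whose blocks have masses sum(p[i], i in block)."""
    masses = [sum(p[i] for i in block) for block in part]
    return -sum(m * np.log(m) for m in masses if m > 0)

# A measure concentrated on four numbers; the null rest of N can be
# absorbed into any block without changing the entropy.
p = [0.4, 0.3, 0.2, 0.1]
best = max(partition_entropy(p, part) for part in partitions(list(range(len(p)))))
shannon = -sum(q * np.log(q) for q in p)
print(best, shannon)   # both ~ 1.2799: the sup is attained at the singletons
```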

A related definition for arbitrary measure spaces $(X,m)$ is the relative Shannon entropy (a.k.a. the Kullback–Leibler divergence) $$E(\mu) = \int_X \frac{d\mu}{dm} \log\left(\frac{d\mu}{dm}\right)dm,$$ where $\frac{d\mu}{dm}$ is the Radon–Nikodym derivative of $\mu$ w.r.t. $m$. One could fix a finitely additive measure $m$ and try to work with that.
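
In the discrete case, where $\mu\ll m$ on a finite set, the integral reduces to $\sum_i p_i\log(p_i/m_i)$; here is a minimal sketch under that assumption:

```python
import numpy as np

def relative_entropy(p, m):
    """Kullback-Leibler divergence sum_i p_i * log(p_i / m_i), assuming
    mu << m (wherever m_i = 0 we must have p_i = 0) and 0 * log(0) = 0."""
    p, m = np.asarray(p, dtype=float), np.asarray(m, dtype=float)
    nz = p > 0
    return np.sum(p[nz] * np.log(p[nz] / m[nz]))

p = np.array([0.5, 0.25, 0.25])
m = np.array([1.0, 1.0, 1.0]) / 3     # reference measure (uniform)
print(relative_entropy(p, m))         # ~ 0.0589
print(relative_entropy(m, m))         # 0.0: the divergence vanishes at mu = m
```

Note the sign convention: by Gibbs' inequality this quantity is $\ge 0$ and vanishes exactly at $\mu = m$, so "maximal at the invariant measure" would correspond to minimizing it (or maximizing its negative).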
