[Math] Entropy of convolution of measures

Tags: convolution, entropy, ergodic-theory, measure-theory, probability-theory

Let $G$ be a countable, discrete group, and let $\mu_1,\mu_2$ be probability measures on the group $G$. We define the entropy of $\mu_i$ as

$H(\mu_i)=\sum\limits_{g \in G}-\mu_i(g)\log(\mu_i(g))$ (with the convention that $0 \cdot \log0=0$).

Recall that the convolution of $\mu_1$ and $\mu_2$ is defined as:

$(\mu_1 \star \mu_2)(g)=\sum\limits_{h \in G}\mu_1(gh^{-1})\mu_2(h)$.
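For concreteness, here is a minimal numerical sketch of these two definitions (not part of the original question), assuming Python, with measures stored as dictionaries from group elements to probabilities and $G=\mathbb{Z}/5\mathbb{Z}$ written additively; the function names `entropy` and `convolve` are chosen purely for illustration.

```python
import math

def entropy(mu):
    """H(mu) = -sum_g mu(g) log mu(g), with the convention 0*log 0 = 0."""
    return -sum(p * math.log(p) for p in mu.values() if p > 0)

def convolve(mu1, mu2, op):
    """(mu1 * mu2)(g) = sum_h mu1(g h^{-1}) mu2(h); equivalently, push the
    product measure mu1 x mu2 forward under (a, h) -> a*h."""
    out = {}
    for a, p1 in mu1.items():        # a plays the role of g h^{-1}
        for h, p2 in mu2.items():
            g = op(a, h)
            out[g] = out.get(g, 0.0) + p1 * p2
    return out

# Example: G = Z/5Z, written additively.
n = 5
op = lambda a, b: (a + b) % n
mu1 = {0: 0.5, 1: 0.5}
mu2 = {0: 0.25, 2: 0.75}
conv = convolve(mu1, mu2, op)
print(entropy(conv), entropy(mu1) + entropy(mu2))  # H(mu1*mu2) <= H(mu1)+H(mu2)
```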

Now, it is a fact that $H(\mu_1 \star \mu_2) \leq H(\mu_1)+H(\mu_2)$. The usual proof proceeds by showing that entropy cannot increase under factor maps.

My question is: Is there a completely hands-on proof of this fact, preferably one using the concavity of $\log x$ or the convexity of $-x\log x$?

Best Answer

With no convexity... Write the group operation additively; in the multiplicative notation of the question, replace $x-y$ by $y^{-1}x$ throughout. If $P$ and $Q$ are probability measures and $R=P\ast Q$, then $$H(R)=-\sum_xR(x)\log R(x)=-\sum_{x,y}P(y)Q(x-y)\log R(x).$$ For every $(x,y)$, $R(x)\geqslant P(y)Q(x-y)$, since $R(x)$ is a sum of nonnegative terms one of which is $P(y)Q(x-y)$; hence $$H(R)\leqslant-\sum_{x,y}P(y)Q(x-y)\log(P(y)Q(x-y))=-\sum_{x,y}P(y)Q(x)\log(P(y)Q(x)),$$ where the last identity holds because, for every $y$, the mapping $x\mapsto x-y$ is a bijection from $G$ to $G$. Expanding $\log(P(y)Q(x))=\log P(y)+\log Q(x)$ gives $$H(R)\leqslant-\sum_{y}P(y)\log(P(y))\left(\sum_xQ(x)\right)-\sum_{x}Q(x)\log(Q(x))\left(\sum_yP(y)\right).$$ Finally, $P$ and $Q$ are probability measures, hence the two sums in parentheses are equal to $1$ and we are left with the inequality $$H(R)\leqslant-\sum_{y}P(y)\log(P(y))-\sum_{x}Q(x)\log(Q(x))=H(P)+H(Q).$$
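As a sanity check (not part of the original answer), here is a short Python sketch, under illustrative assumptions, that verifies both the pointwise bound $R(g)\geqslant P(gh^{-1})Q(h)$ used above and the resulting inequality $H(R)\leq H(P)+H(Q)$ on a small nonabelian group, $S_3$ realized as permutation tuples, with randomly generated measures; the names `random_measure`, `entropy`, and `convolve` are hypothetical helpers.

```python
import math, random
from itertools import permutations

# G = S_3, elements as tuples; composition (a*b)(i) = a[b[i]].
G = list(permutations(range(3)))
op = lambda a, b: tuple(a[b[i]] for i in range(3))
inv = lambda a: tuple(sorted(range(3), key=lambda i: a[i]))  # inverse permutation

def random_measure():
    w = [random.random() for _ in G]
    s = sum(w)
    return {g: x / s for g, x in zip(G, w)}

def entropy(mu):
    return -sum(p * math.log(p) for p in mu.values() if p > 0)

def convolve(p, q):
    r = {g: 0.0 for g in G}
    for g in G:
        for h in G:
            r[g] += p[op(g, inv(h))] * q[h]  # (p*q)(g) = sum_h p(g h^{-1}) q(h)
    return r

P, Q = random_measure(), random_measure()
R = convolve(P, Q)
# Pointwise bound used in the answer: R(g) >= P(g h^{-1}) Q(h) for all g, h.
assert all(R[g] + 1e-12 >= P[op(g, inv(h))] * Q[h] for g in G for h in G)
# Subadditivity of entropy under convolution.
print(entropy(R), "<=", entropy(P) + entropy(Q))
```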
