[Math] Upper bound on the entropy of a sum of two random variables

entropy, inequality, information theory, probability

Let $X$ be a random variable such that $|X| \leq A$ almost surely, for some $A > 0$. Let $Z$ be independent of $X$ such that $Z \sim {\cal N}(0, N)$. My question is:

How large can the entropy $h(X+Z)$ be?

A simple upper bound, obtained by considering the maximum possible variance of $X+Z$, is $\frac{1}{2} \log (2\pi e (A^2 + N))$. Is there a better bound in terms of $A$ and $N$?
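For concreteness, this bound follows from $\operatorname{Var}(X) \le \mathbb{E}[X^2] \le A^2$ together with the fact that, for a fixed variance, the Gaussian maximizes differential entropy:

$$
h(X+Z) \le \tfrac{1}{2} \log\bigl(2\pi e \operatorname{Var}(X+Z)\bigr) = \tfrac{1}{2} \log\bigl(2\pi e (\operatorname{Var}(X) + N)\bigr) \le \tfrac{1}{2} \log\bigl(2\pi e (A^2+N)\bigr).
$$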

Best Answer

Just some hints.

We can consider this as an additive Gaussian channel with noise $Z$, bounded input $X$, and output $W = X+Z$. Furthermore, $h(W) = h(Z) + h(X) - h(X|W) = h(Z) + I(X;W)$.
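In fact the combined identity $h(W) = h(Z) + I(X;W)$ has a one-line proof, and it holds for any input distribution, whereas the middle expression above requires $X$ to have a density so that $h(X)$ is defined:

$$
I(X;W) = h(W) - h(W \mid X) = h(W) - h(X + Z \mid X) = h(W) - h(Z \mid X) = h(W) - h(Z),
$$

using the translation invariance of differential entropy (conditionally on $X = x$) and the independence of $X$ and $Z$.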

For the first equality see here. Because $h(Z)$ is fixed, our problem of maximizing $h(W)$ is then equivalent to finding the distribution of $X$ that maximizes the mutual information, that is, to finding the capacity of this channel. This is not trivial; see, e.g., the introduction of this paper.
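As a sanity check, relaxing the amplitude constraint $|X| \le A$ to the power constraint $\mathbb{E}[X^2] \le A^2$ bounds $I(X;W)$ by the usual power-constrained AWGN capacity and recovers exactly the variance bound from the question:

$$
h(W) = h(Z) + I(X;W) \le \tfrac{1}{2} \log(2\pi e N) + \tfrac{1}{2} \log\Bigl(1 + \tfrac{A^2}{N}\Bigr) = \tfrac{1}{2} \log\bigl(2\pi e (A^2 + N)\bigr).
$$

Any improvement therefore has to come from the fact that the amplitude-constrained capacity is strictly smaller than $\tfrac{1}{2} \log(1 + A^2/N)$.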

To get some bound, I'd try the distribution that achieves the capacity in some regimes: two point masses (Dirac deltas) at $X = \pm A$ with equal weights. In this case, the pdf of $W$ is an equal-weight mixture of two Gaussians, and its entropy is studied here (there is no simple closed-form result, but you might get some useful bound).
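If a numerical value is enough, here is a minimal sketch (the values of $A$ and $N$ are illustrative assumptions, and SciPy is assumed available) that evaluates $h(W)$ for this two-point input by numerically integrating $-\int p \log p$ and compares it with $h(Z)$ and the variance bound:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Illustrative values; A and N are assumptions, not taken from the question.
A, N = 2.0, 0.5
sigma = np.sqrt(N)

# Density of W = X + Z when P(X = +A) = P(X = -A) = 1/2:
# an equal-weight mixture of N(-A, N) and N(+A, N).
def pdf_w(w):
    return 0.5 * (norm.pdf(w, -A, sigma) + norm.pdf(w, A, sigma))

# Differential entropy h(W) = -integral of p*log(p) (in nats), by numerical quadrature.
h_w, _ = quad(lambda w: -pdf_w(w) * np.log(pdf_w(w)), -A - 12 * sigma, A + 12 * sigma)

h_z = 0.5 * np.log(2 * np.pi * np.e * N)             # h(Z) for Z ~ N(0, N)
bound = 0.5 * np.log(2 * np.pi * np.e * (A**2 + N))  # variance-based upper bound

print(f"h(Z)            = {h_z:.4f} nats")
print(f"h(W), two-point = {h_w:.4f} nats")
print(f"variance bound  = {bound:.4f} nats")
```

For a discrete input, $I(X;W) \le H(X) = \log 2$, so this particular $h(W)$ can never exceed $h(Z) + \log 2$; it approaches that value when $A \gg \sqrt{N}$ and the two mixture components are well separated.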