[Math] An Entropy Inequality (generalized)

Tags: entropy, it.information-theory, pr.probability, real-analysis, statistical-physics

Let $X,Y$ be probability measures on $\{1,2,\dots,n\}$. For $0\le \alpha \le 1$, set $K=\sum_i X(i)^\alpha Y(i)^{1-\alpha}$ so that $Z:=\frac{1}{K}X^\alpha Y^{1-\alpha}$ is also a probability measure on $\{1,2,\dots,n\}$. How can we prove the inequality
$$\alpha H(X)+ (1-\alpha) H(Y)\geq K^2 H(Z),$$ where $H(X)=-\sum_{i=1}^n X(i)\log X(i)$ is the entropy function?

This is a small generalization of a recent question. The generalization is supported by a modest amount of numerical experimentation.
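The experiments themselves are not included in the post; below is a minimal sketch of the kind of check one might run, using Dirichlet-random test measures and natural logarithms (the helper names `entropy` and `check_inequality` are illustrative, not the author's actual code):

```python
import numpy as np

rng = np.random.default_rng(0)

def entropy(p):
    """Shannon entropy H(p) = -sum_i p(i) log p(i), natural log."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def check_inequality(n=5, trials=10_000):
    """Worst observed margin of alpha*H(X) + (1-alpha)*H(Y) - K^2*H(Z)
    over random trials; a negative return value would be a counterexample."""
    worst = np.inf
    for _ in range(trials):
        X = rng.dirichlet(np.ones(n))
        Y = rng.dirichlet(np.ones(n))
        alpha = rng.uniform()
        W = X**alpha * Y**(1 - alpha)  # unnormalized geometric mixture
        K = W.sum()                    # K <= 1 by Hoelder's inequality
        Z = W / K
        lhs = alpha * entropy(X) + (1 - alpha) * entropy(Y)
        worst = min(worst, lhs - K**2 * entropy(Z))
    return worst

print(check_inequality())  # nonnegative output = no counterexample found
```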

[I hope it's ok to post this generalization as a separate question. I'm not allowed to comment on the previous question, and this isn't an answer, so I didn't see an alternative.]

I've tried to translate the inequality into physics language (classical statistical mechanics), but I don't see a physical meaning.

Suppose $H$ is a Hamiltonian, i.e. a self-adjoint operator acting on a Hilbert space, taken to be of finite dimension $n$ for simplicity. Write its eigenvalues as $E_i$, $i=1,\ldots,n$. The partition function is $$Z(H) = \mathrm{tr}\; e^{-H} = \sum_i e^{-E_i}.$$ The density matrix is $$\rho = Z^{-1} e^{-H}.$$ (In the language of the original question, the probability measure is $X(i) = Z^{-1} e^{-E_i}$.)

The entropy is $$S(H) = -\mathrm{tr}\,(\rho \ln \rho).$$

Let $H_0$ and $H_1$ be two commuting Hamiltonians, so that they can be diagonalized simultaneously. Let $$H_\alpha = (1-\alpha) H_0 + \alpha H_1\,.$$ The conjectured inequality is $$(1-\alpha)S(H_0) + \alpha S(H_1) \ge K^2 S(H_\alpha)$$
where $$K = \frac{Z(H_\alpha)}{Z(H_0)^{1-\alpha}Z(H_1)^{\alpha}}.$$
I don't see a physical interpretation for the factor $K$.
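(As a sanity check, not part of the original post: the following sketch confirms numerically that the two expressions for $K$ agree in the commuting case, under the identification $X(i) \propto e^{-E_i(H_1)}$, $Y(i) \propto e^{-E_i(H_0)}$; all names are illustrative.)

```python
import numpy as np

rng = np.random.default_rng(1)

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

n, alpha = 6, 0.3
E0 = rng.normal(size=n)              # eigenvalues of H_0
E1 = rng.normal(size=n)              # eigenvalues of H_1 (same basis: commuting case)
Ea = (1 - alpha) * E0 + alpha * E1   # eigenvalues of H_alpha

Z0, Z1, Za = (np.sum(np.exp(-E)) for E in (E0, E1, Ea))

# Gibbs measures: Y <-> H_0, X <-> H_1, Z <-> H_alpha in the question's notation
Y, X, Zmeas = np.exp(-E0) / Z0, np.exp(-E1) / Z1, np.exp(-Ea) / Za

K_prob = np.sum(X**alpha * Y**(1 - alpha))   # K from the probabilistic definition...
K_phys = Za / (Z0**(1 - alpha) * Z1**alpha)  # ...equals the partition-function ratio
assert np.isclose(K_prob, K_phys)

lhs = (1 - alpha) * entropy(Y) + alpha * entropy(X)
print(lhs - K_prob**2 * entropy(Zmeas))  # nonnegative if the conjecture holds here
```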

Does the inequality hold without the assumption that $H_0$ and $H_1$ commute, i.e. in quantum statistical mechanics?
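The post leaves this open; here is a minimal sketch of how one might probe it numerically with random non-commuting Hamiltonians, using SciPy's matrix exponential and the von Neumann entropy (a single trial; the helper names are illustrative):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(2)

def rand_herm(n):
    """Random Hermitian matrix (generically non-commuting with another sample)."""
    A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return (A + A.conj().T) / 2

def gibbs(H):
    """Density matrix rho = e^{-H} / tr e^{-H} and partition function tr e^{-H}."""
    G = expm(-H)
    Z = np.trace(G).real
    return G / Z, Z

def vn_entropy(rho):
    """Von Neumann entropy -tr(rho log rho), computed from the eigenvalues."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return -np.sum(w * np.log(w))

n, alpha = 4, 0.5
H0, H1 = rand_herm(n), rand_herm(n)
Ha = (1 - alpha) * H0 + alpha * H1

(r0, Z0), (r1, Z1), (ra, Za) = gibbs(H0), gibbs(H1), gibbs(Ha)
K = Za / (Z0**(1 - alpha) * Z1**alpha)

lhs = (1 - alpha) * vn_entropy(r0) + alpha * vn_entropy(r1)
print(lhs - K**2 * vn_entropy(ra))  # nonnegative <=> no counterexample in this trial
```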

Best Answer

Not an answer, but a random thought: this looks very close to the Lieb/Wigner/Yanase inequality, as in these very nice notes of Eric Carlen's.