[Math] A strange variant of the Gaussian log-Sobolev inequality

Tags: convex-analysis, fa.functional-analysis, inequalities, pr.probability

Let $\phi : \mathbb{R}^d \to \mathbb{R}$ be a convex function, and assume that it grows at most linearly at infinity for simplicity. Denote by $\gamma$ the standard Gaussian measure on $\mathbb{R}^d$, and assume further that $\int e^\phi \mathrm{d} \gamma = 1$. Define the probability measure
$$
d \mu := e^\phi \mathrm{d} \gamma.
$$

The following inequality, which I will call (sls) for "strange log-Sobolev", is true
$$
2 \int \phi(x) \mathrm{d} \mu(x) \le \int (\phi(x) - \phi(y))^2 \mathrm{d}\mu(x) \mathrm{d} \mu(y).
$$
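Note that $\int\!\!\int (\phi(x) - \phi(y))^2 \,\mathrm{d}\mu(x)\,\mathrm{d}\mu(y) = 2\operatorname{Var}_\mu(\phi)$, so (sls) says exactly that $\int \phi \,\mathrm{d}\mu \le \operatorname{Var}_\mu(\phi)$. As a quick numerical sanity check in dimension one (a sketch; the convex test function $\phi_0(x) = |x|$ and the grid quadrature are my own choices):

```python
import numpy as np

# Quadrature grid for the standard Gaussian measure gamma on R
x = np.linspace(-10, 10, 200001)
dx = x[1] - x[0]
g = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)  # Gaussian density

def igamma(f):
    """Riemann-sum approximation of the integral of f against gamma."""
    return np.sum(f * g) * dx

phi0 = np.abs(x)                           # convex, grows linearly at infinity
phi = phi0 - np.log(igamma(np.exp(phi0)))  # normalize so that int e^phi dgamma = 1

w = np.exp(phi)                            # density of mu with respect to gamma
mean = igamma(phi * w)                     # int phi dmu
var = igamma(phi**2 * w) - mean**2         # Var_mu(phi)

lhs = 2 * mean
rhs = 2 * var   # equals the double integral of (phi(x) - phi(y))^2 against mu x mu
print(lhs <= rhs)  # prints True: (sls) holds for this phi
```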

Surprisingly, despite significant effort, the only proof I could find of (sls) uses arguments from stochastic control (which will be outlined below).

Q1: can one give a proof of (sls) that does not require knowing, say, what a progressively measurable stochastic process is?

I will now explain the name and clarify why (sls) looks at least to some extent like the standard log-Sobolev inequality. Recall that the Gaussian log-Sobolev inequality states that, for every (say smooth) function $f : \mathbb{R}^d \to \mathbb{R}_+$ such that $\int f \mathrm{d}\gamma = 1$, we have
$$
2 \int f \log f \mathrm{d} \gamma \le \int \frac{|\nabla f|^2}{f} \mathrm{d} \gamma.
$$
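This inequality can also be checked numerically in dimension one (a sketch; the smooth positive test function $f \propto e^{\sin x}$ and the quadrature are my own choices):

```python
import numpy as np

# Quadrature grid for the standard Gaussian measure on R
x = np.linspace(-10, 10, 200001)
dx = x[1] - x[0]
g = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

def igamma(f):
    """Riemann-sum approximation of the integral of f against gamma."""
    return np.sum(f * g) * dx

u = np.sin(x)
Z = igamma(np.exp(u))
f = np.exp(u) / Z        # smooth, positive, int f dgamma = 1
fprime = np.cos(x) * f   # derivative of f, computed analytically

lhs = 2 * igamma(f * np.log(f))    # entropy side
rhs = igamma(fprime**2 / f)        # Fisher-information side
print(lhs <= rhs)  # prints True: the log-Sobolev inequality holds
```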

Applying this inequality with $f = e^\phi$, we obtain
$$
2 \int \phi \mathrm{d} \mu \le \int|\nabla \phi|^2 \mathrm{d} \mu,
$$

which is surprisingly similar to (sls).

Q2: is this a fruitful analogy? Shouldn't inequality (sls), which does not depend explicitly on the dimension, be useful in some contexts?

The validity of (sls) is equivalent to the convexity of the following function:
$$
\lambda \mapsto \frac 1 \lambda \log \int e^{\lambda \phi(x)} \mathrm{d} \gamma(x) \qquad (\lambda > 0).
$$
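The convexity of this mapping can be probed numerically (a sketch; the test function and the discretization are my own choices, with convexity checked through nonnegative second differences on a uniform grid):

```python
import numpy as np

# Quadrature grid for the standard Gaussian measure on R
x = np.linspace(-10, 10, 200001)
dx = x[1] - x[0]
g = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

phi0 = np.abs(x)
phi = phi0 - np.log(np.sum(np.exp(phi0) * g) * dx)  # int e^phi dgamma = 1

def G(lam):
    """The mapping lambda -> (1/lambda) log int e^{lambda phi} dgamma."""
    return np.log(np.sum(np.exp(lam * phi) * g) * dx) / lam

lams = np.linspace(0.1, 3.0, 30)             # uniform grid, spacing 0.1
vals = np.array([G(lam) for lam in lams])
d2 = vals[:-2] - 2 * vals[1:-1] + vals[2:]   # discrete second differences
print(d2.min() >= -1e-10)  # prints True: nonnegative up to rounding, i.e. convex
```

Note that $G(1) = 0$ by the normalization $\int e^\phi \,\mathrm{d}\gamma = 1$, which gives a further consistency check.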

Since this is not the point, I will not explain it in detail; but for those familiar with it, let me briefly sketch the stochastic-control proof that the mapping above is convex. Denote by $(B_t)$ a standard $d$-dimensional Brownian motion, and write
$$
\frac 1 \lambda \log \int e^{\lambda \phi(x)} \mathrm{d} \gamma(x)
= \frac 1 \lambda \sup_{h} \mathbb{E}\left[ \lambda \phi \left( B_1 + \int_0^1 h_s \mathrm{d}s \right) - \frac 1 2 \int_0^1 h_s^2 \mathrm{d} s \right],
$$

where the supremum is over suitable progressively measurable $(h_s)$. Replacing $h$ by $\lambda h$, we find that
$$
\frac 1 \lambda \log \int e^{\lambda \phi(x)} \mathrm{d} \gamma(x) = \sup_{h} \mathbb{E}\left[\phi \left( B_1 + \lambda \int_0^1 h_s \mathrm{d}s \right) - \frac \lambda 2 \int_0^1 h_s^2 \mathrm{d} s \right].
$$

For each fixed $h$, the expression inside the expectation is convex in $\lambda$: the first term is $\phi$ composed with an affine function of $\lambda$, and the second term is linear in $\lambda$. A supremum of convex functions is convex, so we are done.

(EDIT: I have completely rewritten the question on January 27 2020 to emphasize (sls), which initially did not appear at all.)

Best Answer

Here is a simple proof that $$t \mapsto t \, \log \int \exp(t^{-1} \, \phi(x)) \, \mathrm d \gamma(x),$$ i.e. your function viewed in the variable $t = 1/\lambda$, is convex. This does not need any assumptions on $\phi$ or $\gamma$. Maybe this can be used to prove convexity of your function (see below).

Let $M$ denote the set of measurable functions from $\mathbb R^d$ to $\mathbb R$. An application of Hölder's inequality (with exponents $1/\theta$ and $1/(1-\theta)$ applied to $\exp(\theta u_1) \exp((1-\theta) u_2)$) shows that the LogIntExp-functional \begin{equation*} M \ni u \mapsto \log \int \exp(u(x)) \, \mathrm d \gamma(x) \end{equation*} is convex. Hence, \begin{equation*} \mathbb R \ni s \mapsto \log \int \exp(s \, \phi(x)) \, \mathrm d \gamma(x) \end{equation*} is convex.

Consequently, its perspective \begin{equation*} \mathbb R^+ \times \mathbb R \ni (t,s) \mapsto t \, \log \int \exp(t^{-1} \, s \, \phi(x)) \, \mathrm d \gamma(x) \end{equation*} is convex. Fixing $s = 1$, we conclude that \begin{equation*} \mathbb R^+ \ni t \mapsto t \, \log \int \exp(t^{-1} \, \phi(x)) \, \mathrm d \gamma(x) \end{equation*} is convex.

If this function were increasing in $t$ (here your assumptions may come into play), this would imply convexity of your function.
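Both steps (the Hölder/LogIntExp convexity and the perspective construction) can be sanity-checked numerically; here is a sketch in dimension one, where the test functions $u = \sin$, $v = |\cdot|$, the choice $\phi_0 = |\cdot|$, and the grid quadrature are my own:

```python
import numpy as np

# Quadrature grid for the standard Gaussian measure on R
x = np.linspace(-10, 10, 200001)
dx = x[1] - x[0]
g = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

def logintexp(u):
    """LogIntExp functional: log of the integral of e^u against gamma."""
    return np.log(np.sum(np.exp(u) * g) * dx)

# Hoelder step: LogIntExp is convex on functions
u, v = np.sin(x), np.abs(x)
holder_ok = all(
    logintexp(t * u + (1 - t) * v)
    <= t * logintexp(u) + (1 - t) * logintexp(v) + 1e-12
    for t in (0.2, 0.5, 0.8)
)

# Perspective: t -> t * LogIntExp(phi / t) on t > 0
phi = np.abs(x) - logintexp(np.abs(x))   # normalize: int e^phi dgamma = 1
ts = np.linspace(0.3, 3.0, 28)           # uniform grid, spacing 0.1
vals = np.array([t * logintexp(phi / t) for t in ts])
d2 = vals[:-2] - 2 * vals[1:-1] + vals[2:]  # second differences, >= 0 if convex
print(holder_ok, d2.min() >= -1e-10)
```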
