[Math] L1 distance between gaussian measures

fa.functional-analysis, measure-theory, mg.metric-geometry, pr.probability, st.statistics

L1 distance between Gaussian measures: Definition

Let $P_1$ and $P_0$ be two Gaussian measures on $\mathbb{R}^p$ with respective mean/covariance pairs $(m_1,C_1)$ and $(m_0,C_0)$ (I assume the covariance matrices have full rank). I know that calculating the L1 distance between $P_1$ and $P_0$:
$$d_1=\int|dP_1-dP_0|$$

The easy case

is an easy exercise when $C_0=C_1=C$:
$$d_1=2(2\Phi(\sigma/2)-1)$$
where
$$\sigma=\|C^{-1/2}(m_1-m_0)\|.$$
(the Euclidean norm of $\mathbb{R}^p$) and $\Phi$ is the cdf of a real Gaussian variable with mean zero and variance 1.

I don't remember the name of $\sigma$ (RKHS norm? Cameron–Martin?), but it can also be written $\|\mathcal{L}\|_{L_2(P_{1/2})}$, where $\mathcal{L}$ is the log-likelihood ratio function and $P_{1/2}$ is the normal distribution with mean $(m_1+m_0)/2$ and covariance $C$.
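As a numerical sanity check on this closed form, here is a minimal sketch (assuming numpy and scipy; the function names are mine). It compares the formula with a Monte Carlo estimate based on the identity $d_1=\mathbb{E}_{P_0}\left|\frac{dP_1}{dP_0}-1\right|$, which also works when $C_0\neq C_1$.

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

def l1_equal_covariance(m0, m1, C):
    """Closed form d_1 = 2*(2*Phi(sigma/2) - 1) with sigma = ||C^{-1/2}(m_1 - m_0)||."""
    L = np.linalg.cholesky(C)
    sigma = np.linalg.norm(np.linalg.solve(L, m1 - m0))  # same norm as C^{-1/2}(m1 - m0)
    return 2.0 * (2.0 * norm.cdf(sigma / 2.0) - 1.0)

def l1_monte_carlo(m0, C0, m1, C1, n=200_000, seed=0):
    """Monte Carlo estimate of d_1 = E_{P_0} |dP_1/dP_0 - 1| (valid for any C_0, C_1)."""
    x = np.random.default_rng(seed).multivariate_normal(m0, C0, size=n)
    log_ratio = (multivariate_normal.logpdf(x, m1, C1)
                 - multivariate_normal.logpdf(x, m0, C0))
    return np.mean(np.abs(np.expm1(log_ratio)))

m0, m1, C = np.zeros(3), np.array([0.5, -0.2, 0.1]), np.eye(3)
print(l1_equal_covariance(m0, m1, C))   # exact value
print(l1_monte_carlo(m0, C, m1, C))     # should agree up to Monte Carlo error
```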

My question is about how to extend this type of result to the case $C_0\neq C_1$ (explicit calculation of the L1 distance).

I see two possible reductions of the problem if the calculations are too complicated:

  1. search for an inequality relating the L1 distance and some norm of the likelihood ratio
  2. search for some exact expression in a particular case, for example $C_1$ and $C_0$ diagonal (see the one-dimensional sketch after this list).
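Regarding reduction 2, at least in dimension one the calculation can be done exactly even when the variances differ: the density difference changes sign at the two real roots of the quadratic $\log\frac{dP_1}{dP_0}(x)=0$, and integrating $|dP_1-dP_0|$ piecewise expresses $d_1$ through the two cdfs evaluated at those roots. A minimal sketch (assuming numpy/scipy; the function name is mine):

```python
import numpy as np
from scipy.stats import norm

def l1_1d(m0, s0, m1, s1):
    """Exact d_1 between N(m0, s0^2) and N(m1, s1^2) when s0 != s1.

    The densities cross at the two real roots r1 < r2 of
    (x-m0)^2/s0^2 - (x-m1)^2/s1^2 + 2*log(s0/s1) = 0, and
    d_1 = 2 * |D(r2) - D(r1)| with D(x) = F1(x) - F0(x).
    """
    a = 1.0 / s0**2 - 1.0 / s1**2
    b = 2.0 * (m1 / s1**2 - m0 / s0**2)
    c = m0**2 / s0**2 - m1**2 / s1**2 + 2.0 * np.log(s0 / s1)
    r1, r2 = np.sort(np.roots([a, b, c]).real)
    D = lambda x: norm.cdf(x, m1, s1) - norm.cdf(x, m0, s0)
    return 2.0 * abs(D(r2) - D(r1))

print(l1_1d(0.0, 1.0, 0.0, 2.0))  # e.g. N(0,1) vs N(0,4)
```

Note that the L1 distance between product measures does not factor into the coordinatewise distances, so this only settles the one-dimensional building block of the diagonal case.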

Reduction 1 gets a partial answer with the general inequality

$$d_1\leq 2\sqrt{K(P_1,P_0)}$$
(due to Pinsker or Le Cam, I don't remember)
where
$$K(P_1,P_0)=\int \log \left(\frac{dP_1}{dP_0} \right ) dP_1$$
is the Kullback divergence.

I am not really satisfied with this answer since it is suboptimal in the case $C_1=C_0$, and it does not involve a "half measure" $P_{1/2}$ (one could bring in $(P_0+P_1)/2$ by applying the inequality twice, but I don't really like this interpolation)…
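Still, the bound above is at least easy to evaluate for arbitrary $C_0,C_1$, since $K(P_1,P_0)$ has a well-known closed form for Gaussians. A minimal sketch (assuming numpy; the function names are mine):

```python
import numpy as np

def kl_gaussians(m1, C1, m0, C0):
    """K(P_1,P_0) = 0.5*( tr(C0^{-1} C1) + (m0-m1)' C0^{-1} (m0-m1) - p
                          + log det C0 - log det C1 )."""
    p = len(m1)
    C0_inv = np.linalg.inv(C0)
    d = m0 - m1
    logdet0 = np.linalg.slogdet(C0)[1]
    logdet1 = np.linalg.slogdet(C1)[1]
    return 0.5 * (np.trace(C0_inv @ C1) + d @ C0_inv @ d - p + logdet0 - logdet1)

def l1_upper_bound(m1, C1, m0, C0):
    """The bound d_1 <= 2*sqrt(K(P_1,P_0)) quoted above."""
    return 2.0 * np.sqrt(kl_gaussians(m1, C1, m0, C0))

m0, m1 = np.zeros(2), np.array([0.3, -0.1])
C0, C1 = np.eye(2), np.diag([1.5, 0.7])
print(l1_upper_bound(m1, C1, m0, C0))
```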

Best Answer

Explicit upper and lower bounds are obtained in Theorem 1.2 and Proposition 2.1 of "The total variation distance between high-dimensional Gaussians".
