[Math] multivariate Gaussian approximation in total variation distance

pr.probability

I'm wondering if there's any general technique that gives the total variation distance between a distribution on $\mathbb{R}^n$ and $N(0, I_n)$.
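Here total variation distance means, as usual,
$$d_{\mathrm{TV}}(\mu,\nu) = \sup_{A \in \mathcal{B}(\mathbb{R}^n)} \lvert \mu(A) - \nu(A) \rvert$$
(or twice this quantity, depending on the normalization convention).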

My understanding is that Stein's method gives only Wasserstein distance in higher dimensions, because the Stein characterization of the multivariate Gaussian is a second-order differential operator (whereas it is first-order in the one-dimensional case), so more regularity is required of the test functions and the method yields a weaker distance. I also understand that a Wasserstein bound can be upgraded to a total variation bound when the distribution is log-concave. The characterizations I have in mind are written out below.
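Concretely (up to the exact regularity conditions on the test function $f$), the standard characterizations are
$$Z \sim N(0,1) \quad\Longleftrightarrow\quad \mathbb{E}\big[f'(Z) - Z f(Z)\big] = 0 \ \text{ for all suitable } f:\mathbb{R}\to\mathbb{R},$$
$$Z \sim N(0, I_n) \quad\Longleftrightarrow\quad \mathbb{E}\big[\Delta f(Z) - Z\cdot\nabla f(Z)\big] = 0 \ \text{ for all suitable } f:\mathbb{R}^n\to\mathbb{R},$$
so the multivariate Stein equation involves second derivatives of the test function, and its solutions require correspondingly more smoothness.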

What is the usual way to handle total variation distance to a multivariate Gaussian? I'm primarily interested in approximating $N(0, I_n)$, but the approximating distribution is not necessarily log-concave. Is there some easy way to exploit this special case, or is there an impossibility result?

Best Answer

Stein's method doesn't give total variation approximation in one dimension either, without additional assumptions of some kind. This has nothing to do with Stein's method: as an impossibility result, any discrete distribution is at maximal total variation distance (1 or 2, depending on your normalization convention) from any continuous (e.g. Gaussian) distribution. But of course you can approximate any distribution by a discrete one, in Wasserstein distance for example.
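To spell out the impossibility claim with the $\sup$-over-Borel-sets normalization: if $\mu$ is supported on a countable set $S$ and $\nu$ is absolutely continuous, then
$$d_{\mathrm{TV}}(\mu,\nu) \;\ge\; \mu(S) - \nu(S) \;=\; 1 - 0 \;=\; 1.$$
For instance, the law $\mu_n$ of a standardized sum of $n$ i.i.d. Rademacher variables satisfies $d_{\mathrm{TV}}(\mu_n, N(0,1)) = 1$ for every $n$, even though $\mu_n \to N(0,1)$ in Wasserstein distance (at rate $O(n^{-1/2})$, e.g. by the standard one-dimensional Stein bound).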
