[Math] Lipschitz function of independent sub-Gaussian random variables

concentration-of-measure

If $X\sim \mathcal{N}(0,I)$ is a standard Gaussian random vector, then 1-Lipschitz functions of $X$ are sub-Gaussian with variance parameter 1 by the Tsirelson-Ibragimov-Sudakov inequality (e.g., Theorem 8 here).
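Concretely, the standard form of that statement is: if $f:\mathbb{R}^n\to\mathbb{R}$ is $L$-Lipschitz, then \[ P(|f(X) - E[f(X)]| > t) \le 2e^{-t^2/(2L^2)} \qquad \text{for all } t\ge 0. \]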

Suppose instead that $X = (X_1,X_2,\ldots, X_n)$ consists of independent sub-Gaussian random variables that are not necessarily normally distributed. Does the above property still hold?

Best Answer

Here are three options that may suit your needs.

  1. Concentration inequality for convex functions of bounded random variables. If $X_1,\ldots,X_n$ are independent random variables taking values in $[0,1]$ and $f$ is quasi-convex and 1-Lipschitz, then \[ P(f(X) > m+t) \le 2e^{-t^2/4}, \qquad P(f(X) < m - t) \le 2e^{-t^2/4}, \] where $m$ is the median of $f(X)$. See Theorem 7.12 in the book Concentration Inequalities: A Nonasymptotic Theory of Independence by Stéphane Boucheron, Gábor Lugosi, and Pascal Massart. It follows from the convex distance inequality due to Talagrand.
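    For example (a standard instance, spelled out here for concreteness): the Euclidean norm $f(x) = \|x\|_2$ is convex, hence quasi-convex, and 1-Lipschitz, so for independent $X_i \in [0,1]$ the two bounds combine to \[ P\big( \big| \|X\|_2 - m \big| > t \big) \le 4e^{-t^2/4}. \] A median bound of this form can always be converted to a mean bound at the cost of absolute constants, since the inequality itself forces $|m - E\|X\|_2|$ to be bounded by an absolute constant.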

  2. View $X_i$ as a function of a standard normal. If each $X_i$ can be written as $\Phi(Z_i)$ for some function $\Phi$, where $Z_1,\ldots,Z_n$ are i.i.d. standard normal, then $f(X) = (f\circ \Phi)(Z)$. Here the multivariate function $\Phi:\mathbb{R}^n\to \mathbb{R}^n$ applies the scalar $\Phi$ to every coordinate.
    Then the Tsirelson-Ibragimov-Sudakov inequality applies to $f\circ \Phi$, and the Lipschitz norm of $f\circ \Phi$ is at most $\|f\|_{Lip} \|\Phi\|_{Lip}$. The question is then whether $\|\Phi\|_{Lip}$ is bounded by an absolute constant (and, in particular, whether $\Phi$ is Lipschitz at all; otherwise $\|\Phi\|_{Lip}=+\infty$ and we gain nothing). The bound $\|\Phi\|_{Lip}<+\infty$ holds, for instance, if $X_i$ is uniformly distributed on $[0,1]$; see Theorem 5.2.10 in the book High-Dimensional Probability by Roman Vershynin, where this approach is described.
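    To spell out the uniform case (a standard computation, filled in here): if $X_i$ is uniform on $[0,1]$, one may take $\Phi$ to be the standard normal CDF, since then $\Phi(Z_i)$ is uniform on $[0,1]$. Its derivative is the standard normal density, so \[ \|\Phi\|_{Lip} = \sup_{z\in\mathbb{R}} \frac{e^{-z^2/2}}{\sqrt{2\pi}} = \frac{1}{\sqrt{2\pi}}, \] and the same constant holds for the coordinatewise map on $\mathbb{R}^n$. Hence $f\circ\Phi$ is $\|f\|_{Lip}/\sqrt{2\pi}$-Lipschitz, and the Gaussian result applies with variance parameter $\|f\|_{Lip}^2/(2\pi)$.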

  3. If $X$ has density $e^{-U(x)}$ for strongly convex $U:\mathbb{R}^n\to \mathbb{R}$. If $U$ is twice continuously differentiable and strongly convex in the sense that the Hessian $H$ of $U$ (i.e., $H_{ij} = \partial^2 U/\partial x_i \partial x_j$) satisfies, for some $\kappa>0$ and all $x\in \mathbb{R}^n$, that $H(x) - \kappa I_{n\times n}$ is positive semi-definite, then for any 1-Lipschitz function $f$ of $X$, \[ P( |f(X) - E[f(X)] | > t) \le 2 \exp(-c\kappa t^2) \] for some absolute constant $c>0$. This is Theorem 5.2.15 in the book High-Dimensional Probability by Roman Vershynin.
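    As a sanity check (an example added here, not part of the cited theorem): for $X\sim\mathcal{N}(0,\sigma^2 I_n)$ the density is proportional to $e^{-\|x\|_2^2/(2\sigma^2)}$, so $U(x) = \|x\|_2^2/(2\sigma^2) + \text{const}$ and $H(x) = \sigma^{-2} I_{n\times n}$. Thus one can take $\kappa = \sigma^{-2}$, and the bound reads \[ P(|f(X) - E[f(X)]| > t) \le 2\exp(-c\,t^2/\sigma^2), \] recovering Gaussian concentration up to the absolute constant $c$.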