Talagrand’s Inequality for L1 Norm – Probability and Metric Geometry

fa.functional-analysis, mg.metric-geometry, pr.probability

I have $n$ independent random variables $X_1,\ldots, X_n$, each supported on $[0,1]$, and a monotone convex function $f:\mathbb{R}^n \rightarrow \mathbb{R}$ that is $1$-Lipschitz in the $L^1$ norm, i.e., for every $x,y \in \mathbb{R}^n$ it holds that $f(x)-f(y) \leq \sum_{i=1}^{n} |x_i-y_i|$.
I would like a Talagrand-type concentration bound.

Is it true that $$\Pr\bigl[\,\lvert f(X_1,\ldots,X_n) - \mathbb{E}[f(X_1,\ldots,X_n)]\rvert > t\,\bigr] \leq c_1 \cdot e^{-t^2/c_2}$$
for some constants $c_1,c_2>0$ that are independent of $n$, of the distributions of the $X_i$'s, and of the function $f$, as long as the stated conditions hold?

Do I need more conditions for the inequality to hold?
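For concreteness, here is a minimal Monte Carlo sketch for comparing an empirical tail estimate against a candidate bound $c_1 e^{-t^2/c_2}$; the helper `empirical_tail` and the particular choice of $f$ (the coordinate sum, which is monotone, convex, and $1$-Lipschitz in the $L^1$ norm) are my own illustrations, not part of the question:

```python
# Hypothetical sanity-check: estimate Pr[|f(X) - E[f(X)]| > t] by simulation.
import numpy as np

def empirical_tail(f, sample_X, t, trials=100_000):
    """Monte Carlo estimate of Pr[|f(X) - E[f(X)]| > t],
    using the sample mean as a proxy for E[f(X)]."""
    values = np.array([f(sample_X()) for _ in range(trials)])
    return np.mean(np.abs(values - values.mean()) > t)

rng = np.random.default_rng(0)
n = 100
# f(x) = x_1 + ... + x_n, with X_i iid uniform on [0, 1]
print(empirical_tail(lambda x: x.sum(), lambda: rng.random(n), t=10.0))
```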

Best Answer

Yes, you do need more conditions. For instance, let $f(x_1,\dots,x_n)\equiv x_1+\dots+x_n$ and let the $X_i$'s be (say) iid Bernoulli with parameter $1/2$. Then $f$ is $1$-Lipschitz in the $L^1$-norm, but $\operatorname{Var} f(X_1,\ldots,X_n)=n/4$, so by the central limit theorem the tail probability behaves like $2\,\overline\Phi(2t/\sqrt n)\approx e^{-2t^2/n}$; hence your inequality will hold for all real $t>0$ only if $c_2\gtrsim n/2$ as $n\to\infty$.
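A short simulation sketch of this counterexample (mine, not part of the answer): the standard deviation of $f$ grows like $\sqrt n/2$, so no choice of $c_2$ independent of $n$ can work.

```python
# Counterexample f(x) = x_1 + ... + x_n with X_i iid Bernoulli(1/2):
# the spread of f grows like sqrt(n)/2, forcing c_2 to grow with n.
import numpy as np

rng = np.random.default_rng(0)

for n in [10, 100, 1000, 10000]:
    # 100,000 samples of f(X_1, ..., X_n) = X_1 + ... + X_n ~ Binomial(n, 1/2)
    samples = rng.binomial(n, 0.5, size=100_000)
    # theory: Var(f) = n/4, so the standard deviation is sqrt(n)/2
    print(f"n = {n:6d}: empirical sd = {samples.std():8.2f}, "
          f"sqrt(n)/2 = {np.sqrt(n)/2:8.2f}")
```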
