Max Decoupling Inequality – Probability Theory

pr.probability

Let $X_1,\ldots,X_n$ be $\{0,1\}$-valued random variables drawn from some joint distribution. Let $\tilde X_1,\ldots,\tilde X_n$ be their independent version: $\mathbb{E}X_i=\mathbb{E}\tilde X_i$ for each $1\le i\le n$ and the $\tilde X_i$ are mutually independent (as well as being independent of the $X_i$'s). I would like to compare
$M:=\mathbb{E}\max_{1\le i\le n}X_i$
and
$\tilde M:=\mathbb{E}\max_{1\le i\le n}\tilde X_i$.
It's easy to make $M$ arbitrarily small while $\tilde M$ is arbitrarily close to $1$: for instance, take all the $X_i$ equal to a single Bernoulli$(p)$ variable with $p=\log n/n$, so that $M=p\to0$ while $\tilde M=1-(1-p)^n\to1$ (see the numerical check below). So in general, there can be no bound on $\tilde M/M$. But it seems that in the reverse situation, one should be able to say something. Is some bound of the form
$$ M \le c\tilde M $$
known, where $c$ is an absolute constant? If such a thing is impossible (what's the counterexample?), what's the best dependence on $n$ in the bound?
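For concreteness, here is a minimal numerical sanity check of the example mentioned above. The fully-correlated Bernoulli$(\log n/n)$ construction is my illustration (not spelled out in the original post), and $M$, $\tilde M$ are computed from exact formulas rather than by simulation:

```python
import math

# Hypothetical example from the question text (my illustration):
# take X_1 = ... = X_n all equal to one Bernoulli(p) variable, p = log(n)/n.
# Then, exactly:
#   M       = E max_i X_i       = p,
#   tilde M = E max_i tilde X_i = 1 - (1 - p)^n.
for n in [10, 100, 1000, 10**6]:
    p = math.log(n) / n
    M = p
    M_tilde = 1 - (1 - p) ** n
    # M -> 0 while M~ -> 1, so the ratio M~/M is unbounded.
    print(f"n={n:>8}  M={M:.6f}  M~={M_tilde:.6f}  M~/M={M_tilde / M:.1f}")
```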

Best Answer

Yes, this inequality holds with $$c:=\frac e{e-1}.$$

Indeed, let $A_i:=\{X_i=1\}$. Then, by the union bound, $$M=P\Big(\bigcup_i A_i\Big)\le\min(s,1)\le c(1-e^{-s}),$$ where $s:=\sum_i P(A_i)$. The second inequality holds because $s\mapsto\frac{1-e^{-s}}{s}$ is decreasing on $(0,\infty)$, so $1-e^{-s}\ge(1-e^{-1})s=s/c$ for $0<s\le1$, while for $s\ge1$ we have $c(1-e^{-s})\ge c(1-e^{-1})=1\ge\min(s,1)$. On the other hand, using $1-x\le e^{-x}$, $$\tilde M=1-\prod_i\big(1-P(A_i)\big) \ge1-\prod_i\exp(-P(A_i))=1-e^{-s}.$$ So, $M\le c\tilde M$.
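As a quick sanity check, here is a short numerical verification of the elementary inequality $\min(s,1)\le c(1-e^{-s})$, together with the standard extremal example of disjoint events $A_i$ with $P(A_i)=1/n$ (my addition, not part of the answer above), which suggests the constant $c=e/(e-1)$ cannot be improved:

```python
import math

c = math.e / (math.e - 1)

# 1) Check min(s, 1) <= c * (1 - e^{-s}) on a grid over (0, 20].
#    Equality holds at s = 1, so the max should be ~0 (up to rounding).
worst = max(min(s, 1) - c * (1 - math.exp(-s))
            for s in (k / 1000 for k in range(1, 20001)))
print(f"max of min(s,1) - c(1-e^-s) on (0,20]: {worst:.3e}")

# 2) Extremal example (my addition): disjoint events A_i with P(A_i) = 1/n,
#    so s = 1.  Then M = P(union A_i) = 1 exactly, while
#    tilde M = 1 - (1 - 1/n)^n -> 1 - 1/e, hence M / tilde M -> c.
for n in [2, 10, 100, 10000]:
    M = 1.0
    M_tilde = 1 - (1 - 1 / n) ** n
    print(f"n={n:>6}  M/M~ = {M / M_tilde:.6f}   c = {c:.6f}")
```

In the disjoint case the ratio $M/\tilde M$ approaches $c$ from below as $n\to\infty$, matching the constant in the answer.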
