A possible generalization of Chebyshev’s inequality

Tags: inequality, probability, probability-theory

Let $X$ and $Y$ be independent and identically distributed (i.i.d.) random variables.

By Chebyshev's inequality we know that for any fixed $k > 0$
$$\mathbb{P}(X-Y > k) \leq \frac{\mathbb{E}[(X-Y)^2]}{k^2} = \frac{2\mathrm{Var}(X)}{k^2}.$$

My question is what happens if we allow $k$ above to be random?
Specifically, is there a meaningful bound for the following quantity:
$$\mathbb{P}(X-Y > Y).$$
Ideally, there would exist some constant $C > 0$ such that
$$\mathbb{P}(X-Y > Y) \leq C\mathrm{Var}(X).$$
Is there any reason to expect such an inequality would even hold?

Best Answer

In Chebyshev's inequality the constant depends only on $k$, not on the distribution of the random variables. What should $C$ depend on in the new inequality? It cannot be independent of the distribution; but if $C$ is allowed to depend on the distribution, then you can trivially take $C = \frac{\mathbb{P}(X-Y>Y)}{\mathrm{Var}(X)}$ whenever $\mathrm{Var}(X) \neq 0$.

To see that no universal constant $C$ can work, take $X = Y = -1$ (a point mass at $-1$): then $\mathbb{P}(X-Y>Y) = \mathbb{P}(0 > -1) = 1$ while $\mathrm{Var}(X) = 0$.

For an example with $\mathrm{Var}(X) \neq 0$: let $X, Y$ be i.i.d. normal with mean $0$ and variance $\frac{1}{n}$. Since $X - 2Y$ is a centered normal random variable, the left-hand side equals $\mathbb{P}(X - 2Y > 0) = \frac{1}{2}$ independently of $n$, whereas the right-hand side $C\,\mathrm{Var}(X) = \frac{C}{n} \to 0$; so for large enough $n$ the inequality fails.
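A quick numerical sanity check of the normal counterexample (a sketch, not part of the original answer; sample size and seed are arbitrary choices): for i.i.d. $N(0, 1/n)$ the estimated $\mathbb{P}(X-Y>Y)$ stays near $\frac{1}{2}$ while $\mathrm{Var}(X) = \frac{1}{n}$ shrinks.

```python
import numpy as np

rng = np.random.default_rng(0)

# X, Y i.i.d. N(0, 1/n).  P(X - Y > Y) = P(X - 2Y > 0) = 1/2 for every n
# (X - 2Y is a centered normal), while Var(X) = 1/n -> 0, so no bound
# P(X - Y > Y) <= C * Var(X) can hold with a universal constant C.
for n in [1, 10, 100]:
    sigma = np.sqrt(1.0 / n)
    x = rng.normal(0.0, sigma, size=200_000)
    y = rng.normal(0.0, sigma, size=200_000)
    lhs = np.mean(x - y > y)  # Monte Carlo estimate of P(X - Y > Y)
    print(f"n={n:4d}  P(X-Y>Y) ~ {lhs:.3f}  Var(X) = {1.0/n:.3f}")
```

Each estimate should be close to $0.5$ regardless of $n$, confirming that the left-hand side does not shrink with $\mathrm{Var}(X)$.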
