[Math] Markov’s inequality tight in general

probability, probability theory

From here:

Even though Markov’s and Chebyshev’s inequalities only use information about the expectation and the variance of the random variable under consideration, they are essentially tight for a general random variable.

So for a non-negative random variable $X$ with density $f$, if Markov's inequality is tight, i.e. $P(X\geq a)=\frac{\mathbb{E}(X)}{a}$, we have:
$$\mathbb{E}(X) = \int_0^a xf(x) dx + \int_a^\infty xf(x)dx = \int_a^\infty af(x)dx $$
$$0\leq\int_a^\infty (x-a) f(x)dx=-\int_0^a xf(x) dx \leq 0$$
$$\int_a^\infty (x-a) f(x)dx = \int_0^a xf(x) dx = 0$$
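
Spelling out what this forces: both integrands are non-negative, so they must vanish almost everywhere,
$$xf(x)=0\ \text{for a.e. } x\in(0,a), \qquad (x-a)f(x)=0\ \text{for a.e. } x>a,$$
i.e. the distribution of $X$ can only put mass on $\{0\}$ and $\{a\}$.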
The claimed tightness doesn't seem to hold for a general random variable at all.

Best Answer

Let $a>0$ be fixed. Note that $X-a\mathbf{1}_{X\geq a}\geq 0$, since $X\geq 0$. In the equality case of Markov's inequality, this non-negative r.v. has expectation $0$, thus $X-a\mathbf{1}_{X\geq a} = 0$ a.s., that is $X=a\mathbf{1}_{X\geq a}$ a.s. Hence almost surely $X\in \{0,a\}$.
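
Spelling out why that expectation is $0$ in the equality case:
$$E\big(X-a\mathbf{1}_{X\geq a}\big)=E(X)-a\,P(X\geq a)=0 \quad\text{precisely when}\quad P(X\geq a)=\frac{E(X)}{a}.$$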

Consider $X$ a discrete r.v. such that $P(X=a)=\lambda$ and $P(X=0)=1-\lambda$, with $\lambda\in(0,1)$. It is non-constant, and you can check that $P(X\geq a)=\frac{E(X)}{a}$.
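
Checking this explicitly: $E(X)=a\,P(X=a)+0\cdot P(X=0)=a\lambda$ and $P(X\geq a)=P(X=a)=\lambda$, so
$$P(X\geq a)=\lambda=\frac{a\lambda}{a}=\frac{E(X)}{a},$$
and Markov's inequality is an equality for every choice of $\lambda\in(0,1)$.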

Equality in Chebyshev’s inequality is equality in Markov's inequality applied to the non-negative random variable $(X-E(X))^2$ with threshold $a^2$. So almost surely $(X-E(X))^2\in\{0,a^2\}$, that is, $|X-E(X)|\in \{0,a\}$.
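
Spelling this out:
$$P\big(|X-E(X)|\geq a\big)=P\big((X-E(X))^2\geq a^2\big)\leq\frac{E\big((X-E(X))^2\big)}{a^2}=\frac{V(X)}{a^2},$$
with equality on the left exactly when Markov's inequality for $(X-E(X))^2$ is an equality.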

Consider $X$ a discrete r.v. such that $P(X=a)=P(X=-a)=\frac 12$. It is non-constant and you can check that $P(|X-E(X)|\geq a)=\frac{V(X)}{a^2}$.
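
Checking this explicitly: $E(X)=\frac a2-\frac a2=0$, $V(X)=E(X^2)=a^2$, and $|X-E(X)|=|X|=a$ always, so
$$P\big(|X-E(X)|\geq a\big)=1=\frac{a^2}{a^2}=\frac{V(X)}{a^2},$$
and Chebyshev's inequality is an equality for this $X$.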