Is there any intuition behind Chebyshev's inequality, or is it pure mathematics? What strikes me is that it applies to any random variable, whatever distribution it has.
$$
\Pr(|X-\mu|\geq k\sigma) \leq \frac{1}{k^2}.
$$
Best Answer
The intuition is that expectation preserves pointwise domination: if $g(x) \geq h(x)$ for all $x$, then $E[g(X)] \geq E[h(X)]$, since averaging a larger function yields a larger average. Now apply this intuition to the functions $$g(x) = (x-\mu)^2 ~ \text{and}~ h(x)= \begin{cases}a^2,& |x - \mu| \geq a,\\0, & |x-\mu|< a,\end{cases}$$ where $a > 0$ and where $X$ is a random variable with finite mean $\mu$ and finite variance $\sigma^2$. Since $g(x) \geq h(x)$ for all $x$, this gives $$E[(X-\mu)^2] = \sigma^2 \geq E[h(X)] = a^2P\{|X-\mu|\geq a\}.$$ Finally, set $a = k\sigma$ to get the Chebyshev inequality.
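If it helps to see the bound in action, here is a minimal numerical sketch (the distribution and the values of $k$ are my own illustrative choices, not part of the derivation): draw samples from an arbitrary distribution and compare the empirical tail probability $P\{|X-\mu| \geq k\sigma\}$ with $1/k^2$.

```python
import numpy as np

# Empirical check of Chebyshev's bound on an arbitrary distribution.
# The exponential distribution and the k values below are illustrative choices.
rng = np.random.default_rng(0)
samples = rng.exponential(scale=2.0, size=1_000_000)  # any distribution works
mu, sigma = samples.mean(), samples.std()

for k in (1.5, 2.0, 3.0):
    # Fraction of samples at least k standard deviations from the mean.
    tail = np.mean(np.abs(samples - mu) >= k * sigma)
    print(f"k={k}: P(|X-mu| >= k*sigma) ~= {tail:.4f} <= 1/k^2 = {1/k**2:.4f}")
```

The empirical tail is typically far below $1/k^2$, which illustrates why the bound holds universally: it must cover even the worst-case distribution.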
Note that for a given value of $k$, equality holds in the Chebyshev inequality when there are equal point masses of $\frac{1}{2k^2}$ at $\mu \pm k\sigma$ and a point mass of $1 - \frac{1}{k^2}$ at $\mu$. The central mass contributes nothing to the variance (the moment of inertia about the center of mass), while the two far-away masses each contribute $\left(\frac{1}{2k^2}\right)(k\sigma)^2 = \frac{\sigma^2}{2}$, adding up to the variance $\sigma^2$.
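A short sketch of this extremal three-point distribution, under assumed values $\mu = 0$, $\sigma = 1$, $k = 2$ (chosen only for illustration), confirms that the tail probability meets the bound exactly:

```python
import numpy as np

# The three-point distribution that achieves equality in Chebyshev's bound.
mu, sigma, k = 0.0, 1.0, 2.0  # illustrative values
points = np.array([mu - k * sigma, mu, mu + k * sigma])
probs = np.array([1 / (2 * k**2), 1 - 1 / k**2, 1 / (2 * k**2)])

mean = np.dot(probs, points)               # equals mu
var = np.dot(probs, (points - mean) ** 2)  # equals sigma^2 exactly
tail = probs[0] + probs[2]                 # P(|X - mu| >= k*sigma)
print(mean, var, tail, 1 / k**2)           # tail == 1/k^2: equality holds
```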