[Math] Chebyshev’s inequality for 1 standard deviation results in 0

probability-distributions, standard-deviation, statistics

In applying Chebyshev's inequality to a probability distribution, the following is the given equation:

$$P(\mu - c\sigma \le X \le \mu + c\sigma) \ge 1 - \frac{1}{c^2}$$

This says that for any value of $c$, used as a multiplier of the standard deviation, the probability that $X$ falls within $c$ standard deviations of the mean is at least $1 - \frac{1}{c^2}$. So for $c=2$ the bound is $1 - \frac{1}{2^2} = 0.75$, and for $c=3$ it is $1 - \frac{1}{3^2} \approx 0.89$. However, this also means that for $c=1$, we end up with $0$.

While it is technically true that the probability of falling within 1 standard deviation is at least $0$, unless I am mistaken, doesn't this offer nothing analytically? A lower bound of $0$ just states that the probability of $X$ exists, no?

Furthermore, for values of $c$ less than $1$ the bound becomes negative, so the $\ge$ carries no information there either. So I'm curious whether the value of $c$ in this inequality is limited to integers, or if there is something that I am missing?
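To make the question concrete, here is a quick numerical check (a sketch in Python, using a standard normal as an example distribution, which is my own choice and not part of the inequality itself) comparing the Chebyshev bound with an actual within-$c$-standard-deviations probability:

```python
import math

def chebyshev_bound(c):
    """Chebyshev lower bound on P(mu - c*sigma <= X <= mu + c*sigma)."""
    return 1 - 1 / c**2

def normal_within(c):
    """Exact P(|Z| <= c) for a standard normal Z, via the error function."""
    return math.erf(c / math.sqrt(2))

for c in [0.5, 1, 2, 3]:
    print(f"c={c}: Chebyshev bound = {chebyshev_bound(c):.3f}, "
          f"standard normal actual = {normal_within(c):.3f}")
```

For $c \le 1$ the bound is $\le 0$, so it is vacuously true but uninformative, while the actual normal probabilities are of course positive.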

Best Answer

Chebyshev's Inequality is true for any $c > 0$, but you are right that it only provides useful information for $c > 1$.

This is actually surprisingly easy to prove. Define $\mu = E(X)$ and $\sigma^2 = E((X-\mu)^2)$. Observe that for any $c > 0$ we have $\mathbb{1}\left\{\left|\frac{X-\mu}{\sigma}\right|\geq c\right\} \leq \frac{(X-\mu)^2}{\sigma^2 c^2}$, where $\mathbb{1}\{\cdot\}$ is an indicator function equal to 1 if the event inside the braces occurs and 0 otherwise. (To see this, suppose $\left|\frac{X-\mu}{\sigma}\right| \geq c$. Then the left-hand side equals 1 while the right-hand side is $\geq 1$. If $\left|\frac{X-\mu}{\sigma}\right| < c$, the left-hand side is 0 while the right-hand side is nonnegative.)

Since the expectation operator is monotonic, we can take expectations of both sides to obtain $$ \Pr\left(\left|\frac{X-\mu}{\sigma}\right|\geq c\right) \leq \frac{E\left((X-\mu)^2\right)}{\sigma^2 c^2} = \frac{1}{c^2}. $$ Since $\Pr\left(\left|\frac{X-\mu}{\sigma}\right|\geq c\right) = 1- \Pr\left(\left|\frac{X-\mu}{\sigma}\right| < c\right)$, and $\Pr(\mu- \sigma c \leq X \leq \mu + \sigma c) \geq \Pr\left(\left|\frac{X-\mu}{\sigma}\right| < c\right)$, we can rearrange to obtain the desired inequality.
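The tail form of the inequality is easy to verify by simulation. The following sketch (my own illustration; the exponential distribution is an arbitrary choice with known mean and standard deviation both equal to 1) estimates $\Pr\left(\left|\frac{X-\mu}{\sigma}\right| \geq c\right)$ by Monte Carlo and compares it with $\frac{1}{c^2}$:

```python
import random

random.seed(0)
mu, sigma = 1.0, 1.0          # exponential(rate=1) has mean 1 and std dev 1
n = 100_000
samples = [random.expovariate(1.0) for _ in range(n)]

for c in [1.5, 2, 3]:
    # empirical tail probability P(|X - mu| / sigma >= c)
    tail = sum(abs((x - mu) / sigma) >= c for x in samples) / n
    print(f"c={c}: empirical tail {tail:.4f} <= 1/c^2 = {1 / c**2:.4f}")
```

The empirical tail probabilities sit well below $1/c^2$, as the proof guarantees; Chebyshev is a worst-case bound over all distributions with the given mean and variance, so for any particular distribution it is usually loose.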

The proof illustrates how you could derive many other inequalities of this type. Any nonnegative function $g$ that satisfies

$$\left|\frac{X-\mu}{c\sigma}\right|\geq 1 \Rightarrow g\left(\left|\frac{X-\mu}{c\sigma}\right|\right) \geq 1$$

will work. For example, you could use an identical proof to show $$ \Pr\left(\left|\frac{X-\mu}{\sigma}\right|\geq c\right) \leq \frac{E\left(|X-\mu|^k\right)}{\sigma^k c^k} $$ for all $k > 0$.
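As a numerical illustration of this $k$-th moment variant (again my own sketch, using a standard normal so that $\mu = 0$, $\sigma = 1$, and the true fourth moment $E(|X-\mu|^4) = 3$), one can compare the $k=2$ and $k=4$ bounds:

```python
import random

random.seed(1)
n = 200_000
z = [random.gauss(0.0, 1.0) for _ in range(n)]   # standard normal draws
m4 = sum(x**4 for x in z) / n                    # estimate of E(|X-mu|^4), true value 3

for c in [1.5, 2, 3]:
    k2 = 1 / c**2        # Chebyshev bound (k = 2)
    k4 = m4 / c**4       # fourth-moment bound (k = 4)
    print(f"c={c}: k=2 bound {k2:.3f}, k=4 bound {k4:.3f}")
```

For large $c$ the higher-moment bound decays faster ($c^{-4}$ versus $c^{-2}$) and so is tighter, provided the higher moment exists; for small $c$ the extra factor $E(|X-\mu|^k)/\sigma^k$ can make it looser.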