The usual Markov inequality goes as follows: suppose $X$ is a random variable and $g \geq 0$ is a nonnegative (measurable) function. Then for any $r > 0$,
$$\mathbb{P}(g(X) \geq r) \leq \dfrac{\mathbb{E}[g(X)]}{r}\text{.}$$
Does the following statement still hold true?
$$\mathbb{P}(g(X) > r) < \dfrac{\mathbb{E}[g(X)]}{r}\text{.}$$
For context, suppose $\mathbb{E}[|X|] = 0$ and we want to find $\mathbb{P}(X > 1)$. A solution I read said that since $\mathbb{P}(X > 1) \leq \mathbb{P}(|X| > 1)$, we can show $\mathbb{P}(X > 1) = 0$ by applying the inequality above to the strict event $\{|X| > 1\}$.
I understand how the Markov inequality led to $\mathbb{P}(X > 1) = 0$, but I'm not comfortable using it in the $>$ case without justification.
Best Answer
The strict inequality will not be true in general, because it could be that $\mathbb{P}(g(X)>r)=\mathbb{P}(g(X)\geq r)=\frac{\mathbb{E}[g(X)]}{r}$. For example, if $g(X)=0$ almost surely, then $\mathbb{P}(g(X)>r)=0=\frac{\mathbb{E}[g(X)]}{r}$, so the strict inequality fails.
But you can say that $\mathbb{P}(g(X)>r)\leq \mathbb{P}(g(X)\geq r)\leq \frac{\mathbb{E}[g(X)]}{r}$, which is enough for most applications. In particular, for your example it gives $\mathbb{P}(X>1)\leq \mathbb{P}(|X|\geq 1)\leq \mathbb{E}[|X|]=0$.
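As a quick sanity check, here is a minimal Python sketch (the distribution, $g$, and $r$ are illustrative choices, not from the question) that verifies the non-strict chain $\mathbb{P}(g(X)>r)\leq \mathbb{P}(g(X)\geq r)\leq \mathbb{E}[g(X)]/r$ empirically, and notes the degenerate case where the strict version fails:

```python
import random

# Illustrative setup: X ~ Exp(1), g the identity (g >= 0), and r = 2.
random.seed(0)
samples = [random.expovariate(1.0) for _ in range(100_000)]
r = 2.0

mean_g = sum(samples) / len(samples)                          # estimates E[g(X)]
p_strict = sum(1 for x in samples if x > r) / len(samples)    # estimates P(g(X) > r)
p_weak = sum(1 for x in samples if x >= r) / len(samples)     # estimates P(g(X) >= r)

# The non-strict chain holds: P(g(X) > r) <= P(g(X) >= r) <= E[g(X)]/r.
assert p_strict <= p_weak <= mean_g / r

# Degenerate case: if g(X) = 0 almost surely, then E[g(X)] = 0, so
# P(g(X) > r) = 0 = E[g(X)]/r and the strict "<" fails while "<=" still holds.
```

The degenerate case at the end is exactly the situation in the question, where $\mathbb{E}[|X|]=0$ forces both sides of the bound to be zero.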