Probability that a probability will be less than a certain value

probability, probability theory, statistics, stochastic-analysis, stochastic-processes

Suppose I have a nonnegative random variable $X$ and I don't know its expected value, but I know that its expected value is less than or equal to $a$ with probability at least $p^*$, i.e., $\mathbb{P}(\mathbb{E}(X)\leq a)\geq p^*$. In this scenario, what can I say about $X$? I want to be able to say something like $\mathbb{P}(X\leq \delta)\geq p$. Is there any way I can derive $\delta$ or $p$ from the given information? The following is my attempt, and it looks strange to me. What would be a standard way to make such a statement?

$\mathbb{P}(\mathbb{E}(X)\leq a)\geq p^*$. For a fixed $\epsilon>0$,

$\iff \mathbb{P}\left(\cfrac{\mathbb{E}(X)}{\epsilon}\leq \cfrac{a}{\epsilon}\right)\geq p^*$

By Markov's inequality,
$\implies \mathbb{P}\left(\mathbb{P}\left(X\geq \epsilon\right)\leq \cfrac{a}{\epsilon}\right)\geq p^*$.

Best Answer

Suppose that we have a finite measure space $(\mathbf{X}, \mathcal{X}, \mathbb{Q})$, where each element $X \in \mathbf{X}$ is a random variable $\Omega \to \mathbb{R}$ on a common probability space $(\Omega, \mathcal{F}, \mathbb{P})$. Then, we can write:

$$\mathbb{Q}(\mathbb{E}[X] \le a) \le \mathbb{Q} \left( \mathbb{P}(X \ge \varepsilon) \le \frac{a}{\varepsilon} \right)$$

for any $\varepsilon > 0$, by identical reasoning to that in your post.
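To spell the step out (a sketch, assuming, as in your post, that each $X$ is nonnegative): Markov's inequality holds pointwise for every $X \in \mathbf{X}$, which gives an inclusion of events in $\mathcal{X}$, and the displayed inequality then follows from monotonicity of $\mathbb{Q}$:

$$\mathbb{P}(X \ge \varepsilon) \le \frac{\mathbb{E}[X]}{\varepsilon} \quad \text{for every } X \in \mathbf{X}, \qquad \text{so} \qquad \{X : \mathbb{E}[X] \le a\} \subseteq \left\{X : \mathbb{P}(X \ge \varepsilon) \le \frac{a}{\varepsilon}\right\}.$$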

Your working looks unusual because it uses $\mathbb{P}$ for both the measure on $(\mathbf{X}, \mathcal{X})$ and the one on $(\Omega, \mathcal{F})$, which does not really make sense formally. Moreover, one first needs to be able to define a sensible measure space $(\mathbf{X}, \mathcal{X}, \mathbb{Q})$; for instance, via parametric distributions, as Karl suggests.

You may also find it interesting to consider the product space $(\Omega \times \mathbf{X}, \mathcal{F} \otimes \mathcal{X}, \mathbb{P} \otimes \mathbb{Q})$.
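For instance (a sketch, assuming the evaluation map $(\omega, X) \mapsto X(\omega)$ is jointly measurable), Tonelli's theorem on that product space gives

$$(\mathbb{P} \otimes \mathbb{Q})\big(\{(\omega, X) : X(\omega) \ge \varepsilon\}\big) = \int_{\mathbf{X}} \mathbb{P}(X \ge \varepsilon) \, \mathrm{d}\mathbb{Q}(X),$$

which is one way to make a single quantitative statement that averages over both sources of randomness.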

Edit: If we have a random variable $W : \Lambda \to \mathbb{R}$ defined on another probability space $(\Lambda, \mathcal{L}, \lambda)$, and a family of random variables $X_w : \Omega \to \mathbb{R}$ (for each $w \in W(\Lambda) \subseteq \mathbb{R}$), we could define $(\mathbf{X}, \mathcal{X}, \mathbb{Q})$ with reference to $W$ as:

$$\mathbf{X} = \{X_w : w \in \mathbb{R}\}$$

$$\mathcal{X} = \{X_A : A \in \mathcal{B}(\mathbb{R}) \}$$

where $X_A := \{X_w : w \in A\}$, and $\mathcal{B}(\mathbb{R})$ is the Borel $\sigma$-algebra on $\mathbb{R}$, and

$$\mathbb{Q}(X_A) = \mu_W(A)$$

where $\mu_W = \lambda \circ W^{-1}$ is the law of $W$.
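A small check (under the assumption that $w \mapsto X_w$ is injective, so that $A \mapsto X_A$ is a bijection and $\mathbb{Q}$ is well defined): $\mathbb{Q}$ inherits countable additivity from $\mu_W$, and it is in fact a probability measure, since

$$\mathbb{Q}(\mathbf{X}) = \mathbb{Q}(X_{\mathbb{R}}) = \mu_W(\mathbb{R}) = \lambda(W \in \mathbb{R}) = 1.$$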

Then, for instance:

$$\mathbb{Q}(X : \mathbb{E}[X] \le a) = \mu_W ( w : \mathbb{E}[X_w] \le a) = \lambda(\mathbb{E}[X_W] \le a)$$


For example, suppose that $X \sim \mathcal{U}[0, W]$, where $W \sim \mathcal{U}[0,1]$. Then, $\mathbb{E}[X_w] = \frac{w}{2}$, so for $a \ge 0$,

$$\mathbb{Q}(\mathbb{E}[X] \le a) = \mu_W (\mathbb{E}[X_w] \le a) = \mu_W \left( \frac{w}{2} \le a \right) = \mu_W ( w \le 2a ) = \min(2a,1),$$

using the CDF of $\mathcal{U}[0,1]$ in the last step.
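As a quick sanity check on this example (a minimal Monte Carlo sketch, not part of the construction above; it assumes NumPy is available and the value of $a$ is just illustrative), one can estimate $\mathbb{Q}(\mathbb{E}[X] \le a) = \lambda(W/2 \le a)$ by simulating $W$:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n = 1_000_000
a = 0.3  # any a >= 0

# Outer randomness: W ~ U[0, 1] selects which distribution X_w ~ U[0, w] we get.
w = rng.uniform(0.0, 1.0, size=n)

# E[X_w] = w / 2, so Q(E[X] <= a) = lambda(W / 2 <= a), estimated by a sample mean.
estimate = np.mean(w / 2.0 <= a)
print(estimate, min(2 * a, 1.0))  # estimate should be close to min(2a, 1) = 0.6
```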