Proving That a Version of the Law of Total Probability Follows from Adam’s Law

Tags: conditional-probability, conditional-expectation, probability

I have a homework question that asks:

Show that the following version of LOTP follows from Adam’s law: for any event A and continuous random variable X with PDF $f_X$:

$$ P(A) = \int_{- \infty}^{\infty} P(A|X=x)f_X(x) dx $$

[Edited to add: Adam's law is also known as the law of total expectation and the law of iterated expectation; my text gives it as $E(Y) = E(E(Y|X))$.]
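(Not part of the required proof, but Adam's law is easy to sanity-check numerically. Below is a minimal Python sketch using an assumed toy setup of my own: $X \sim \mathrm{Unif}(0,1)$ and, given $X = x$, $Y \sim \mathrm{Bern}(x)$, so that $E(Y|X) = X$ and both sides of Adam's law equal $E(X) = 1/2$.)

```python
import random

random.seed(1)

# Toy check of Adam's law E(Y) = E(E(Y|X)):
# X ~ Uniform(0, 1) and, given X = x, Y ~ Bernoulli(x),
# so E(Y | X) = X and both sides should be close to E(X) = 1/2.

n = 200_000
xs = [random.random() for _ in range(n)]
ys = [1 if random.random() < x else 0 for x in xs]

lhs = sum(ys) / n  # Monte Carlo estimate of E(Y)
rhs = sum(xs) / n  # Monte Carlo estimate of E(E(Y|X)) = E(X)
print(lhs, rhs)    # both should be close to 0.5
```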

Here is the proof I have written:

$P(A) = E(I_A)$ and $P(A|X = x) = E(I_A|X = x)$ by the fundamental bridge.

Let $E(I_A|X = x) = g(x)$, a function of x, then:

$E(g(X)) = \int_{- \infty}^{\infty} g(x)f_X(x) dx $ (This is a formula I found in my text)

$= E(E(I_A|X)) = E(I_A)$ (by Adam's)

$= P(A)$ (by the bridge)

Therefore,

$P(A) = \int_{- \infty}^{\infty} E(I_A|X)f_X(x) dx = \int_{- \infty}^{\infty} P(A|X)f_X(x) dx $

My main concern is that in this proof, I have dropped the expected value of the indicator variable into the integral, but I believe indicator variables are always discrete. However, I'm not sure how to resolve this: I am asked to connect the given (continuous) formula to Adam's law, which involves expectation, and connecting probability to expectation requires indicator variables.

Best Answer

> Therefore,
>
> $\mathsf P(A) = \int_{- \infty}^{\infty} \mathsf E(I_A|X)f_X(x) dx = \int_{- \infty}^{\infty} \mathsf P(A|X)f_X(x) dx $
>
> My main concern is that in this proof, I have dropped the expected value of the indicator variable into the integral, but I believe indicator variables are always discrete.

It is not a concern.

You are not using the indicator random variable itself, but its conditional expectation, $\mathsf E(\mathrm I_A\mid X)$, and you actually want to use the function of $x$, $\mathsf E(\mathrm I_A\mid X{=}x)$, inside the integral.

$$\begin{align}\mathsf P(A) &= \mathsf E(\mathrm I_A)\\&=\mathsf E(\mathsf E(\mathrm I_A\mid X))\\& = \int_{-\infty}^{\infty} \mathsf E(\mathrm I_A\mid X{=}x)\,f_X(x)~\mathsf dx &~:~& \mathsf E(g(X))=\int_\Bbb R g(x)~f_X(x)~\mathsf d x \\ & = \int_{-\infty}^{\infty} \mathsf P(A\mid X{=}x)\,f_X(x)~\mathsf dx \end{align}$$

In short, the Law of Total Probability is: $\mathsf P(A)=\mathsf E(\mathsf P(A\mid X))$.
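For what it's worth, the identity can also be checked numerically. Here is a minimal Python sketch with an assumed example: $X \sim \mathrm{Expo}(1)$ and $\mathsf P(A\mid X{=}x) = e^{-x}$, so the LOTP integral evaluates to $\int_0^\infty e^{-2x}\, dx = 1/2$, and a Monte Carlo estimate of $\mathsf P(A)$ should agree.

```python
import math
import random

random.seed(0)

# Assumed example: X ~ Exponential(1), and given X = x the event A
# occurs with probability exp(-x).  The LOTP integral is then
#   P(A) = integral of exp(-x) * exp(-x) dx over [0, inf) = 1/2.

def p_a_given_x(x):
    return math.exp(-x)

def f_x(x):
    return math.exp(-x)  # Exponential(1) PDF on [0, inf)

# Numeric version of the LOTP integral (simple Riemann sum; the
# integrand is negligible beyond x = 50).
dx = 1e-4
integral = sum(p_a_given_x(i * dx) * f_x(i * dx) * dx
               for i in range(int(50 / dx)))

# Monte Carlo estimate of P(A) = E(I_A): sample X, then the indicator.
n = 200_000
hits = sum(random.random() < p_a_given_x(random.expovariate(1.0))
           for _ in range(n))
mc = hits / n

print(integral, mc)  # both should be close to 0.5
```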
