Conditional expectation for mixture distribution

bayes-theorem, probability

Suppose I have a continuous random variable $x$ with CDF $G(x)$ and PDF $g(x)$. There is another random variable $s$ which equals $x$ with probability $1-\alpha$, or is again distributed according to $G(x)$ with probability $\alpha$, i.e., the conditional distribution $F(s|x)$ is given by

$$ F(s \mid x) = \alpha G(s) + (1-\alpha)\, \mathbb{1}_{\{s \ge x\}} $$

where $\mathbb{1}_{\{\cdot\}}$ denotes the indicator function.

That is, the distribution of $s$ has both a continuous part and a discrete part.

Now I'm interested in the conditional expectation $\mathbb{E}(x | s)$. For continuous RVs I would simply compute this as

$$ \mathbb{E}(x \mid s) = \int x \, g(x \mid s) \, dx $$

where the conditional density is derived by Bayes' rule as

$$ g(x|s) = \frac{f(s|x) g(x)}{\int f(s|y) dG(y)}$$

where $f(s \mid x)$ denotes the conditional PDF of $s$ given $x$.

However, since $s$ is a mixture, the conditional density can only be one part of the story. How do I factor the discrete part into the conditional expectation?

Best Answer

Okay, I think I finally get the question. Say $X$ has distribution $\mu$, which is absolutely continuous with density $g$ and CDF $G$. Let $\delta_a$ denote the Dirac delta distribution, that is, $\delta_a(A) = \begin{cases} 1 & a \in A \\ 0 & a \not \in A \end{cases}$. Moreover, given the value of $X$, the random variable $S$ is distributed according to $\nu(\cdot \mid X)$, where $\nu(A \mid x) = \alpha \mu(A) + (1-\alpha)\delta_x(A)$.

First look at $S$ alone and find its distribution: $\mathbb P(S \in A) = \mathbb E[\mathbb P(S \in A \mid X)] = \mathbb E[ \nu(A \mid X)] = \alpha \mu(A) + (1-\alpha) \mathbb E[\delta_X(A)] = \alpha \mu(A) + (1-\alpha)\mathbb P(X \in A) = \mu(A)$, so $S$ has exactly the same distribution as $X$.
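This marginal claim is easy to check numerically. A minimal Monte Carlo sketch, where the choices $\mu = \mathcal N(0,1)$ and $\alpha = 0.3$ are illustrative assumptions (the original problem leaves $\mu$ and $\alpha$ abstract):

```python
import random
import statistics

# Illustrative assumptions: mu = standard normal, alpha = 0.3.
random.seed(0)
alpha = 0.3
n = 200_000

xs, ss = [], []
for _ in range(n):
    x = random.gauss(0.0, 1.0)
    # With probability alpha, redraw S from mu; otherwise S = X.
    s = random.gauss(0.0, 1.0) if random.random() < alpha else x
    xs.append(x)
    ss.append(s)

# S should match the N(0,1) marginal of X in its sample moments.
print(round(statistics.mean(ss), 2), round(statistics.stdev(ss), 2))
```

The empirical mean and standard deviation of the `ss` sample come out at roughly 0 and 1, matching the marginal of $X$, as the calculation above predicts.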

Moreover, looking at the joint behaviour: $\mathbb P( (X,S) \in A \times B ) = \mathbb E[ \mathbb P((X,S) \in A \times B \mid X) ] = \mathbb E [\mathbb E[ 1_{ \{(X,S) \in A \times B \}} \mid X ]] = \mathbb E[ 1_{ \{X \in A\}} \mathbb E[ 1_{\{S \in B\}} \mid X]] = \mathbb E[1_{\{X \in A\}} \nu(B \mid X)] = \alpha\mu(B)\mathbb E[1_{\{X \in A\}}] + (1-\alpha)\mathbb E[1_{\{X \in A\}} 1_{\{ X \in B \}}] = \alpha \mu(A)\mu(B) + (1-\alpha) \mu(A \cap B).$
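The joint law can also be verified by simulation. A sketch under the same illustrative assumptions ($\mu$ standard normal, $\alpha = 0.3$), with $A = (-\infty, 0]$ and $B = (-\infty, 1]$, so that $A \cap B = A$:

```python
import math
import random

# Illustrative assumptions: mu = standard normal, alpha = 0.3.
random.seed(2)
alpha = 0.3
n = 500_000

hits = 0
for _ in range(n):
    x = random.gauss(0.0, 1.0)
    # With probability alpha, redraw S from mu; otherwise S = X.
    s = random.gauss(0.0, 1.0) if random.random() < alpha else x
    if x <= 0.0 and s <= 1.0:  # event {X in A, S in B}
        hits += 1

def phi(t):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

empirical = hits / n
# P(X in A, S in B) = alpha*mu(A)*mu(B) + (1-alpha)*mu(A ∩ B)
predicted = alpha * phi(0.0) * phi(1.0) + (1 - alpha) * phi(0.0)
print(round(empirical, 3), round(predicted, 3))
```

The empirical joint probability agrees with the closed-form $\alpha \mu(A)\mu(B) + (1-\alpha)\mu(A \cap B)$ to within Monte Carlo error.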

We can guess the conditional distribution of $X$ given $S$. The guess is:

$\mathbb P(X \in A |S) = \alpha \mu(A) + (1-\alpha)1_{\{S \in A\}}$

Why? Note that it is an $S$-measurable random variable. Moreover, for any set $C = \{S \in B\}$ we get $\mathbb E[(\alpha \mu(A) + (1-\alpha) 1_{\{S \in A\}} ) 1_{\{ S \in B \}}] = \alpha \mu(A) \mathbb P(S \in B) + (1-\alpha) \mathbb P( S \in A \cap B) = \alpha \mu(A)\mu(B) + (1-\alpha)\mu(A \cap B)$, using the fact that $S$ also has distribution $\mu$. This matches $\mathbb P((X,S) \in A \times B)$ computed above, so we get the result:

given $S$, the random variable $X$ is distributed as $\lambda(\cdot \mid S)$, where $\lambda(A \mid s) = \alpha \mu(A) + (1-\alpha) \delta_s(A)$.

Now: $$\mathbb E[X \mid S ] = \int x \, d\lambda(x \mid S) = \int x \, d(\alpha \mu(x) + (1-\alpha) \delta_S(x)) = \alpha \int x \, d\mu(x) + (1-\alpha) \int x \, d\delta_S(x). $$ Since $$ \mathbb E[X] = \int x \, d\mu(x)$$ and $$ \int x \, d\delta_S(x) = S,$$ our result is:

$$ \mathbb E[X|S] = \alpha \mathbb E[X] + (1-\alpha)S$$
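As a sanity check, the defining property of conditional expectation, $\mathbb E[X \, 1_{\{S \le t\}}] = \mathbb E[(\alpha \mathbb E[X] + (1-\alpha)S)\, 1_{\{S \le t\}}]$ for every threshold $t$, can be tested by simulation. Again, $\mu = \mathcal N(0,1)$ and $\alpha = 0.3$ are illustrative assumptions, not part of the original problem:

```python
import random

# Illustrative assumptions: mu = standard normal, alpha = 0.3.
random.seed(1)
alpha = 0.3
n = 500_000

pairs = []
for _ in range(n):
    x = random.gauss(0.0, 1.0)
    # With probability alpha, redraw S from mu; otherwise S = X.
    s = random.gauss(0.0, 1.0) if random.random() < alpha else x
    pairs.append((x, s))

ex = sum(x for x, _ in pairs) / n  # sample estimate of E[X]

# If E[X|S] = alpha*E[X] + (1-alpha)*S, then for every t:
#   E[X * 1_{S <= t}]  ==  E[(alpha*E[X] + (1-alpha)*S) * 1_{S <= t}]
diffs = []
for t in (-1.0, 0.0, 1.0):
    lhs = sum(x for x, s in pairs if s <= t) / n
    rhs = sum(alpha * ex + (1 - alpha) * s for _, s in pairs if s <= t) / n
    diffs.append(abs(lhs - rhs))
    print(t, round(lhs, 3), round(rhs, 3))
```

For each threshold the two sides agree to within Monte Carlo error, consistent with $\mathbb E[X \mid S] = \alpha \mathbb E[X] + (1-\alpha)S$.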
