[Math] Conditional Expectation of Random Sum of Random Variables

conditional-expectation, probability, probability-theory

Given two random variables $X,Z$, both integrable, the conditional expectation $E[X | Z]$ is defined as a random variable which is $\sigma(Z)$-measurable, and satisfies $\int_A E[X | Z] dP = \int_{A} X dP$ for all $A \in \sigma(Z)$, which is the sigma algebra generated by $Z$.

My question is: we know that conditional expectation is linear, but what if the number of terms, i.e. the factor of linearity, itself depends on a random variable occurring inside the expectation?

For example, suppose that $Y$ is a random variable with finite variance, and $N$ is a nonnegative integer-valued random variable, also with finite variance. I want to show that if $Z = \sum_{i=1}^N Y_i$, where the $Y_i$ are i.i.d. copies of $Y$, independent of $N$, then $E[Z \mid N] = E[Y]\,N$.

So, is the following permitted?
$$
E[Z \mid N] \stackrel{?}{=} \sum_{i=1}^N E[Y_i \mid N] = \underbrace{E[Y] + \dots + E[Y]}_{N \text{ times}} = E[Y]\,N
$$

If not, how should I approach this problem? Proceeding directly from the definition also leaves me stuck.
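Before attempting a proof, the conjectured identity $E[Z \mid N = n] = n\,E[Y]$ can be sanity-checked numerically. A minimal Monte Carlo sketch, with an illustrative choice of distribution ($Y \sim \mathrm{Exp}(1)$, so $E[Y] = 1$; the distribution is an assumption, not from the question) and with $N$ fixed at $n$, since conditioning on $N$ amounts to fixing its value:

```python
import random

def sample_Z_given_n(n, trials=200_000):
    """Estimate E[Z | N = n] by averaging Z = Y_1 + ... + Y_n over many
    trials, with Y_i i.i.d. Exp(1)."""
    total = 0.0
    for _ in range(trials):
        total += sum(random.expovariate(1.0) for _ in range(n))
    return total / trials

random.seed(0)
for n in (1, 3, 7):
    est = sample_Z_given_n(n)
    print(n, round(est, 2))  # should be close to n * E[Y] = n
```

Each estimate should land near $n \cdot E[Y] = n$, consistent with the conjecture.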

Best Answer

The short answer is yes, the computation can be justified; random sums of this type arise, for example, in the compound Poisson distribution (where $N$ is Poisson).

Observe that, for each $n$ with $P(N = n) > 0$, $$ \begin{align} E[Z \mid N = n] &= E[Y_1 + \dots + Y_n \mid N = n] \\ &= E[Y_1 + \dots + Y_n] \\ &= E[Y_1] + \dots + E[Y_n] \\ &= n \cdot E[Y] \end{align} $$ where the second equality uses the independence of $(Y_i)$ and $N$, and the third is ordinary linearity of expectation applied to a *fixed* number $n$ of terms, which sidesteps the random upper limit entirely. Since this holds for every such $n$, we conclude $E[Z \mid N] = N \cdot E[Y]$.
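Taking expectations on both sides of $E[Z \mid N] = N \cdot E[Y]$ gives Wald's identity $E[Z] = E[N]\,E[Y]$, which is easy to check by simulation. A sketch using a compound Poisson sum, as mentioned above, with illustrative parameter choices ($N \sim \mathrm{Poisson}(4)$ and $Y \sim \mathrm{Exp}(1)$ are assumptions, not from the question):

```python
import math
import random

def poisson(lam):
    """Sample Poisson(lam) via Knuth's multiplication method."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def mean_Z(lam, trials=100_000):
    """Estimate E[Z] for Z = Y_1 + ... + Y_N with N ~ Poisson(lam),
    Y_i i.i.d. Exp(1)."""
    total = 0.0
    for _ in range(trials):
        n = poisson(lam)
        total += sum(random.expovariate(1.0) for _ in range(n))
    return total / trials

random.seed(1)
print(round(mean_Z(4.0), 2))  # should be close to E[N] * E[Y] = 4
```

The estimate should land near $E[N]\,E[Y] = 4 \cdot 1 = 4$, in agreement with the derivation.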
