Computing conditional expectation

Tags: conditional probability, conditional expectation, probability, probability theory

Consider a collection $(X_i)$ of i.i.d. random variables in $L^1(\mathbb{P})$, let $N$ be a Poisson($1$)-distributed random variable independent of the collection $(X_i)$, and set $Z := \sum^N_{i=1} X_i$. Compute $\mathbb{E}(Z|N = n)$ for $n\in \{0,1,2,\dots\}$, and compute the distribution function of the random variable $\mathbb{E}(Z|N)$.

The solution to the first part is given as follows:

Let $S_N=\sum^N_{i=1} X_i$ (so $S_N=Z$). Note that the number of terms in the sum is random. Given that $N=n$, $S_N$ is a sum with a fixed number of terms, $\sum^n_{i=1} X_i$, and since the $X_i$ are independent of $N$, $\mathbb{E}(S_N|N=n)=n\mathbb{E}(X_1)$. Thus $\mathbb{E}(S_N|N)=N\mathbb{E}(X_1)$.
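
A minimal simulation sketch of this formula (not part of the original solution), under the arbitrary assumptions $X_i\sim$ Exponential with mean $2$ and $N\sim$ Poisson($1$):

```python
import numpy as np

# Monte Carlo check of E(Z | N = n) = n * E(X_1) for the random sum Z = X_1 + ... + X_N.
# Assumptions for illustration only: X_i ~ Exponential with mean 2, N ~ Poisson(1).
rng = np.random.default_rng(0)
n_samples = 200_000
mean_x = 2.0

N = rng.poisson(1.0, size=n_samples)
Z = np.array([rng.exponential(mean_x, size=n).sum() for n in N])

for n in range(4):
    mask = N == n
    print(f"n={n}: empirical E(Z|N=n) = {Z[mask].mean():.3f}, "
          f"predicted n*E(X_1) = {n * mean_x:.3f}")
```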

I know this: let all random variables below be defined on a probability space $(\Omega,F, \mathbb{P})$. Let $X_1$, $X_2$, $\dots$ be independent and identically distributed random variables in $L^1(\mathbb{P})$ taking values in $\{0, 1, 2,\dots\}$. Suppose that $T$ is an integrable random variable taking values in $\{0,1,2,\dots\}$, independent of $(X_1,X_2,\dots)$, and set, for every $\omega\in\Omega$, $S(\omega) := \sum^{T(\omega)}_{i=1} X_i(\omega)$. Then $\mathbb{E}(S)=\mathbb{E}(T)\mathbb{E}(X_1)$. $(*)$
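
A quick numerical check of $(*)$, with distributions chosen arbitrarily for illustration ($T$ a geometric shifted to $\{0,1,2,\dots\}$ with mean $3$, $X_i\sim$ Poisson($3$)):

```python
import numpy as np

# Sanity check of (*): E(S) = E(T) * E(X_1) for a random sum with T independent of the X_i.
# Assumed distributions (illustration only): T = Geometric(0.25) - 1, so E(T) = 3,
# and X_i ~ Poisson(3), so E(X_1) = 3.
rng = np.random.default_rng(1)
n_samples = 200_000

T = rng.geometric(0.25, size=n_samples) - 1        # values in {0, 1, 2, ...}
S = np.array([rng.poisson(3.0, size=t).sum() for t in T])

print("empirical E(S):", round(S.mean(), 3))       # should be close to E(T)*E(X_1) = 9
```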

My attempt at the first part:

$\mathbb{E}(Z|N=n)\stackrel{\text{def}}{=}\frac{\mathbb{E}(Z\mathbb{1}_{\{N=n\}})}{\mathbb{P}(N=n)}\stackrel{X_i\text{ indep. of }N}{=}\frac{\mathbb{E}(Z)\,\mathbb{E}(\mathbb{1}_{\{N=n\}})}{\mathbb{P}(N=n)}=\mathbb{E}(Z)\stackrel{(*)}{=}\mathbb{E}(N)\mathbb{E}(X_1)\stackrel{N\sim\mathrm{Pois}(1)}{=}\mathbb{E}(X_1)$

But the result I get does not depend on $n$ at all, so $\mathbb{E}(Z|N)$ would be a constant rather than a genuine random variable depending on $N$, even though conditional expectations are random variables. I don't see where the argument goes wrong.

Best Answer

There is a mistake in the second equality of the display. The correct value is $$E(Z 1_{\{N=n\}})=nE(X_1)P(N=n),$$ so $E(Z|N=n)=nE(X_1)$ and $E(Z|N)=NE(X_1)$.
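
Spelling out that computation: on the event $\{N=n\}$ the sum $Z$ has exactly $n$ terms, and $1_{\{N=n\}}$ is independent of $(X_1,\dots,X_n)$, so
$$E(Z 1_{\{N=n\}})=E\Big(\Big(\sum_{i=1}^{n}X_i\Big)1_{\{N=n\}}\Big)=E\Big(\sum_{i=1}^{n}X_i\Big)P(N=n)=nE(X_1)P(N=n).$$
Dividing by $P(N=n)$ gives $E(Z|N=n)=nE(X_1)$.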

While $N$ is independent of the $X_i$, it is not independent of $Z$, since $N$ determines how many terms appear in the sum defining $Z$.
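
A short simulation (again under the arbitrary assumption $X_i\sim$ Exponential with mean $2$) illustrates this: the empirical value of $E(Z 1_{\{N=n\}})$ matches $nE(X_1)P(N=n)$, not the $E(Z)P(N=n)$ that the attempted factorization would give.

```python
import numpy as np

# Compare E(Z * 1_{N=n}) with the two candidate factorizations.
# Assumptions (illustration only): X_i ~ Exponential with mean 2, N ~ Poisson(1), n = 2.
rng = np.random.default_rng(2)
n_samples = 500_000
mean_x, n = 2.0, 2

N = rng.poisson(1.0, size=n_samples)
Z = np.array([rng.exponential(mean_x, size=k).sum() for k in N])

p_n = (N == n).mean()                                           # empirical P(N = n)
print("E(Z 1_{N=n})         :", round((Z * (N == n)).mean(), 3))
print("n E(X_1) P(N=n)      :", round(n * mean_x * p_n, 3))     # correct factorization
print("E(Z) P(N=n)  (wrong) :", round(Z.mean() * p_n, 3))       # attempted factorization
```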
