Random sum of a function of Poisson random variables

conditional-expectation, poisson-distribution, probability, probability-distributions, probability-theory

Let $N$ be Poisson distributed with parameter $\lambda$, and let $S = \sum\limits_{r=1}^{N}{X_r}$, where $\{X_r: r \geq 0\}$ are i.i.d. nonnegative integer-valued random variables. Assume that the relevant expectations exist and that $N$ is independent of $\{X_r\}_{r \geq 0}$. Show that for any function $g$,
\begin{align*}
\mathbb{E}[S g(S)] = \lambda \mathbb{E}[g(S+X_0) X_0].
\end{align*}
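Before attempting a proof, here is a minimal Monte Carlo sanity check of the claimed identity. The choices $\lambda = 2$, $X_r \sim \text{Poisson}(0.7)$, and $g(x) = x^2$ are assumptions made purely for illustration:

```python
import numpy as np

# Minimal Monte Carlo sanity check of E[S g(S)] = lambda * E[X_0 g(S + X_0)].
# The values of lam and mu, the law X_r ~ Poisson(mu), and g(x) = x**2 are
# illustrative assumptions only.
rng = np.random.default_rng(0)
lam, mu, n_sim = 2.0, 0.7, 200_000

def g(x):
    return x ** 2

N = rng.poisson(lam, size=n_sim)
# Given N, the sum of N i.i.d. Poisson(mu) variables is Poisson(N * mu).
S = rng.poisson(mu * N)
X0 = rng.poisson(mu, size=n_sim)        # independent extra copy X_0

lhs = np.mean(S * g(S))                 # estimates E[S g(S)]
rhs = lam * np.mean(X0 * g(S + X0))     # estimates lambda * E[X_0 g(S + X_0)]
print(lhs, rhs)                         # the two estimates should roughly agree
```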

I started by conditioning on $N=k$, but I ended up with:
\begin{align*}
\mathbb{E}[S g(S)] = \sum_{k=0}^{+\infty}{\frac{\lambda^k}{k!} e^{-\lambda} \sum_{i=1}^{k}{\mathbb{E}\left[X_i g\left(\sum_{j=1}^{k}{X_j}\right)\right]}}.
\end{align*}
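(This expression comes from the tower property $\mathbb{E}[S g(S)] = \mathbb{E}\big[\mathbb{E}[S g(S) \mid N]\big]$ together with the independence of $N$ and the $X_r$.) I am not sure how to proceed from here.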

Best Answer

Since the random variables $X_i$, $i \in \{0,\ldots,k\}$, are independent and identically distributed, we have

$$\mathbb{E} \left( X_i g \left( \sum_{j=1}^k X_j \right) \right) = \mathbb{E}\left(X_1 g \left( \sum_{j=1}^k X_j \right) \right) = \mathbb{E} \left( X_0 g \left( \sum_{j=0}^{k-1} X_j \right) \right),$$

in particular, the left-hand side does not depend on $i$. Hence,

$$\sum_{i=1}^k \mathbb{E} \left( X_i g \left( \sum_{j=1}^k X_j \right) \right) = k \mathbb{E}\left(X_0 g \left( \sum_{j=0}^{k-1} X_j \right) \right)$$

and so

$$\begin{align*} \mathbb{E}(S g(S)) &= \sum_{k=1}^{\infty} k \, e^{-\lambda} \frac{\lambda^k}{k!} \mathbb{E}\left(X_0 g \left( \sum_{j=0}^{k-1} X_j \right) \right) \\ &= \lambda \sum_{m=0}^{\infty} e^{-\lambda} \frac{\lambda^{m}}{m!} \mathbb{E}\left(X_0 g \left( \sum_{j=0}^{m} X_j \right) \right),\end{align*}$$

where the second line uses $k/k! = 1/(k-1)!$ and the substitution $m = k-1$.

Since $N$ is Poisson distributed with parameter $\lambda$ and independent of $(X_i)_{i \geq 0}$, conditioning on $N = m$ shows that the last sum equals $\mathbb{E}(X_0 \, g(X_0 + S))$ (given $N = m$, we have $X_0 + S = \sum_{j=0}^{m} X_j$). This yields

$$\mathbb{E}(S g(S)) = \lambda \mathbb{E}(X_0 g(X_0 +S)) .$$
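As a quick sanity check, taking $g \equiv 1$ reduces the identity to $\mathbb{E}(S) = \lambda \, \mathbb{E}(X_0)$, which is Wald's identity for a Poisson number of summands, since $\mathbb{E}(N) = \lambda$.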