[Math] Question about probability of sum of random number of random variables

probability

Let $X_1, X_2,\dots$ be independent and identically distributed random variables with $P(X_i = 1) = p$ and $P(X_i = 0) = 1-p$. Let $N$ be a Poisson random variable with parameter $\lambda$, independent of the $X_i$'s. Find $P(S_N = k)$ for $k=0,1,2,\dots$ where $S_N = X_1 + X_2 + \cdots + X_N$.

I'm not sure how to get started on this homework problem. We have been doing a lot of work with conditional expectation and variance and the laws of total probability and total expectation, but I don't see how those tie into this question. I understand that we know the probabilities for the $X_i$ and for $N$, and that $E[S_N] = E[X]E[N]$.

Any help on how to find $P(S_N = k)$ will be much appreciated. Thanks in advance.

Best Answer

Given that $N=n$, the probability that $S_n=k$ is $$\binom{n}{k}p^k(1-p)^{n-k}.$$ Thus by the Law of Total Probability, $$\Pr(S_N=k)=\sum_{n=k}^\infty \binom{n}{k}p^k(1-p)^{n-k}\Pr(N=n).$$ We know an explicit formula for the probability that $N=n$. Substituting we get $$\Pr(S_N=k)=\sum_{n=k}^\infty \frac{n!}{k!(n-k)!}p^k(1-p)^{n-k}e^{-\lambda} \frac{\lambda^n}{n!}.$$ Change the index of summation, using $w=n-k$. Then after some cancellation our sum can be rewritten as $$e^{-\lambda}\cdot \frac{1}{k!}\cdot p^k\lambda^k\sum_{w=0}^\infty \frac{(\lambda(1-p))^w}{w!}.$$ We recognize the inner sum as $e^{\lambda(1-p)}$. Now a bit of manipulation yields $$e^{-\lambda p} \frac{(\lambda p)^k}{k!},$$ a very nice expression!
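The final expression is the pmf of a Poisson random variable with parameter $\lambda p$, i.e. $S_N \sim \text{Poisson}(\lambda p)$ (this is often called Poisson thinning). If you want to convince yourself numerically, here is a minimal Monte Carlo sketch; it assumes NumPy and SciPy are available, and the values $\lambda = 4$, $p = 0.3$ are arbitrary choices for illustration.

```python
# Monte Carlo sanity check that S_N ~ Poisson(lambda * p).
# lam = 4.0 and p = 0.3 are illustrative values, not from the problem.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
lam, p, trials = 4.0, 0.3, 200_000

N = rng.poisson(lam, size=trials)   # N ~ Poisson(lam)
S = rng.binomial(N, p)              # given N = n, S_N is Binomial(n, p)

for k in range(6):
    empirical = np.mean(S == k)
    theoretical = poisson.pmf(k, lam * p)   # e^{-lam p} (lam p)^k / k!
    print(f"k={k}: empirical {empirical:.4f}  vs  theoretical {theoretical:.4f}")
```

The empirical frequencies should match the $\text{Poisson}(\lambda p)$ probabilities to within sampling error, which mirrors the conditioning argument above: drawing $N$ first and then a Binomial$(N,p)$ count is exactly how $S_N$ is built.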