Probability Distributions – Covariance of Poisson and Conditional Binomial Random Variables

binomial distribution, covariance, poisson distribution, self-study

Problem Statement

Let $X$ and $Y$ be random variables such that $X \sim \text{Poisson}(\lambda)$ and $Y \mid X \sim \text{Binomial}(X+1,p)$. Find $\operatorname{Cov}(X,Y)$.

Attempt at a Solution

I would like to be able to write $Y$ as $X + Z$ where $Z$ is Poisson and independent of $X$, since then by this question the variance is easy to compute. To this end, write $Y|X$ as a sum of independent binomials:
$$
Y|X = Z_1|X + Z_2|X,
$$
where
$Z_1|X \sim \text{Binomial}(1,p)=\text{Bernoulli}(p)$ and $Z_2 \mid X \sim \text{Binomial}(X,p).$ Then unconditionally $Z_2 \sim \text{Poisson}(\lambda p)$, which we can obtain by directly computing the pmf:
$$\begin{aligned}
P(Z_2 = z) &= \sum_x P(Z_2=z, X=x)\\
&= \sum_x P(Z_2=z\mid X=x)P(X=x) \\
&= \sum_{x=z}^\infty \frac{e^{-\lambda} \lambda^x}{x!}\frac{x!}{z!(x-z)!}p^z(1-p)^{x-z} \\
&= \frac{(\lambda p)^z}{z!}e^{-\lambda p} \sum_{x=z}^\infty \frac{[\lambda(1-p)]^{x-z} e^{-\lambda(1-p)}}{(x-z)!} \\
&= \frac{(\lambda p)^z}{z!}e^{-\lambda p}.
\end{aligned}$$
However, I cannot get the same approach to simplify for $Z_1|X$, which instead involves the CDF of $X$. I also tried this approach on $Y|X$ directly, hoping to obtain a sum in the form of the expectation of $X$, but I can't seem to reduce it. Even if I could, I'd still need to compute $E[XY]$ for the covariance. I suspect there may be a better approach than grinding through several more sums.
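As a numerical sanity check on the thinning computation above, here is a short simulation sketch; the values of $\lambda$ and $p$ are arbitrary choices of mine:

```python
import numpy as np

# Monte Carlo check of the thinning result: if X ~ Poisson(lam) and
# Z2 | X ~ Binomial(X, p), then Z2 ~ Poisson(lam * p).
# (lam = 4.0 and p = 0.3 are arbitrary illustrative values.)
rng = np.random.default_rng(0)
lam, p, n = 4.0, 0.3, 1_000_000

x = rng.poisson(lam, size=n)
z2 = rng.binomial(x, p)      # Binomial(X, p), vectorized over the samples of X

# A Poisson(lam * p) variable has mean and variance both equal to lam * p.
print(z2.mean(), z2.var())   # both should be close to lam * p = 1.2
```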

Best Answer

Hint:

$X$ can take on integer values in $[0,\infty)$. Given any value of $X$, say $X = k$, $Y$ can take on any integer value in $[0,k+1]$. Thus, for any given integer $k \in [0,\infty)$, \begin{align}P\{X = k, Y = \ell\} &= P\{X = k\}P\{Y = \ell\mid X = k\}\\ &= \begin{cases}\displaystyle e^{-\lambda}\frac{ \lambda^k}{k!} \cdot \binom{k+1}{\ell}p^{\ell}(1-p)^{k+1-\ell}, & 0 \leq \ell \leq k+1,\\ \\ 0, & \ell > k+1,\end{cases}\end{align} and you can compute $\operatorname{cov}(X,Y)$ from the joint mass function.


More simply, use the law of iterated expectation to get that \begin{align}E[XY] &= E\left[E[XY\mid X]\right]\\ &= E\left[X\cdot E[Y\mid X ]\right]\\ &= E[X (X+1)p],\end{align} where I will leave the computation of that last expectation of a function of $X$ to you. Note that $E[X]$ is known, and there is a hidden nugget in the above series of equalities that will tell you what $E[Y]$ is. Put all this together with $\operatorname{cov}(X,Y) = E[XY]-E[X]E[Y]$ to find the answer that you seek.
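Once you have a closed form, you can sanity-check it numerically by simulating the full model; the parameter values below are arbitrary:

```python
import numpy as np

# Simulate X ~ Poisson(lam) and Y | X ~ Binomial(X + 1, p), then estimate
# cov(X, Y) empirically. (lam = 4.0 and p = 0.3 are arbitrary choices.)
rng = np.random.default_rng(1)
lam, p, n = 4.0, 0.3, 1_000_000

x = rng.poisson(lam, size=n)
y = rng.binomial(x + 1, p)   # Y | X ~ Binomial(X + 1, p)

cov_xy = np.cov(x, y)[0, 1]  # sample covariance
print(cov_xy)                # compare against your closed-form answer
```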
