Conditional probability mass function of $X$ given $X+Y=s$ for independent binomials

conditional probability, expected value

Suppose $X$ and $Y$ are independent, where $X$ is binomial based on $m$ trials and success probability $\theta$, and $Y$ is binomial based on $n$ trials and success probability $\theta$.

(i) What is the conditional probability mass function of $X$ given that $X+Y =s$?

Confusion in calculating the conditional probability mass function

How do they calculate that total probability in the denominator? I have this as the numerator:

\begin{equation}
\left[\binom{m}{x}\, \theta^x (1-\theta)^{m-x}\right] \left[\binom{n}{s-x}\, \theta^{s-x} (1-\theta)^{n-(s-x)}\right]
\end{equation}

Should I follow suit and break apart the denominator in a similar way, using $i$ and $(s-i)$? When I do that, I can't tell whether something is supposed to cancel.

Any hints for what's to come in parts (ii) and (iii) below are also appreciated.

(ii) What is $E(X|X + Y )$?

(iii) What is $Var(X|X + Y)$?

Best Answer

A way to simplify the denominator is to recognize that $X+Y \sim \text{Binomial}(m+n, \theta)$, so the denominator can be written as $\binom{n+m}{s} \theta^s (1-\theta)^{n+m-s}$. In the linked question they write the denominator as a sum by iterating over the events $\{X=i, Y=s-i\}$, which is equivalent but harder to simplify.
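As a quick numerical sanity check of this simplification (a sketch with arbitrarily chosen parameters), summing $P(X=i,\,Y=s-i)$ over the valid $i$ should reproduce the closed-form Binomial$(m+n,\theta)$ probability; Vandermonde's identity is what makes the sum collapse:

```python
# Check that the sum over {X=i, Y=s-i} equals the Binomial(m+n, theta)
# probability of {X+Y=s}. Parameters m, n, theta, s are arbitrary choices.
from math import comb

m, n, theta, s = 7, 5, 0.3, 6

sum_form = sum(
    comb(m, i) * theta**i * (1 - theta)**(m - i)          # P(X=i)
    * comb(n, s - i) * theta**(s - i) * (1 - theta)**(n - (s - i))  # P(Y=s-i)
    for i in range(max(0, s - n), min(m, s) + 1)          # valid range of i
)
closed_form = comb(m + n, s) * theta**s * (1 - theta)**(m + n - s)

assert abs(sum_form - closed_form) < 1e-12
```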

In the end you should get $$P(X=x \mid X+Y=s) = \frac{\binom{m}{x} \binom{n}{s-x}}{\binom{n+m}{s}},$$ i.e. the conditional distribution is hypergeometric. For (ii) and (iii) you can then cite or refer to standard derivations of the mean and variance of that distribution.
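The whole identity can be verified numerically (a sketch; the parameter values are arbitrary): computing the conditional pmf directly from the two binomial pmfs should match the hypergeometric pmf $\binom{m}{x}\binom{n}{s-x}/\binom{m+n}{s}$ at every support point:

```python
# Verify P(X=x | X+Y=s) = C(m,x) C(n,s-x) / C(m+n,s) for all valid x.
# Parameters m, n, theta, s are arbitrary choices for the check.
from math import comb

m, n, theta, s = 7, 5, 0.3, 6

def binom_pmf(k, trials, p):
    """Binomial(trials, p) pmf at k."""
    return comb(trials, k) * p**k * (1 - p)**(trials - k)

for x in range(max(0, s - n), min(m, s) + 1):
    # Direct computation: P(X=x) P(Y=s-x) / P(X+Y=s), using X+Y ~ Bin(m+n, theta)
    direct = (binom_pmf(x, m, theta) * binom_pmf(s - x, n, theta)
              / binom_pmf(s, m + n, theta))
    # Hypergeometric pmf: population m+n, m "successes", s draws
    hyper = comb(m, x) * comb(n, s - x) / comb(m + n, s)
    assert abs(direct - hyper) < 1e-12
```

Note that $\theta$ cancels out entirely, which is why the hypergeometric pmf has no $\theta$ in it. For (ii), the standard hypergeometric mean then gives $E(X \mid X+Y=s) = s\,\frac{m}{m+n}$.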