[Math] Conditional Probability Involving Sum of Bernoulli Trials

binomial-distribution, probability, probability-theory

I have independent random variables $X_i$ $(i=1,\dots,n)$, each with a $\text{Bernoulli}(p)$ (equivalently, $\text{Binomial}(1,p)$) distribution: $X_i=1$ with probability $p$ and $X_i=0$ with probability $1-p$. I know that $S_n=X_1+\dots+X_n$ follows a $\text{Binomial}(n,p)$ distribution.

I am having trouble finding the conditional probability of $X_1=1$ given $S_n=k$. I am using the formula:
$$P(X =x|Y =y)= P(X =x\text{ and }Y =y)/P(Y = y).$$
But I am having trouble finding the numerator. I am new to probability, so any hints and suggestions would be appreciated.

Best Answer

The probability that $S_n = k$ is of course $\binom{n}{k}p^k(1-p)^{n-k}$.

For $X_1 = 1$ and $S_n = k$ to hold simultaneously, two things must happen: $X_1 = 1$ (probability $p$), and exactly $k-1$ of the remaining $n-1$ variables must be $1$ (probability $\binom{n-1}{k-1}p^{k-1}(1-p)^{(n-1)-(k-1)}$). Since the $X_i$ are independent, the numerator is the product of these two probabilities.

So the conditional probability you want is $$ \frac{p \binom{n-1}{k-1}p^{k-1}(1-p)^{n-k}}{\binom{n}{k}p^k(1-p)^{n-k}}= \frac{\binom{n-1}{k-1}}{\binom{n}{k}}=\frac{(n-1)!}{(k-1)!(n-k)!}\frac{k!(n-k)!}{n!} = \frac{k}{n} $$
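This can be sanity-checked numerically. A minimal sketch, both by direct computation of the binomial coefficients and by Monte Carlo simulation; the parameter values $n=10$, $p=0.3$, $k=3$ are illustrative choices, not from the original post:

```python
# Verify P(X_1 = 1 | S_n = k) = k/n, exactly and by simulation.
# Parameter values below are illustrative assumptions.
import math
import random

n, p, k = 10, 0.3, 3

# Exact: the conditional probability reduces to C(n-1, k-1) / C(n, k).
exact = math.comb(n - 1, k - 1) / math.comb(n, k)
print(exact, k / n)  # both give 0.3

# Simulation: generate Bernoulli(p) trials, keep only samples with
# S_n = k, and estimate how often X_1 = 1 among those.
random.seed(0)
hits = total = 0
for _ in range(200_000):
    xs = [1 if random.random() < p else 0 for _ in range(n)]
    if sum(xs) == k:
        total += 1
        hits += xs[0]
print(hits / total)  # close to k/n = 0.3
```

The cancellation in the derivation is what the first check confirms: the $p$ and $(1-p)$ factors drop out entirely, leaving a ratio of binomial coefficients that does not depend on $p$.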

Now you can slap your forehead: by symmetry, given that $k$ of the $n$ variables equal $1$, the probability that any particular one of them equals $1$ must be $\frac{k}{n}$. All that work for such an easy result!