Let $X_1, X_2$ be independent and identically distributed Bernoulli random variables with parameter $p \in (0,1)$. Prove that their sum $Y = X_1 + X_2$ is a binomial random variable with parameters $2$ and $p$.
[Math] Prove that the sum of two Bernoulli random variables is binomial
binomial-distribution, probability
Related Solutions
Note that $X=(X_1,\ldots,X_k)=Y_1+\cdots+Y_n$, where the $Y$'s are i.i.d. with $$Y_1=(X_{1,1},\ldots,X_{k,1}) \sim \text{Multinomial}(p_1,\ldots,p_k,1).$$ Denote $S_{j,1}=X_{1,1}+\cdots+X_{j,1}$ for $j<k$. Since a single multinomial trial places its one success in exactly one category, $S_{k,1}=1$ almost surely, so \begin{align*} P(S_{j,1}=1) &= P(S_{j,1}=1,\,S_{k,1}=1)=P\left(\bigcup_{m=1}^j\{X_{m,1}=1,\ X_{l,1}=0 \ \forall l\neq m\}\right) \\ &=\sum_{m=1}^j P(X_{m,1}=1,\ X_{l,1}=0 \ \forall l\neq m) \\ &= \sum_{m=1}^j p_m \end{align*} and $$P(S_{j,1}=0)=1-\sum_{m=1}^j p_m.$$ Hence $$S_{j,i} \sim \text{Ber}\left(\sum_{m=1}^j p_m\right) \quad\quad \text{ for } i\in\{1,\ldots,n\},$$ and these are mutually independent by the independence of the $Y$'s. Now note for $j<k$ that $$X_1+\cdots+X_j=S_{j,1}+\cdots+S_{j,n}\sim \text{Bin}\left(n,\sum_{m=1}^j p_m\right).$$
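A small simulation can illustrate the conclusion above. The parameters below ($k=4$ categories, $j=2$, $n=10$ trials, and the probabilities themselves) are arbitrary choices for the sketch, not from the original argument: it draws multinomial samples trial by trial and checks that the partial count $X_1+\cdots+X_j$ has the mean and variance of $\text{Bin}(n, p_1+\cdots+p_j)$.

```python
# Sanity-check sketch: partial category counts of a Multinomial(n; p_1,...,p_k)
# draw should behave like Binomial(n, p_1 + ... + p_j).
import random

random.seed(0)
p = [0.1, 0.2, 0.3, 0.4]      # assumed category probabilities (k = 4)
j, n, reps = 2, 10, 50_000    # sum the first j category counts over n trials
q = sum(p[:j])                # predicted success probability, here 0.3

samples = []
for _ in range(reps):
    count = 0
    for _ in range(n):
        u, acc = random.random(), 0.0
        for idx, pi in enumerate(p):   # locate this trial's category
            acc += pi
            if u < acc:
                if idx < j:            # trial landed in one of the first j
                    count += 1
                break
    samples.append(count)

mean = sum(samples) / reps
var = sum((s - mean) ** 2 for s in samples) / reps
print(round(mean, 2), round(var, 2))   # close to n*q = 3.0 and n*q*(1-q) = 2.1
```

The empirical mean and variance should land near $nq$ and $nq(1-q)$, consistent with the binomial claim.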
The answer is no.
Consider the probability space $(\Omega, \mathcal{F}, \mathbb{P})$, where $\Omega = \{0,\ldots,n\}$, $\mathcal{F} = \mathcal{P}(\Omega)$ and $\mathbb{P}: \mathcal{F} \to [0,1]$ defined as $\mathbb{P}(A) = \sum\limits_{i \in A} \binom ni p^i(1-p)^{n-i}$ for every $A \in \mathcal{F}$.
Consider now $X: \Omega \to \{0,\ldots, n\}$ as $X(i)=i$ (the identity function).
The distribution of $X$ is clearly binomial.
Suppose now that $X = X_1 + \ldots + X_n$ with $X_i$ iid with law Bernoulli of parameter $p$, all defined on $(\Omega, \mathcal{F}, \mathbb{P})$.
This would mean that there is $A \in \mathcal{F}$ such that $\mathbb{P}(A) = p$.
And this must be true for every $p \in (0,1)$.
Consider a map $F: (0,1) \to \mathcal{P}(\{0,\ldots,n\})$ that assigns to each $p$ such a set $A$. Since the domain is infinite and the codomain is finite, there must be a fixed $A \subseteq \{0,\ldots,n\}$ such that for infinitely many $p$ we have $$\sum\limits_{i \in A} \binom ni p^i(1-p)^{n-i} = p.$$ Two polynomials that agree at infinitely many points are identical, so $\sum\limits_{i \in A} \binom ni x^i(1-x)^{n-i} = x$ for every real $x$.
And now, for example with $n=2$, the LHS can only be one of $0,1,x^2,(1-x)^2,2x(1-x),1-x^2,1-(1-x)^2,1-2x(1-x)$ (one per subset $A$), none of which is identically equal to $x$.
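This last step can be verified exhaustively. The sketch below (the choice of three evaluation points is mine) uses exact rational arithmetic: a polynomial of degree at most $2$ that agrees with $x$ at three distinct points would equal $x$ identically, so it suffices to test every subset $A$ of $\{0,1,2\}$ at three points.

```python
# Exhaustive check for n = 2: no subset A of {0,1,2} makes
# sum_{i in A} C(2,i) x^i (1-x)^(2-i) identically equal to x.
from fractions import Fraction
from itertools import combinations
from math import comb

n = 2
points = [Fraction(1, 4), Fraction(1, 2), Fraction(3, 4)]

def poly(A, x):
    # LHS polynomial determined by the subset A, evaluated exactly
    return sum(comb(n, i) * x**i * (1 - x)**(n - i) for i in A)

subsets = [set(c) for r in range(n + 2) for c in combinations(range(n + 1), r)]
matches = [A for A in subsets if all(poly(A, x) == x for x in points)]
print(matches)  # → [] : no subset reproduces P(A) = p as an identity
```

Note that some subsets do agree with $x$ at isolated points (e.g. $2x(1-x)=x$ at $x=\tfrac12$), which is why three points are checked.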
Best Answer
If $X_1, X_2$ are two iid Bernoulli random variables, then their sum is the count of successes in two iid Bernoulli experiments. This is the definition of a binomial random variable with parameters $2$ and $p$.
That is all. $\Box$
If you need to demonstrate further:
The probability of exactly $y$ successes among these two variables is found by computing the probability of $y$ successes in a row followed by $2-y$ failures, then multiplying by the number of distinct ways to order these results.
That is: $$\mathsf P(Y=y) = \underline{\qquad\qquad?}$$
Which was to be demonstrated. $\Box$
Alternatively, from first principles we use the Law of Total Probability to show that when we partition the results by the first variable:
$$\begin{align} \mathsf P(Y=y) &= \mathsf P(X_1=1, X_2=y-1) + \mathsf P(X_1=0, X_2=y) \\[1ex]& = p\cdot \mathsf P(X_2=y-1) + (1-p)\cdot \mathsf P(X_2=y) \\[1ex]& = \begin{cases} \underline{\qquad} & : y =0 \\ \underline{\qquad} & : y=1 \\ \underline{\qquad} & : y=2 \end{cases} \end{align}$$
Which was to be demonstrated. $\Box$
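If you want a numerical sanity check alongside the algebra, the sketch below (with $p=0.3$ chosen arbitrarily) simulates $X_1+X_2$ and compares the empirical frequencies of $0,1,2$ against the standard $\text{Bin}(2,p)$ pmf $\binom{2}{y}p^y(1-p)^{2-y}$:

```python
# Simulate X1 + X2 for i.i.d. Bernoulli(p) and compare with Binomial(2, p).
import random
from math import comb

random.seed(1)
p, reps = 0.3, 100_000
counts = [0, 0, 0]
for _ in range(reps):
    y = (random.random() < p) + (random.random() < p)  # X1 + X2 as a 0/1 sum
    counts[y] += 1

for y in range(3):
    empirical = counts[y] / reps
    exact = comb(2, y) * p**y * (1 - p)**(2 - y)
    print(y, round(empirical, 3), round(exact, 3))  # columns agree closely
```

The empirical column should match the exact pmf ($0.49$, $0.42$, $0.09$ for $p=0.3$) up to sampling noise.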
Fill in the blanks if required. But really, you don't need to go past the first tombstone.