Is a Binomial a Sum of Bernoulli Random Variables?

Tags: probability, probability-theory, statistics

I know that, if $X_1,\ldots,X_n\sim \text{Ber}(p)$ are independent, then $X_1+\ldots+X_n\sim\text{Bin}(n,p)$.

My question is: if $X\sim \text{Bin}(n,p)$, is it true that there exist independent random variables $X_1,\ldots,X_n\sim\text{Ber}(p)$ such that $X(\omega)=X_1(\omega)+\ldots+X_n(\omega)$ for all $\omega\,$?

EDIT: I would like an analytic proof of this fact (if it is true).

Best Answer

In general, no. As @Did pointed out, we can construct a $\operatorname{Bin}(n,p)$ random variable on a probability space with only $n+1$ elements. Take $n=2$ and $0<p<1$: consider $\Omega = \{0,1,2\}$ with \begin{align}\mathbb P(\{0\})&=(1-p)^2\\\mathbb P(\{1\})&= 2p(1-p)\\ \mathbb P(\{2\})&=p^2 \end{align} and $X(\omega)=\omega$. If $X=X_1+X_2$ where $X_1, X_2$ take values in $\{0,1\}$, then from \begin{align} 0 &= X(0) = X_1(0) + X_2(0)\\ 2 &= X(2) = X_1(2) + X_2(2) \end{align} we see that $X_1(0)=X_2(0)=0$ and $X_1(2)=X_2(2)=1$. Moreover, $X(1)=1$ forces exactly one of $X_1(1), X_2(1)$ to equal $1$; say $X_1(1)=1$ and $X_2(1)=0$. Then $\{X_1=1\}=\{1,2\}$ and $\{X_2=1\}=\{2\}$, so $$\mathbb P(X_1=1,\,X_2=1)=p^2 \ne \bigl(2p(1-p)+p^2\bigr)\,p^2=\mathbb P(X_1=1)\,\mathbb P(X_2=1),$$ since $2p(1-p)+p^2=p(2-p)<1$ for $0<p<1$. Hence $X_1$ and $X_2$ cannot be independent; indeed, $X_1$ is not even $\operatorname{Ber}(p)$, since $\mathbb P(X_1=1)=p(2-p)\ne p$.
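For concreteness, here is a small numerical check of that computation (my own sketch, not part of the original answer; `check_independence` is a hypothetical helper name). It evaluates the gap between the joint probability and the product of marginals for the forced decomposition above, which is strictly positive for every $p\in(0,1)$:

```python
def check_independence(p):
    # The three outcomes of Omega = {0, 1, 2} and their Bin(2, p) probabilities.
    prob = {0: (1 - p) ** 2, 1: 2 * p * (1 - p), 2: p ** 2}
    # Forced decomposition (up to swapping): X_1 = 1 on {1, 2}, X_2 = 1 on {2}.
    p_x1_is_1 = prob[1] + prob[2]
    p_x2_is_1 = prob[2]
    p_joint = prob[2]  # X_1 = 1 and X_2 = 1 happen only on {2}
    # Gap between joint probability and product of marginals;
    # independence would make this zero.
    return abs(p_joint - p_x1_is_1 * p_x2_is_1)

for p in (0.1, 0.3, 0.5, 0.9):
    print(p, check_independence(p))  # strictly positive for p in (0, 1)
```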

An example where the answer is "yes" is given by $\Omega = \{H,T\}^n$ (i.e. the outcomes are $n$ tosses of a coin), with $$\mathbb P(\{\omega\}) = p^j (1-p)^{n-j},\qquad \omega\in\Omega,$$ where $j$ is the number of heads in $\omega$, and $X(\omega)$ is again the number of heads in $\omega$. For $\omega = (\omega_1,\ldots,\omega_n)$, define $X_j$ by $X_j(\omega) = \mathsf 1_{\{\omega_j=H\}}$. Summing over the other $n-1$ coordinates and applying the binomial theorem gives $$\mathbb P(X_j=1) = p = 1 - \mathbb P(X_j=0),\qquad j=1,2,\ldots,n, $$ the same factorization shows that the $X_j$ are independent, and by construction $$X=\sum_{j=1}^n X_j.$$
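Here is a minimal sketch of this product-space construction (again my own verification, not part of the original answer), with heads encoded as $1$ and tails as $0$. It checks that each coordinate is $\operatorname{Ber}(p)$, that the joint pmf factorizes coordinate by coordinate (which is exactly independence of the $X_j$), and that the sum has the $\operatorname{Bin}(n,p)$ pmf:

```python
from itertools import product
from math import comb

n, p = 4, 0.3
omegas = list(product((0, 1), repeat=n))  # Omega = {0, 1}^n, 1 = heads
prob = {w: p ** sum(w) * (1 - p) ** (n - sum(w)) for w in omegas}

# Each coordinate X_j is Ber(p): sum over outcomes with omega_j = 1.
for j in range(n):
    marginal = sum(prob[w] for w in omegas if w[j] == 1)
    assert abs(marginal - p) < 1e-12

# The joint pmf factorizes across coordinates, i.e. the X_j are independent.
for w in omegas:
    factor = 1.0
    for wj in w:
        factor *= p if wj == 1 else 1 - p
    assert abs(prob[w] - factor) < 1e-12

# X = sum of the coordinates has the Bin(n, p) pmf.
for k in range(n + 1):
    pmf = sum(prob[w] for w in omegas if sum(w) == k)
    assert abs(pmf - comb(n, k) * p ** k * (1 - p) ** (n - k)) < 1e-12

print("all checks passed")
```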
