[Math] Random Variables Prove Independent

independence, probability, probability-distributions, random-variables

Consider three Bernoulli random variables $B_1, B_2, B_3$ which take values in $\{0, 1\}$ with equal probability. We construct the random variables $X, Y, Z$ as $X = B_1 \oplus B_2$, $Y = B_2 \oplus B_3$, $Z = B_1 \oplus B_3$, where $\oplus$ denotes the XOR operator.

Does anyone know how to prove whether $X, Y,$ and $Z$ are pairwise independent? How about mutually independent?

Best Answer

To build intuition, interpret the Bernoulli$(1/2)$ variables as fair coin flips; each XOR is then the event of getting exactly one head on a flip of two coins.

$$\mathsf P(X) = \mathsf P(B_1\oplus B_2) = \tfrac 1 2$$

And likewise for the rest.

$X$ and $Y$ have one coin in common ($B_2$), so the intersection of these events occurs when either both uncommon coins show heads (and the common one a tail), or only the common coin shows a head.

$$(B_1\oplus B_2)\cap (B_2\oplus B_3) = (B_1\cap B_3\cap B_2^\complement)\oplus (B_2\cap B_1^\complement\cap B_3^\complement)$$

$$\therefore \mathsf P(X\cap Y) = ?$$

Thus we have pairwise independence of $X$ and $Y$ if $\mathsf P(X\cap Y) = \mathsf P(X)~\mathsf P(Y)$.   Does that hold?

By symmetry the same holds for $(X, Z)$ and $(Y, Z)$.
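Since there are only $2^3 = 8$ equally likely outcomes for $(B_1, B_2, B_3)$, the pairwise claim can be checked by brute force. Below is a minimal sketch in Python (variable and helper names are mine, not from the problem) that enumerates every outcome and verifies the product rule for each pair of variables and each pair of values.

```python
from itertools import product

# All 8 equally likely outcomes of (B1, B2, B3), mapped to (X, Y, Z).
triples = [(b1 ^ b2, b2 ^ b3, b1 ^ b3)
           for b1, b2, b3 in product([0, 1], repeat=3)]
n = len(triples)  # 8

def p(pred):
    # Probability of an event under the uniform distribution on outcomes.
    return sum(1 for t in triples if pred(t)) / n

# Pairwise independence: check every pair of variables (indices into the
# triple) and every combination of values in {0, 1}.
pairs = [(0, 1), (0, 2), (1, 2)]
pairwise = all(
    p(lambda t: t[i] == a and t[j] == b)
    == p(lambda t: t[i] == a) * p(lambda t: t[j] == b)
    for i, j in pairs for a in (0, 1) for b in (0, 1)
)
print(pairwise)  # True: each pair satisfies P(A ∩ B) = P(A) P(B)
```

Each joint probability such as $\mathsf P(X=1, Y=1)$ comes out to $2/8 = 1/4$, matching the product $\tfrac12 \cdot \tfrac12$ of the marginals.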


Now, what about mutual independence?
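A hint worth checking computationally: $X \oplus Y = (B_1 \oplus B_2) \oplus (B_2 \oplus B_3) = B_1 \oplus B_3 = Z$, so $Z$ is fully determined by $X$ and $Y$. The sketch below (same enumeration idea as before; names are mine) confirms this, and shows that the joint probability $\mathsf P(X=1, Y=1, Z=1)$ cannot equal the product $\tfrac18$ demanded by mutual independence.

```python
from itertools import product

# All 8 equally likely outcomes of (B1, B2, B3), mapped to (X, Y, Z).
triples = [(b1 ^ b2, b2 ^ b3, b1 ^ b3)
           for b1, b2, b3 in product([0, 1], repeat=3)]
n = len(triples)

# Z is determined by X and Y: X ^ Y == Z on every outcome.
print(all(x ^ y == z for x, y, z in triples))  # True

# Mutual independence would require P(X=1, Y=1, Z=1) = (1/2)^3 = 1/8,
# but (1, 1, 1) never occurs, since 1 ^ 1 = 0 != 1.
p111 = sum(1 for t in triples if t == (1, 1, 1)) / n
print(p111)  # 0.0
```

So the three variables are pairwise independent but not mutually independent, a classic counterexample showing that pairwise independence does not imply mutual independence.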