[Math] How to show that these random variables are pairwise independent

probability, probability-distributions, probability-theory

Given the arrays $C=[C_1,C_2,\dots,C_N]$ and $S=[S_1,S_2,\dots,S_N]$ of length $N$, whose elements are discrete i.i.d. random variables taking the values $\pm 1$ each with probability $p=1/2$.

Consider the random variables (for given distinct indices $l, m, n$):

$W=C_lC_mC_n$

$X=S_lS_mC_n$

$Y=C_lS_mS_n$

$Z=S_lC_mS_n$

It can be shown that these random variables ($W, X, Y, Z$) have zero mean and are uniformly distributed, taking the values $\pm 1$ each with probability $p=1/2$. Furthermore, it can be shown that they are pairwise uncorrelated: e.g. $E[WX]=E[(C_lC_mC_n)(S_lS_mC_n)]=E[C_lC_mS_lS_m]=E[C_l]E[C_m]E[S_l]E[S_m]=0$, using $C_n^2=1$, the independence of the factors, and $E[C_i]=E[S_i]=0$.
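These claims can be verified exactly by enumerating the finite sample space. The sketch below (not part of the original post) assumes the indices $l, m, n$ are distinct, so the six signs $(C_l, C_m, C_n, S_l, S_m, S_n)$ are i.i.d. and all $2^6$ assignments are equally likely:

```python
# Exhaustive check: enumerate all 2^6 equally likely sign assignments of
# (C_l, C_m, C_n, S_l, S_m, S_n), assuming l, m, n are distinct indices.
from itertools import product

outcomes = []  # one (W, X, Y, Z) tuple per equally likely assignment
for cl, cm, cn, sl, sm, sn in product([-1, 1], repeat=6):
    W = cl * cm * cn
    X = sl * sm * cn
    Y = cl * sm * sn
    Z = sl * cm * sn
    outcomes.append((W, X, Y, Z))

n = len(outcomes)  # 64 equally likely outcomes

# Zero mean and uniform on {-1, +1}: each variable is +1 in exactly half
# of the equally likely outcomes.
for i in range(4):
    vals = [o[i] for o in outcomes]
    assert sum(vals) == 0            # E[V] = 0
    assert vals.count(1) == n // 2   # P(V = +1) = 1/2

# Pairwise uncorrelated: E[V1 * V2] = 0 for every pair.
for i in range(4):
    for j in range(i + 1, 4):
        assert sum(o[i] * o[j] for o in outcomes) == 0
```

Since every assignment has probability $1/64$, these counts are exact probabilities, not Monte Carlo estimates.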

Now, how can one show that the random variables ($W, X, Y, Z$) are pairwise independent? (The expectation $E[WX]=E[(C_lC_mC_n)(S_lS_mC_n)]=0=E[W]E[X]$, but being uncorrelated doesn't by itself imply independence.) So I'm wondering: what is a good way to show independence? Are there any tricks one can use?

Best Answer

Consider two Bernoulli variables (with values in $\{0,1\}$) $X,Y$. We have

$$\operatorname{Cov}(X,Y) = E(XY) - E(X)\,E(Y) = P(X=1,Y=1) - P(X=1)\,P(Y=1)$$

Then $\operatorname{Cov}(X,Y) = 0$ implies $P(X=1,Y=1) = P(X=1)\,P(Y=1)$, and the remaining joint probabilities factor as well: for instance, $P(X=1,Y=0) = P(X=1) - P(X=1,Y=1) = P(X=1)\,(1-P(Y=1)) = P(X=1)\,P(Y=0)$, and similarly for the other two cells. That is, for Bernoulli variables, being uncorrelated implies independence.

Now, your $W, X, Y, Z$ are Bernoulli variables (up to an affine scaling-shifting, $v \mapsto (v+1)/2$, which alters neither covariance nor independence) and are pairwise uncorrelated; hence they are pairwise independent.
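The conclusion can also be confirmed directly, by checking that every joint probability factors. This sketch (again assuming $l, m, n$ distinct, so the six signs are i.i.d. over $2^6$ equally likely assignments) verifies $P(V_1=a, V_2=b) = P(V_1=a)\,P(V_2=b)$ for every pair and every pair of values:

```python
# Direct check of pairwise independence over the exact finite sample space.
from fractions import Fraction
from itertools import product

samples = []
for cl, cm, cn, sl, sm, sn in product([-1, 1], repeat=6):
    samples.append((cl * cm * cn,   # W
                    sl * sm * cn,   # X
                    cl * sm * sn,   # Y
                    sl * cm * sn))  # Z

total = Fraction(len(samples))  # 64 equally likely outcomes

def prob(pred):
    """Exact probability of an event, as a Fraction."""
    return Fraction(sum(1 for s in samples if pred(s))) / total

# For each pair of variables and each pair of values, the joint
# probability equals the product of the marginals.
for i in range(4):
    for j in range(i + 1, 4):
        for a in (-1, 1):
            for b in (-1, 1):
                joint = prob(lambda s: s[i] == a and s[j] == b)
                assert joint == prob(lambda s: s[i] == a) * prob(lambda s: s[j] == b)
```

Using `Fraction` keeps the arithmetic exact, so the equality test is a genuine proof-by-enumeration over the 64 outcomes rather than a floating-point approximation.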
