Conditional Probability Analysis – Why Marginal Independence Does Not Imply Joint Independence

Tags: conditional probability, conditional expectation, independence, random variable

Consider the random variables $X,Y,Z$.

Assume

$$
X\perp Z
$$

$$
Y\perp Z
$$

$$
X\perp Y
$$

Can we say
$$
(X,Y)\perp Z
$$

My intuition is that we can't because
$$
p_{X,Y}=p_X\times p_Y=p_{X|Z} \times p_{Y|Z} \neq p_{X,Y|Z}
$$

Could you also provide an example?


Conversely, suppose for some random variables $Q,A,B$ we have that
$$
Q\perp(A,B)
$$
Does this imply $Q\perp A$ and $Q\perp B$?

Best Answer

Your first question assumes that $X, Y$, and $Z$ are pairwise independent random variables and asks whether $$ X,Y, Z~~\text{pairwise independent} \implies (X,Y)~~\text{and}~~Z~~\text{independent}?\tag{1} $$ In general, the answer is NO: the implication $(1)$ does not hold in all cases. The proof is by a reductio ad absurdum argument: if the implication $(1)$ were to hold, then pairwise independence of $X$, $Y$, and $Z$ would imply that $X$, $Y$, and $Z$ are mutually independent, which we know is not true in general.

Suppose that implication $(1)$ is true. Then, $$p_{X,Y,Z} = \underbrace{p_{X,Y}\cdot p_Z}_{(X,Y)\perp Z ~\text{via}~(1)} = \underbrace{p_{X}\cdot p_Y}_{X\perp Y~\text{by hypothesis}}\cdot p_Z$$ and so $X$, $Y$, and $Z$ are mutually independent which is not true in general. So, it must be that the hypothesis that $(1)$ holds is false.

The simplest example of pairwise independent random variables that are not mutually independent random variables is $X, Y$, and $Z$ being Bernoulli$(\frac 12)$ random variables whose joint pmf has value $\frac 14$ when $(X,Y,Z)$ equals $(0,0,0)$ or $(0,1,1)$ or $(1,0,1)$, or $(1,1,0)$. These are pairwise independent random variables but not mutually independent random variables. In particular, $$p_{X,Y,Z}(1,1,1) = 0 \neq p_{X,Y}(1,1)\cdot p_Z(1) = \frac 14\cdot \frac 12$$ and so $(X,Y)$ is not independent of $Z$ as you suspect. Indeed, it is even possible to have standard normal random variables that are pairwise independent but are not mutually independent. See the first couple of paragraphs of this answer of mine for a construction.
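The Bernoulli counterexample above can be checked mechanically by enumerating its joint pmf. The sketch below (names like `joint` and `p` are mine, not from the answer) verifies pairwise independence for every pair and exhibits the failure of joint independence at $(1,1,1)$:

```python
from itertools import product

# Joint pmf of the counterexample: each of the four triples has mass 1/4.
# (Equivalently: X, Y independent Bernoulli(1/2) and Z = X XOR Y.)
joint = {(0, 0, 0): 0.25, (0, 1, 1): 0.25, (1, 0, 1): 0.25, (1, 1, 0): 0.25}

def p(**fixed):
    """Marginal probability obtained by summing the joint pmf over the
    coordinates (x, y, z) not named in `fixed`."""
    names = ("x", "y", "z")
    return sum(prob for triple, prob in joint.items()
               if all(triple[names.index(k)] == v for k, v in fixed.items()))

# Pairwise independence holds for every pair and every value combination.
for a, b in product((0, 1), repeat=2):
    assert abs(p(x=a, y=b) - p(x=a) * p(y=b)) < 1e-12
    assert abs(p(x=a, z=b) - p(x=a) * p(z=b)) < 1e-12
    assert abs(p(y=a, z=b) - p(y=a) * p(z=b)) < 1e-12

# But (X, Y) is not independent of Z:
print(p(x=1, y=1, z=1))       # 0.0
print(p(x=1, y=1) * p(z=1))   # 0.125
```

The assertions pass, while the last two lines reproduce the displayed inequality $p_{X,Y,Z}(1,1,1) = 0 \neq \frac 14 \cdot \frac 12$.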

Your second question asks whether independence of $(X,Y)$ and $Z$ (writing $X = A$, $Y = B$, $Z = Q$ in your notation) allows us to deduce that $X$ and $Z$ are independent and that $Y$ and $Z$ are independent, and here the answer is YES. For discrete random variables, $$p_{X,Z} = \sum_y p_{X,Y,Z} = \sum_y \underbrace{p_{X,Y}\cdot p_Z}_{(X,Y)\perp Z} = \left(\sum_y p_{X,Y}\right)\cdot p_Z = p_X\cdot p_Z$$ and similarly for $Y$ and $Z$ (and for continuous random variables when sums are replaced by integrals, etc.).
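The marginalization step can be illustrated numerically. The sketch below (the particular pmf values are my own choice, not from the answer) builds a joint pmf in which $(X,Y)\perp Z$ by construction, with $X$ and $Y$ deliberately dependent, and checks that summing out $y$ recovers $p_X \cdot p_Z$:

```python
from itertools import product

# X and Y deliberately dependent; Z ~ Bernoulli(0.3) attached independently,
# so (X, Y) is independent of Z by construction.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
p_z = {0: 0.7, 1: 0.3}

# p_{X,Y,Z} = p_{X,Y} * p_Z, i.e. (X, Y) ⊥ Z.
p_xyz = {(x, y, z): p_xy[(x, y)] * p_z[z]
         for (x, y), z in product(p_xy, p_z)}

# Marginal of X.
p_x = {x: sum(p_xy[(x, y)] for y in (0, 1)) for x in (0, 1)}

# Summing out y factors p_{X,Z} into p_X * p_Z, so X ⊥ Z.
for x, z in product((0, 1), repeat=2):
    p_xz = sum(p_xyz[(x, y, z)] for y in (0, 1))   # sum_y p_{X,Y,Z}
    assert abs(p_xz - p_x[x] * p_z[z]) < 1e-12
```

Swapping the roles of $x$ and $y$ gives the same check for $Y$ and $Z$, mirroring the "similarly" in the derivation above.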
