Conditional Independence – Why Conditional Independence Isn’t Guaranteed When Specifying Marginal Distributions

distributions, graphical-model, independence

It was mentioned very briefly in a lecture on graphical models that two random variables $X_3$ and $X_4$ can both depend on $X_2$, but that even after conditioning on $X_2$, the two variables $X_3$ and $X_4$ could still be dependent through some other means.

I wasn't sure what the lecturer meant by this; below is the example provided:

Random variables $X_1$, $X_2$, $X_3$, $X_4$ with the following distributions:

$X_1 \sim \mathrm{Bernoulli}(1/2)$
$X_2 \sim \mathrm{Bernoulli}(1/2)$
$X_3 \mid (X_1,X_2) \sim \mathcal N(X_1+X_2,\sigma^2)$
$X_4 \mid X_2 \sim \mathcal N(aX_2+b,1)$

The above specification doesn't necessarily mean that:

$p(x_3,x_4|x_2)=p(x_3|x_2)p(x_4|x_2)$

That is, conditional independence between $X_3$ and $X_4$ given $X_2$ does not necessarily hold. The graphical model says that this conditional independence holds, but the specification I provided above does not, by itself, guarantee it.
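For concreteness, here is a minimal simulation sketch of how that failure could happen (my own construction, not from the lecture; it assumes $X_1$ and $X_2$ are independent and uses the illustrative values $a=1$, $b=0$, $\sigma=1$). Both stated conditionals for $X_3$ and $X_4$ still hold, but because the two noise terms are correlated, $X_3$ and $X_4$ are not conditionally independent given $X_2$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
a, b, sigma = 1.0, 0.0, 1.0      # illustrative values, not from the lecture
rho = 0.8                        # correlation between the two noise terms

x1 = rng.integers(0, 2, n)       # X1 ~ Bernoulli(1/2)
x2 = rng.integers(0, 2, n)       # X2 ~ Bernoulli(1/2), assumed independent of X1

# Correlated noise, independent of (X1, X2): eps3 ~ N(0, sigma^2), eps4 ~ N(0, 1)
cov = [[sigma**2, rho * sigma], [rho * sigma, 1.0]]
eps = rng.multivariate_normal([0.0, 0.0], cov, n)

x3 = x1 + x2 + eps[:, 0]         # X3 | (X1, X2) ~ N(X1 + X2, sigma^2)  -- still holds
x4 = a * x2 + b + eps[:, 1]      # X4 | X2       ~ N(a*X2 + b, 1)       -- still holds

# Sample correlation of X3 and X4 within the slice X2 = 0; nonzero => dependent given X2
mask = x2 == 0
print(np.corrcoef(x3[mask], x4[mask])[0, 1])
```

With these choices the printed conditional correlation comes out around $0.7$ rather than $0$, so the specification above is satisfied even though $X_3$ and $X_4$ are conditionally dependent given $X_2$.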

Best Answer

This question and the OP's lecturer's claims seem to indicate a misunderstanding of the notions of independence and conditional independence of random variables. Different sets of distributions for Bernoulli random variables $X$, $Y$, and $Z$ are presented below to illustrate the differences between these notions.

  • Suppose that $X$ and $Y$ are known to be independent Bernoulli random variables with parameter $\frac{1}{2}$. Thus, their probability mass functions (pmfs) are $$p_X(0) = p_X(1) = \frac{1}{2}; ~ p_Y(0) = p_Y(1) = \frac{1}{2}$$ and their joint pmf is the product of the (marginal) pmfs $$p_{X,Y}(i,j) = p_X(i)p_Y(j) = \frac{1}{2}\times\frac{1}{2} = \frac{1}{4} ~\text{for all} ~ i, j \in \{0, 1\}.$$ Are $X$ and $Y$ necessarily conditionally independent given $Z$, where $Z$ is also a Bernoulli random variable with parameter $\frac{1}{2}$? Not necessarily. Consider the conditional distributions $$\begin{align*}p_{X,Y\mid Z}(i,j\mid Z=0) &= \begin{cases}\frac{1}{2}, & i = j\\ 0, & i \neq j, \end{cases}\\ p_{X,Y\mid Z}(i,j\mid Z=1) &= \begin{cases}\frac{1}{2}, & i \neq j\\ 0, & i = j. \end{cases} \end{align*}$$ The law of total probability shows that these conditional distributions combine to give the known joint pmf of $X$ and $Y$. It is also easy to verify that regardless of whether $Z = 0$ or $Z = 1$, both $X$ and $Y$ are conditionally distributed as Bernoulli random variables with parameter $\frac{1}{2}$, but $X$ and $Y$ are not conditionally independent given $Z$, regardless of whether $Z$ has value $0$ or $1$: in one case we have $X = Y$, and in the other, $X = 1-Y$. Thus we can say the following:

If $X$ and $Y$ are independent random variables with known marginal distributions, then their joint distribution is the product of the marginal distributions. However, $X$ and $Y$ need not be conditionally independent even if the conditional marginal distributions of $X$ and $Y$ are the same as their given unconditional marginal distributions. Thus the conditional joint distribution of unconditionally independent random variables need not be the product of the conditional marginal distributions.
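A quick numerical check of this first construction (a minimal sketch in Python/NumPy; it simply tabulates the pmfs written out above and tests both factorizations):

```python
import numpy as np

# Joint pmf p(x, y, z) with P(Z=0) = P(Z=1) = 1/2 and the conditional pmfs above:
# given Z = 0 the mass sits on {X = Y}; given Z = 1 it sits on {X = 1 - Y}.
p = np.zeros((2, 2, 2))                        # axes: x, y, z
for x in (0, 1):
    for y in (0, 1):
        p[x, y, 0] = 0.5 * (0.5 if x == y else 0.0)
        p[x, y, 1] = 0.5 * (0.5 if x != y else 0.0)

pxy = p.sum(axis=2)                            # unconditional joint pmf of (X, Y)
px, py = pxy.sum(axis=1), pxy.sum(axis=0)
print(np.allclose(pxy, np.outer(px, py)))      # True: X and Y are independent

pxy_z0 = p[:, :, 0] / p[:, :, 0].sum()         # conditional joint pmf given Z = 0
px_z0, py_z0 = pxy_z0.sum(axis=1), pxy_z0.sum(axis=0)
print(px_z0, py_z0)                            # both [0.5 0.5]: still Bernoulli(1/2) given Z = 0
print(np.allclose(pxy_z0, np.outer(px_z0, py_z0)))   # False: not conditionally independent
```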

  • Suppose that $X$ and $Y$ are conditionally independent given $Z = 0$ and also conditionally independent given $Z = 1$. Are $X$ and $Y$ necessarily unconditionally independent? Not necessarily, not even if $Z$ is a Bernoulli random variable with parameter $\frac{1}{2}$. Suppose that $X$ and $Y$ are conditionally independent Bernoulli random variables with parameter $p$ if $Z = 0$ and parameter $q$ if $Z = 1$. Thus, the conditional joint pmfs are $$\begin{align*} p_{X,Y\mid Z}(0,0\mid Z = 0) &= (1-p)^2; \qquad \quad p_{X,Y\mid Z}(0,0\mid Z = 1) = (1-q)^2;\\ p_{X,Y\mid Z}(0,1\mid Z = 0) &= p(1-p); \qquad \quad p_{X,Y\mid Z}(0,1\mid Z = 1) = q(1-q);\\ p_{X,Y\mid Z}(1,0\mid Z = 0) &= p(1-p); \qquad \quad p_{X,Y\mid Z}(1,0\mid Z = 1) = q(1-q);\\ p_{X,Y\mid Z}(1,1\mid Z = 0) &= p^2; \qquad \quad \, \qquad p_{X,Y\mid Z}(1,1\mid Z = 1) = q^2. \end{align*}$$ Suppose that $p \neq q$ and that $Z$ has parameter $\lambda$. Then $X$ and $Y$ are unconditionally independent only in the trivial cases $\lambda = 0$ or $\lambda = 1$ (when one of the two conditional joint pmfs gets weight $0$ in the total probability formula).

If $X$ and $Y$ are conditionally independent given $Z$, they need not be unconditionally independent.

Is there any instance where conditional independence guarantees unconditional independence? If $X$ and $Y$ are not only conditionally independent given $Z$ but also have the same conditional joint distribution for all values of $Z$, then $X$ and $Y$ are unconditionally independent. But this is a trivial special case: the condition just stated means that $X$, $Y$, and $Z$ are mutually independent random variables, so the conditional joint distribution of $X$ and $Y$ does not depend on the value of $Z$ at all.
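Finally, a small numerical check of the second construction (again just a sketch; the values $p=0.2$, $q=0.9$, and $\lambda=\frac{1}{2}$ are illustrative). The two conditional joint pmfs are mixed with weights $1-\lambda$ and $\lambda$ by the law of total probability, and the resulting unconditional joint pmf factorizes only in the trivial cases $\lambda\in\{0,1\}$:

```python
import numpy as np

def joint_pmf(p, q, lam):
    """Joint pmf of (X, Y) when X and Y are conditionally independent Bernoulli(p)
    given Z = 0 and Bernoulli(q) given Z = 1, with P(Z = 1) = lam."""
    pmf0 = np.outer([1 - p, p], [1 - p, p])    # conditional joint given Z = 0
    pmf1 = np.outer([1 - q, q], [1 - q, q])    # conditional joint given Z = 1
    return (1 - lam) * pmf0 + lam * pmf1       # law of total probability over Z

pxy = joint_pmf(p=0.2, q=0.9, lam=0.5)         # illustrative values with p != q
px, py = pxy.sum(axis=1), pxy.sum(axis=0)
print(np.allclose(pxy, np.outer(px, py)))      # False: not unconditionally independent

for lam in (0.0, 1.0):                         # the trivial cases
    pxy = joint_pmf(0.2, 0.9, lam)
    print(np.allclose(pxy, np.outer(pxy.sum(axis=1), pxy.sum(axis=0))))  # True
```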
