Is $X \perp\!\!\!\perp Y$ a conditional independence, arguing that the independence is conditioned on an empty set of random variables? If so, does that mean that an unconditional independence is always a conditional independence (but the converse is false)?
Solved – Conditional independence: conditioning on an empty set of random variables
conditional-probability, conditional-independence, independence, mathematical-statistics, terminology
Related Solutions
This question and the OP's lecturer's claims seem to indicate misunderstanding of the notions of independence and conditional independence of random variables. Different sets of distributions for Bernoulli random variables $X$, $Y$, and $Z$ are presented here to illustrate the differences between various notions.
- Suppose that $X$ and $Y$ are known to be independent Bernoulli random variables with parameter $\frac{1}{2}$. Thus, their probability mass functions (pmfs) are $$p_X(0) = p_X(1) = \frac{1}{2}; ~ p_Y(0) = p_Y(1) = \frac{1}{2}$$ and their joint pmf is the product of the (marginal) pmfs $$p_{X,Y}(i,j) = p_X(i)p_Y(j) = \frac{1}{2}\times\frac{1}{2} = \frac{1}{4} ~\text{for all} ~ i, j \in \{0, 1\}$$ Are $X$ and $Y$ necessarily conditionally independent given $Z$, where $Z$ is also a Bernoulli random variable with parameter $\frac{1}{2}$? Not necessarily. Consider the conditional distributions $$\begin{align*}p_{X,Y\mid Z}(i,j\mid Z=0) &= \begin{cases}\frac{1}{2}, & i = j\\ 0, & i \neq j, \end{cases}\\ p_{X,Y\mid Z}(i,j\mid Z=1) &= \begin{cases}\frac{1}{2}, & i \neq j\\ 0, & i = j. \end{cases} \end{align*} $$ The law of total probability shows that these conditional distributions combine to give the known joint pmf of $X$ and $Y$. It is also easy to verify that regardless of whether $Z = 0$ or $Z = 1$, both $X$ and $Y$ are conditionally distributed as Bernoulli random variables with parameter $\frac{1}{2}$, but $X$ and $Y$ are not conditionally independent given $Z$, regardless of whether $Z$ has value $0$ or $1$: in one case we have $X = Y$, and in the other, $X = 1-Y$. Thus we can say the following:
If $X$ and $Y$ are independent random variables with known marginal distributions, then their joint distribution is the product of the marginal distributions. However, $X$ and $Y$ need not be conditionally independent even if the conditional marginal distributions of $X$ and $Y$ are the same as their given unconditional marginal distributions. Thus the conditional joint distribution of unconditionally independent random variables need not be the product of the conditional marginal distributions.
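As a quick numerical sanity check of the first example, the sketch below (the pmf values are exactly those given above) verifies that the unconditional joint pmf factorizes while the conditional one does not:

```python
from itertools import product

# Conditional joint pmfs p(X=i, Y=j | Z=z) from the example above.
p_cond = {
    0: {(0, 0): 0.5, (1, 1): 0.5, (0, 1): 0.0, (1, 0): 0.0},  # X = Y when Z = 0
    1: {(0, 1): 0.5, (1, 0): 0.5, (0, 0): 0.0, (1, 1): 0.0},  # X = 1 - Y when Z = 1
}
p_z = {0: 0.5, 1: 0.5}  # Z ~ Bernoulli(1/2)

# Unconditional joint pmf via the law of total probability.
p_xy = {(i, j): sum(p_cond[z][(i, j)] * p_z[z] for z in (0, 1))
        for i, j in product((0, 1), repeat=2)}

# Marginal pmfs of X and Y.
p_x = {i: sum(p_xy[(i, j)] for j in (0, 1)) for i in (0, 1)}
p_y = {j: sum(p_xy[(i, j)] for i in (0, 1)) for j in (0, 1)}

# Unconditional independence holds: the joint pmf factorizes.
assert all(abs(p_xy[(i, j)] - p_x[i] * p_y[j]) < 1e-12
           for i, j in product((0, 1), repeat=2))

# Conditional independence fails: given Z = 0, both conditional marginals
# are Bernoulli(1/2), so independence would require p(0,0 | Z=0) = 1/4,
# but the actual value is 1/2.
assert p_cond[0][(0, 0)] == 0.5
```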
- Suppose that $X$ and $Y$ are conditionally independent given $Z = 0$ and also conditionally independent given $Z = 1$. Are $X$ and $Y$ necessarily unconditionally independent? Not necessarily, not even if $Z$ is a Bernoulli random variable with parameter $\frac{1}{2}$. Suppose that $X$ and $Y$ are conditionally independent Bernoulli random variables with parameter $p$ if $Z = 0$ and parameter $q$ if $Z = 1$. Thus, the conditional joint pmfs are $$\begin{align*} p_{X,Y\mid Z}(0,0\mid Z = 0) &= (1-p)^2; \qquad \quad p_{X,Y\mid Z}(0,0\mid Z = 1) = (1-q)^2;\\ p_{X,Y\mid Z}(0,1\mid Z = 0) &= p(1-p); \qquad \quad p_{X,Y\mid Z}(0,1\mid Z = 1) = q(1-q);\\ p_{X,Y\mid Z}(1,0\mid Z = 0) &= p(1-p); \qquad \quad p_{X,Y\mid Z}(1,0\mid Z = 1) = q(1-q);\\ p_{X,Y\mid Z}(1,1\mid Z = 0) &= p^2; \qquad \quad \, \qquad p_{X,Y\mid Z}(1,1\mid Z = 1) = q^2. \end{align*}$$ Suppose that $p \neq q$. Then $X$ and $Y$ are unconditionally independent only in the trivial cases when $Z$ has parameter $\lambda$ equal to $0$ or $1$ (when one of the above two conditional joint pmfs has weight $0$ in the total probability formula).
If $X$ and $Y$ are conditionally independent given $Z$, they need not be unconditionally independent.
Is there any instance where conditional independence guarantees unconditional independence? If $X$ and $Y$ are not only conditionally independent given $Z$ but also have the same conditional joint distribution for every value of $Z$, then $X$ and $Y$ are unconditionally independent. But this is also a trivial special case, because this condition means that $X$, $Y$, and $Z$ are mutually independent random variables, so the conditional joint distribution of $X$ and $Y$ does not depend on the value of $Z$.
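The second example can also be checked numerically. This is a sketch with illustrative values $p = 0.3$, $q = 0.7$, $\lambda = \frac{1}{2}$ (any $p \neq q$ and $0 < \lambda < 1$ would do); it confirms that the joint pmf fails to factorize even though each conditional slice does:

```python
from itertools import product

p, q, lam = 0.3, 0.7, 0.5  # illustrative values with p != q, 0 < lam < 1

def bern(theta, k):
    """pmf of a Bernoulli(theta) variable at k in {0, 1}."""
    return theta if k == 1 else 1 - theta

def p_xy_given_z(i, j, z):
    """Conditionally independent given Z: each slice factorizes."""
    theta = p if z == 0 else q
    return bern(theta, i) * bern(theta, j)

# Unconditional joint pmf and marginals via the law of total probability,
# with P(Z = 1) = lam.
p_xy = {(i, j): (1 - lam) * p_xy_given_z(i, j, 0) + lam * p_xy_given_z(i, j, 1)
        for i, j in product((0, 1), repeat=2)}
p_x = {i: sum(p_xy[(i, j)] for j in (0, 1)) for i in (0, 1)}
p_y = {j: sum(p_xy[(i, j)] for i in (0, 1)) for j in (0, 1)}

# With p != q and 0 < lam < 1, the unconditional joint does NOT factorize:
# p_xy(1,1) = 0.29 while p_x(1) * p_y(1) = 0.25.
assert abs(p_xy[(1, 1)] - p_x[1] * p_y[1]) > 1e-9
```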
No.
Consider three Boolean variables A, B, X, where X and A are i.i.d. Bernoulli with probability 0.5, while B = X $\oplus$ A (that is, B is the XOR of X and A).

It is easy to show that B is also Bernoulli distributed with probability 0.5, and that A and B are independent, though obviously they are not conditionally independent given X (once X is known, B is completely determined by A).
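Since the state space has only eight points, the XOR claim can be verified by brute-force enumeration; a minimal sketch:

```python
from itertools import product

# Enumerate the joint distribution of (X, A, B) with B = X xor A,
# where X and A are i.i.d. Bernoulli(1/2).
joint = {}
for x, a in product((0, 1), repeat=2):
    b = x ^ a
    joint[(x, a, b)] = joint.get((x, a, b), 0) + 0.25

# A and B are (unconditionally) independent:
p_a1 = sum(v for (x, a, b), v in joint.items() if a == 1)              # 1/2
p_b1 = sum(v for (x, a, b), v in joint.items() if b == 1)              # 1/2
p_a1_b1 = sum(v for (x, a, b), v in joint.items() if a == 1 and b == 1)
assert abs(p_a1_b1 - p_a1 * p_b1) < 1e-12                              # 1/4

# But given X = 0 we have B = A exactly, so A and B are NOT
# conditionally independent given X:
p_given_x0 = {(a, b): v / 0.5 for (x, a, b), v in joint.items() if x == 0}
assert p_given_x0[(1, 1)] == 0.5   # independence would require 1/4
```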
Best Answer
Yes, I would say $X \perp\!\!\!\perp Y$ can be thought of as conditional independence with an empty set of conditioning variables, which corresponds to conditioning on the full sample space $\Omega$ (equivalently, on the trivial $\sigma$-algebra). Note that this is not the same as conditioning on the empty set as an *event*: the empty set is the impossible event, it has probability zero, and you cannot condition on an event of probability zero.
Conditioning on the full sample space amounts to saying "the only fact I know is that some outcome in the sample space occurred", which must be true if your model is any good.