I am trying to explain in simple words the Causal Markov condition to establish probabilistic causation. The definition from Hausman and Woodward (1999) is the following:
Let $G$ be a causal graph with vertex set $V$ and $P$ be a probability distribution over the vertices in $V$ generated by the causal structure represented by $G$. $G$ and $P$ satisfy the Causal Markov Condition if and only if for every $X$ in $V$, $X$ is independent of $V \setminus (\text{Descendants}(X) \cup \text{Parents}(X))$ given $\text{Parents}(X)$.
My explanation is that the Causal Markov condition is satisfied if each variable in a causal structure, with a given probability distribution, is independent of all the other variables except its parents and its descendants. This is slightly different from other definitions I have seen, so: first, is my explanation correct? Second, is it clear?
Any advice will be appreciated.
Best Answer
One way to think about the Causal Markov Condition (CMC) is as a rule for "screening off": once you know the values of $X$'s parents, all other variables in $V$ become irrelevant for predicting $X$, except for $X$'s descendants.
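A quick simulation can make "screening off" concrete. Below is a minimal sketch (the variable names and coefficients are my own invention, not from any real medical model) of a causal chain Smoking $\to$ Damage $\to$ Disease. Under the CMC, Disease is independent of Smoking given its parent Damage, so the partial correlation of Smoking and Disease after adjusting for Damage should be near zero, even though their marginal correlation is not:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical linear-Gaussian chain: Smoking -> Damage -> Disease
smoking = rng.normal(size=n)
damage = 0.8 * smoking + rng.normal(size=n)   # Damage caused by Smoking
disease = 0.8 * damage + rng.normal(size=n)   # Disease caused only by Damage

def partial_corr(x, y, z):
    """Correlation of x and y after linearly regressing out z
    (a linear check of the screening-off claim)."""
    rx = x - np.polyval(np.polyfit(z, x, 1), z)
    ry = y - np.polyval(np.polyfit(z, y, 1), z)
    return np.corrcoef(rx, ry)[0, 1]

marginal = np.corrcoef(smoking, disease)[0, 1]
screened = partial_corr(smoking, disease, damage)

print(marginal)   # clearly nonzero: Smoking and Disease are dependent
print(screened)   # near zero: the parent (Damage) screens Smoking off
```

This only tests one conditional independence implied by the CMC in a linear setting; in general the CMC asserts the full set of such independencies for every variable in the graph.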
I find that examples make the CMC easiest to understand. I did a quick Google image search for "mechanism of cardiovascular disease" so that I could give you a medical example. Take this graph (let's call it $G$):
Say you have a probability distribution $P$ over the variables in $G$. If the CMC holds in $P$ (relative to $G$), then you can infer that:
However, the CMC allows the following possibilities: