[Math] Fully independent events and their complements

probability

Suppose events $A_1, \dots, A_n$ are fully independent, i.e., $P(A_1 \cap \dots \cap A_k) = P(A_1)\cdots P(A_k)$ for all $k$ between $2$ and $n$. Does this mean that the complementary events are also fully independent: $P(A_1^c \cap \dots \cap A_k^c) = P(A_1^c)\cdots P(A_k^c)$ for all $k$?

I know this holds if $k = 2$, but I want to know in general.

I've tried to prove it by induction but it looks like hard work…

Best Answer

We show that under the non-standard definition of fully independent events given in the post, the desired result is not true. We then give a standard definition of fully independent events, and show that under this definition the desired result is true.

A counterexample: We toss a fair coin. Let $A_1$ be the event that the coin rolls around forever (probability $0$), $A_2$ the event that we get a head (probability $1/2$), and $A_3$ the event that we get a tail (probability $1/2$). It is easy to verify that under the definition of fully independent given in the post, the sequence $A_1, A_2, A_3$ is fully independent. But $A_2^c$ and $A_3^c$ are not independent, for $P(A_2^c\cap A_3^c)=0$ while $P(A_2^c)P(A_3^c)=1/4$. It is also easy to verify that the sequence $A_1^c, A_2^c, A_3^c$ is not fully independent.
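To spell out the verification: since $P(A_1)=0$, every intersection involving $A_1$ has probability $0$, so the only conditions the post's definition imposes are $$P(A_1\cap A_2)=0=P(A_1)P(A_2), \qquad P(A_1\cap A_2\cap A_3)=0=P(A_1)P(A_2)P(A_3);$$ note that it never asks about $A_2\cap A_3$. For the complements, $P(A_1^c)=1$ and $A_2^c\cap A_3^c$ is the event that the coin rolls around forever, so $$P(A_1^c\cap A_2^c\cap A_3^c)=0 \ne \frac14 = P(A_1^c)P(A_2^c)P(A_3^c),$$ which breaks the $k=3$ condition for the complements.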

A proof: We first give a standard definition of full independence. The events $A_1,A_2,\dots, A_n$ are fully independent if, whenever $B_1, B_2, \dots, B_k$ are distinct $A_i$, $$P(B_1\cap B_2 \cap \cdots \cap B_k)=P(B_1)P(B_2)\cdots P(B_k).$$ We show that if $A_1, A_2, \dots, A_n$ are fully independent, then so are $A_1^c,A_2^c,\dots, A_n^c$.
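For example, with $n=3$ the standard definition requires all four identities $$P(A_1\cap A_2)=P(A_1)P(A_2),\quad P(A_1\cap A_3)=P(A_1)P(A_3),\quad P(A_2\cap A_3)=P(A_2)P(A_3),$$ $$P(A_1\cap A_2\cap A_3)=P(A_1)P(A_2)P(A_3),$$ whereas the definition in the post only requires the first and last of these. The counterexample above exploits exactly this gap: there $P(A_2\cap A_3)=0\ne 1/4=P(A_2)P(A_3)$.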

There is a straightforward proof by induction. However, we prefer to avoid formal induction, in order to get a proof with more symmetry. We need to prove that if $B_1, B_2, \dots, B_k$ are distinct $A_i$, then $$P(B_1^c\cap B_2^c \cap \cdots \cap B_k^c)=P(B_1^c)P(B_2^c)\cdots P(B_k^c).$$

To save space, let $b_i=P(B_i)$. So we want to prove that $$P(B_1^c\cap B_2^c \cap \cdots \cap B_k^c)=(1-b_1)(1-b_2)\cdots (1-b_k).$$

Let $p$ be the probability on the left. Then $$1-p=P(B_1\cup B_2 \cup \cdots \cup B_k).$$ By the Principle of Inclusion/Exclusion, the right-hand side is an alternating sum of probabilities of intersections, and full independence lets us replace the probability of each intersection by the product of the probabilities: $$1-p=\sum_{i=1}^k b_i -\sum_{1 \le i <j\le k}b_ib_j+\sum_{1 \le i <j<l\le k}b_ib_jb_l-\cdots,$$ and therefore $$p=1 -\sum_{i=1}^k b_i +\sum_{1 \le i <j\le k}b_ib_j-\sum_{1 \le i <j<l\le k}b_ib_jb_l+\cdots.$$ The right-hand side is exactly the expansion of $(1-b_1)(1-b_2)\cdots (1-b_k)$. This completes the proof.
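As a sanity check, for $k=2$ this reads $$p=1-(b_1+b_2)+b_1b_2=(1-b_1)(1-b_2),$$ which is the familiar two-event case mentioned in the question.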
