Can de Morgan’s laws be applied to probability?

boolean-algebra, probability

I know and understand what de Morgan's laws are in the context of Boolean algebra and set theory, but can they be applied to probability? For example, when calculating the probability of multiple things not happening, would it be reasonable to say, given de Morgan's laws, that:

$$ \prod_{n=1}^{k}\left(1-P\left(n\right)\right)=1-\sum_{n=1}^{k}\left(P\left(n\right)\right) $$

With the first representing (for lack of a better phrase) ¬a ∧ ¬b, and the second representing ¬(a ∨ b).
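
Concretely, for two events $A$ and $B$ (writing them in place of the indices), the claim would read:

$$ \left(1-P\left(A\right)\right)\left(1-P\left(B\right)\right)=1-P\left(A\right)-P\left(B\right). $$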

Best Answer

De Morgan's laws, in set theory, say that for two sets $A$ and $B$, the complement of $A\cap B$ is $\bar{A}\cup \bar{B}$ (where $\bar{\cdot}$ denotes the complement).

This directly translates into probability, because it tells you that

$1 - P(A\cap B) = P(\bar{A}\cup\bar{B})$.
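
For a concrete (and entirely arbitrary) example, take a fair six-sided die with $A = \{2,4,6\}$ ("the roll is even") and $B = \{4,5,6\}$ ("the roll is greater than $3$"): then $P(A\cap B) = P(\{4,6\}) = \tfrac{1}{3}$, so the left-hand side is $\tfrac{2}{3}$, and indeed $\bar{A}\cup\bar{B} = \{1,3,5\}\cup\{1,2,3\} = \{1,2,3,5\}$ also has probability $\tfrac{2}{3}$.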

Now, you cannot in general decompose the right-hand side as $P(\bar{A}) + P(\bar{B}) = (1 - P(A)) + (1 - P(B))$, because that requires $\bar{A}$ and $\bar{B}$ to be disjoint events. Likewise, the probability that neither $A$ nor $B$ happens, $P(\bar{A}\cap\bar{B}) = 1 - P(A\cup B)$ (the product in your equation), cannot always be written as $(1-P(A))(1-P(B))$, because that requires $A$ and $B$ to be independent.
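
The same die example shows both failures at once: $P(\bar{A}) + P(\bar{B}) = \tfrac{1}{2} + \tfrac{1}{2} = 1$, while $P(\bar{A}\cup\bar{B}) = \tfrac{2}{3}$, because $\bar{A}$ and $\bar{B}$ overlap in $\{1,3\}$; and $(1-P(A))(1-P(B)) = \tfrac{1}{4}$, while $P(\bar{A}\cap\bar{B}) = P(\{1,3\}) = \tfrac{1}{3}$, because $A$ and $B$ are not independent.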

In general, for events $E$ and $F$, $P(E\cup F) = P(E) + P(F) - P(E \cap F)$, because you don't want to count the overlap $E\cap F$ twice.
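
If it helps, here is a minimal brute-force check of all of the above on a small sample space; the six-sided-die setup and the particular events are my own choices, just for illustration:

```python
from fractions import Fraction

# Hypothetical example: a fair six-sided die, A = "even roll", B = "roll > 3".
omega = frozenset(range(1, 7))
A = frozenset({2, 4, 6})
B = frozenset({4, 5, 6})

def P(event):
    """Probability of an event under the uniform distribution on omega."""
    return Fraction(len(event & omega), len(omega))

A_c, B_c = omega - A, omega - B  # complements of A and B

# De Morgan carried over to probability: 1 - P(A n B) = P(A_c u B_c)
assert 1 - P(A & B) == P(A_c | B_c)

# The naive decompositions fail for these events:
assert P(A_c | B_c) != P(A_c) + P(B_c)          # A_c and B_c are not disjoint
assert P(A_c & B_c) != (1 - P(A)) * (1 - P(B))  # A and B are not independent

# Inclusion-exclusion always holds:
assert P(A | B) == P(A) + P(B) - P(A & B)

print("all checks pass")
```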

Also, all of this is probability, not statistics, though that is more a matter of terminology.

Probability is concerned with the theory: assigning probabilities to events that may happen, and so on, while statistics deals with data and how you treat and interpret it.
