Joint behavior of independent Bernoulli random variables

statistics

I have three Bernoulli random variables $X$, $Y$ and $Z$, each with success probability $\frac{1}{2}$, and assume they are pairwise independent: $X \perp \!\!\! \perp Y$, $X \perp \!\!\! \perp Z$ and $Y \perp \!\!\! \perp Z$.

I have to show that
$$
W=XZ+(1-X)(1-Y),
$$

is also a Bernoulli random variable with success probability $\frac{1}{2}$.


I know how to find the joint distribution of two random variables; however, I'm unsure how to proceed with three random variables, especially since the expression involves both multiplication and addition at the same time.

Any help will be greatly appreciated.

Best Answer

Note that $W$ is either $0$ or $1$. To see this, note that $XZ$ is either $0$ or $1$, and so is $(1-X)(1-Y)$. However, they cannot both equal $1$ at the same time: either $X = 0$ (in which case $XZ = 0$), or $X = 1$ (in which case $(1-X)(1-Y) = 0$).
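As a quick sanity check (not part of the argument itself), one can enumerate all eight $0/1$ assignments of $(X, Y, Z)$ and confirm that the two summands are never $1$ simultaneously, so $W \in \{0, 1\}$. A minimal Python sketch:

```python
from itertools import product

# Brute-force check that W = XZ + (1-X)(1-Y) only takes the values 0 or 1:
# the two summands require X = 1 and X = 0 respectively, so they are never
# both 1. (Illustrative check only.)
for x, y, z in product((0, 1), repeat=3):
    w = x * z + (1 - x) * (1 - y)
    assert w in (0, 1), (x, y, z, w)
print("W takes only the values 0 and 1 for every (x, y, z) in {0,1}^3")
```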

So we know that for some $p \in [0,1]$ we have $\mathbb P(W=1) = p = 1 - \mathbb P(W=0)$.

Note that
$$
\begin{aligned}
p = \mathbb E[W] &= \mathbb E[XZ + (1-X)(1-Y)] \\
&= \mathbb E[XZ] + \mathbb E[(1-X)(1-Y)] \\
&= \mathbb E[X]\,\mathbb E[Z] + \mathbb E[1-X]\,\mathbb E[1-Y] \\
&= \tfrac{1}{4} + \tfrac{1}{4} = \tfrac{1}{2}.
\end{aligned}
$$

The equalities follow from linearity of expectation and the independence of $X$ and $Z$, and of $X$ and $Y$.

We have $\mathbb P(W=1) = \frac{1}{2} = \mathbb P(W=0)$, which is exactly what we needed to prove.
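For completeness, here is a small numerical check of $\mathbb P(W=1) = \frac{1}{2}$. It assumes the stronger condition that $X, Y, Z$ are mutually independent, so each of the eight outcomes has probability $\frac{1}{8}$; the proof above only needs pairwise independence of $(X,Z)$ and $(X,Y)$.

```python
from itertools import product
from fractions import Fraction

# Compute P(W = 1) by exact enumeration, assuming X, Y, Z are *mutually*
# independent Bernoulli(1/2), i.e. each outcome (x, y, z) has probability 1/8.
p_w1 = sum(
    Fraction(1, 8)
    for x, y, z in product((0, 1), repeat=3)
    if x * z + (1 - x) * (1 - y) == 1
)
print(p_w1)  # prints 1/2
```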
