How can we show that two random variables with a given joint PMF are independent?

Tags: probability, probability theory, random variables

A discrete random vector $(X,Y)$ has pmf given by
$p_{X,Y}(1,-1) = p_{X,Y}(-1,1) = \frac{1}{2}$.
(a) Check whether the random variables $X$ and $Y$ are independent.

In this question, since $X$ and $Y$ each take only the values $1$ and $-1$, if we can show that $p_{X,Y}(x,y) = p_{X}(x)\,p_{Y}(y)$ holds for all four pairs $\left\{(X=1, Y=1),\ (X=1, Y=-1),\ (X=-1, Y=-1),\ (X=-1, Y=1)\right\}$, then $X$ and $Y$ are independent. But should I take each value and check all four cases, or is there a smarter way to do it?

Best Answer

It is easy to show that $X$ and $Y$ are **not** independent. This is because, from the given data,

$$p_{X,Y}(-1,-1) = p_{X,Y}(1,1) = 0,$$

while both marginals satisfy $p_X(1) = p_X(-1) = p_Y(1) = p_Y(-1) = \frac{1}{2}$, so the product is $p_X(x)\,p_Y(y) = \frac{1}{4} \neq 0$. Either of these two pairs alone is a sufficient counterexample to independence.


In fact, using the given data you can derive the following contingency table:

| $p_{X,Y}$ | $Y=-1$ | $Y=1$ | $p_X$ |
|---|---|---|---|
| $X=-1$ | $0$ | $\frac{1}{2}$ | $\frac{1}{2}$ |
| $X=1$ | $\frac{1}{2}$ | $0$ | $\frac{1}{2}$ |
| $p_Y$ | $\frac{1}{2}$ | $\frac{1}{2}$ | $1$ |

where you can observe that both marginals give $P(1)=P(-1)=0.5$ but, as an example, if $X=-1$ then $Y$ can take only the value $1$, since $P(Y=1 \mid X=-1)=1$.

This contradicts the definition of independence: the distribution of $Y$ depends on the value taken by $X$.
