Show that two definitions of Independent Random Variables are Equivalent

Tags: independence, probability, probability distributions

I want to show that two random variables $X$ and $Y$ are independent iff $P(X=x, Y=y)=P(X=x)P(Y=y)$ for all $x$ and $y$.

The definition of independent random variables is that $P(X\leq x, Y \leq y)=P(X\leq x)P(Y\leq y)$ for all $x$ and $y$.

I've already done this for the continuous case, but the discrete case is causing me problems. I showed that $P(X=x, Y=y)=P(X=x)P(Y=y)$ implies independence by taking the double sum over all values less than or equal to $x$ and $y$ respectively to get the CDFs.
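For concreteness, here is a sketch of that step, assuming $X$ and $Y$ take values in a countable set (the sums run over those values):

$$P(X\le x, Y\le y) = \sum_{s\le x}\sum_{t\le y} P(X=s, Y=t) = \sum_{s\le x}\sum_{t\le y} P(X=s)P(Y=t) = \left(\sum_{s\le x} P(X=s)\right)\left(\sum_{t\le y} P(Y=t)\right) = P(X\le x)P(Y\le y).$$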

I have trouble going the other way, though. I've tried taking limits on both sides of the equation for the CDFs to obtain the pmfs, but the equation doesn't work out nicely; I end up with too many terms on the right-hand side.

Any help is appreciated, thanks

Best Answer

Suppose $X$ and $Y$ are independent. Fix some $x$ and $y$.

If the random variables are discrete, then we have
\begin{align}
P(X \le x, Y \le y) &= P(X \le x) P(Y \le y) \\
P(X \le x, Y < y) &= P(X \le x) P(Y < y) \\
P(X < x, Y \le y) &= P(X < x) P(Y \le y) \\
P(X < x, Y < y) &= P(X < x) P(Y < y)
\end{align}
The first line is just the definition of independence; the other three follow from it by continuity from below, e.g. $\{Y < y\} = \bigcup_n \{Y \le y - \tfrac1n\}$, so $P(X \le x, Y < y) = \lim_n P(X \le x, Y \le y - \tfrac1n) = \lim_n P(X \le x)P(Y \le y - \tfrac1n) = P(X \le x)P(Y < y)$.

Subtracting the second line from the first gives $P(X \le x, Y=y) = P(X \le x) P(Y=y)$. Subtracting the fourth line from the third gives $P(X<x, Y = y) = P(X<x) P(Y =y)$. Subtracting these two equations gives the desired condition.
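Written out, that last subtraction is

$$P(X = x, Y = y) = P(X \le x, Y = y) - P(X < x, Y = y) = \big(P(X \le x) - P(X < x)\big)P(Y = y) = P(X = x)P(Y = y).$$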
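If it helps to see the two conditions side by side numerically, here is a quick sanity check in Python on a toy example. The joint pmf is constructed as a product, so this only illustrates the equivalence rather than proving it, and the particular distributions are an arbitrary choice:

```python
from itertools import product

# Toy example (arbitrary choice): X uniform on {0, 1, 2}, Y uniform on {0, 1}.
px = {0: 1/3, 1: 1/3, 2: 1/3}
py = {0: 1/2, 1: 1/2}
joint = {(s, t): px[s] * py[t] for s, t in product(px, py)}

def joint_cdf(x, y):
    # P(X <= x, Y <= y)
    return sum(p for (s, t), p in joint.items() if s <= x and t <= y)

def cdf(pmf, x):
    # P(X <= x) for a marginal pmf
    return sum(p for s, p in pmf.items() if s <= x)

for x, y in product(px, py):
    # CDF factorization: P(X <= x, Y <= y) = P(X <= x) P(Y <= y)
    assert abs(joint_cdf(x, y) - cdf(px, x) * cdf(py, y)) < 1e-12
    # pmf factorization: P(X = x, Y = y) = P(X = x) P(Y = y)
    assert abs(joint[(x, y)] - px[x] * py[y]) < 1e-12

print("Both factorizations hold on this example.")
```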