Characterization of joint probability density function of independent random variables

probability distributions

Let $f(x,y)$ be a joint probability density function (pdf) of two random variables $X$ and $Y$. To check whether $X$ and $Y$ are independent, we can compute the marginal densities and check if their product equals $f(x,y)$.

My question is: Is there a characterization of functions that are pdf of independent random variables, i.e., can we "easily" decide whether $X$ and $Y$ are independent without determining the marginal density functions?

If this is not the case, is a characterization known when we restrict $f(x,y)$ to simple classes of functions, say, polynomials?

Best Answer

As Kabo Murphy points out, the test is whether $f$ factors as $f(x,y)=g(x)h(y)$.

A test for that: for almost all $x_1,x_2,y_1,y_2$, the identity $$ f(x_1,y_1)f(x_2,y_2)=f(x_1,y_2)f(x_2,y_1)\tag{*}$$ must hold.

This might be useful if you have a very complicated analytic expression for $f(x,y)$, for which the factorization is not visible but (*) can be checked algebraically or numerically.

A caveat: density functions are only defined "almost everywhere": modify a density function for a measure-$0$ set of argument values and it counts as the same density function. So (*) must hold for almost all values of the $x_i$ and $y_j$.
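For concreteness, (*) can be spot-checked numerically at a grid of sample points. Here is a minimal Python sketch; the two example densities (`independent_f`, an exponential product density, and `dependent_f`, a non-product density on the unit square) and the tolerance are my own illustrative choices, not from the question:

```python
import itertools
import math

def independent_f(x, y):
    # Product density: X, Y independent Exp(1) on (0, inf)^2
    return math.exp(-x - y)

def dependent_f(x, y):
    # Non-product density on [0,1]^2 (up to normalization): f(x,y) = x + y
    return x + y

def satisfies_star(f, points, tol=1e-12):
    # Check f(x1,y1) f(x2,y2) == f(x1,y2) f(x2,y1) for all sampled pairs
    for (x1, y1), (x2, y2) in itertools.combinations(points, 2):
        if abs(f(x1, y1) * f(x2, y2) - f(x1, y2) * f(x2, y1)) > tol:
            return False
    return True

points = [(0.1, 0.2), (0.5, 0.9), (1.3, 0.4), (2.0, 1.5)]
print(satisfies_star(independent_f, points))  # True
print(satisfies_star(dependent_f, points))    # False
```

Of course, passing such a finite check only fails to refute independence; it is the almost-everywhere identity (*) that is equivalent to factorization.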

Added later. Another test is that $$\frac{\partial^2}{\partial x\,\partial y} \log f(x,y)$$ must vanish wherever $f(x,y)>0$ (assuming $f$ is smooth enough there).
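This mixed partial can be estimated by central finite differences. A small sketch (the densities, evaluation point, and step size $h$ are my own illustrative choices), contrasting a product density with a non-product one:

```python
import math

def mixed_log_partial(f, x, y, h=1e-4):
    # Central finite-difference estimate of d^2/(dx dy) log f(x, y)
    g = lambda u, v: math.log(f(u, v))
    return (g(x + h, y + h) - g(x + h, y - h)
            - g(x - h, y + h) + g(x - h, y - h)) / (4 * h * h)

# Product density (independent): f(x,y) = 2 e^{-x} e^{-2y} on (0, inf)^2
product_f = lambda x, y: 2 * math.exp(-x) * math.exp(-2 * y)
# Non-product density, up to normalization: f(x,y) = e^{-(x+y)^2},
# whose log has mixed partial identically -2
mixed_f = lambda x, y: math.exp(-(x + y) ** 2)

print(mixed_log_partial(product_f, 0.7, 1.3))  # ~0
print(mixed_log_partial(mixed_f, 0.7, 1.3))    # ~-2
```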
