Solved – For which distributions does uncorrelatedness imply independence

correlation, distributions, independence, mathematical-statistics, probability

A time-honored reminder in statistics is "uncorrelatedness does not imply independence". Usually this reminder is supplemented with the psychologically soothing (and scientifically correct) statement "when, nevertheless, the two variables are jointly normally distributed, then uncorrelatedness does imply independence".

I can increase the count of happy exceptions from one to two: when two variables are Bernoulli-distributed, then again, uncorrelatedness implies independence. If $X$ and $Y$ are two Bernoulli rv's, $X \sim B(q_x),\; Y \sim B(q_y)$, for which we have $P(X=1) = E(X) = q_x$, and analogously for $Y$, their covariance is

$$\operatorname{Cov}(X,Y)= E(XY) - E(X)E(Y) = \sum_{S_{XY}}p(x,y)xy - q_xq_y $$

$$ = P(X=1,Y=1) - q_xq_y = P(X=1\mid Y=1)P(Y=1)-q_xq_y$$

$$= \Big(P(X=1\mid Y=1)-q_x\Big)q_y $$

For uncorrelatedness we require the covariance to be zero, so

$$\operatorname{Cov}(X,Y) = 0 \Rightarrow P(X=1\mid Y=1) = P(X=1)$$

$$\Rightarrow P(X=1,Y=1) = P(X=1)P(Y=1) $$

which is exactly the factorization needed for independence. (Since the joint pmf of two Bernoulli variables is fully determined by the two marginals and $P(X=1,Y=1)$, this single equality forces all four cell probabilities to factorize.)
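
As a quick numerical sanity check (a minimal sketch of my own, not part of the derivation above), the following Python snippet builds the joint pmf of two Bernoulli variables with zero covariance and verifies that all four cells factorize into the product of the marginals; the parameter values are arbitrary:

```python
import itertools

def bernoulli_joint(q_x, q_y, p11):
    """Joint pmf of two Bernoulli rv's with marginals q_x, q_y and
    P(X=1, Y=1) = p11; the other three cells follow from the marginals."""
    return {
        (1, 1): p11,
        (1, 0): q_x - p11,
        (0, 1): q_y - p11,
        (0, 0): 1 - q_x - q_y + p11,
    }

q_x, q_y = 0.3, 0.7
# Zero covariance forces P(X=1, Y=1) = q_x * q_y.
joint = bernoulli_joint(q_x, q_y, q_x * q_y)

marg_x = {1: q_x, 0: 1 - q_x}
marg_y = {1: q_y, 0: 1 - q_y}

for x, y in itertools.product((0, 1), repeat=2):
    assert abs(joint[(x, y)] - marg_x[x] * marg_y[y]) < 1e-12
print("zero covariance -> all four cells factorize -> independent")
```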

So my question is: Do you know of any other distributions (continuous or discrete) for which uncorrelatedness implies independence?

Meaning: Assume two random variables $X,Y$ whose marginal distributions belong to the same family (perhaps with different values of the distribution parameters involved), but, let's say, with the same support, e.g. two exponentials, two triangulars, etc. Are all solutions to the equation $\operatorname{Cov}(X,Y) = 0$ such that they also imply independence, by virtue of the form/properties of the distribution functions involved? This is the case with normal marginals (given also that they have a bivariate normal distribution), as well as with Bernoulli marginals; are there any other cases?

The motivation here is that it is usually easier to check whether the covariance is zero than to check whether independence holds. So if, given the theoretical distribution, checking the covariance also amounts to checking independence (as is the case with the Bernoulli or normal distributions), this would be a useful thing to know.
If we are given two samples from two r.v.'s that have normal marginals, we know that if we can statistically conclude from the samples that their covariance is zero, we can also say that they are independent (but only because they have normal marginals). It would be useful to know whether we could conclude likewise in cases where the two rv's had marginals that belonged to some other distribution.

Best Answer

"Nevertheless if the two variables are normally distributed, then uncorrelatedness does imply independence" is a very common fallacy.

That only applies if they are jointly normally distributed.

The counterexample I have seen most often is normal $X \sim N(0,1)$ and an independent Rademacher $Y$ (so $Y$ is $1$ or $-1$ with probability $0.5$ each); then $Z=XY$ is also normal (clear from considering its distribution function). Moreover $\operatorname{Cov}(X,Z)=0$: the key step is to show $\mathbb{E}(XZ)=0$, e.g. by iterating expectation on $Y$ and noting that $XZ$ is $X^2$ or $-X^2$ with probability $0.5$ each. Yet the variables are clearly dependent: e.g. if I know $X>2$ then either $Z>2$ or $Z<-2$, so information about $X$ gives me information about $Z$.
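
A quick simulation (my own illustrative sketch, not part of the original answer) makes both features of the counterexample visible: the sample correlation between $X$ and $Z$ is near zero, yet $|Z|=|X|$ exactly, so the two variables are strongly dependent:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

x = rng.standard_normal(n)            # X ~ N(0, 1)
y = rng.choice([-1.0, 1.0], size=n)   # independent Rademacher Y
z = x * y                             # Z = XY, also N(0, 1)

print("sample corr(X, Z):", np.corrcoef(x, z)[0, 1])               # ~ 0
print("max | |Z| - |X| |:", np.max(np.abs(np.abs(z) - np.abs(x)))) # exactly 0

# Dependence: knowing X > 2 tells us |Z| > 2 with certainty.
mask = x > 2
print("P(|Z| > 2 | X > 2):", np.mean(np.abs(z[mask]) > 2))         # 1.0
```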

It's also worth bearing in mind that marginal distributions do not uniquely determine the joint distribution. Take any two real RVs $X$ and $Y$ with marginal CDFs $F_X(x)$ and $F_Y(y)$. Then for any $|\alpha|\le 1$ the function:

$$H_{X,Y}(x,y)=F_X(x)F_Y(y)\left(1+\alpha\big(1-F_X(x)\big)\big(1-F_Y(y)\big)\right)$$

will be a bivariate CDF. (To obtain the marginal $F_X(x)$ from $H_{X,Y}(x,y)$, take the limit as $y$ goes to infinity, where $F_Y(y)=1$; vice versa for $Y$.) Clearly, by selecting different values of $\alpha$ you can obtain different joint distributions with the same marginals!
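
To make the non-uniqueness concrete, here is a hedged illustration (the exponential marginals, the parameter values, and the evaluation point are my own choices, not part of the answer): the joint CDF above is evaluated for several values of $\alpha$, giving different joint distributions, while the marginal recovered in the limit is always the same:

```python
import numpy as np

def H(x, y, alpha, F_X, F_Y):
    """Joint CDF built from two marginal CDFs via the construction above."""
    u, v = F_X(x), F_Y(y)
    return u * v * (1 + alpha * (1 - u) * (1 - v))

F_X = lambda x: 1 - np.exp(-x)       # Exp(1) marginal (arbitrary choice)
F_Y = lambda y: 1 - np.exp(-2 * y)   # Exp(2) marginal (arbitrary choice)

# Different alpha -> different joint CDF values at the same point.
for alpha in (-1.0, 0.0, 0.5, 1.0):
    print(f"alpha={alpha:+.1f}  H(1.0, 0.5) = {H(1.0, 0.5, alpha, F_X, F_Y):.4f}")

# The marginals do not depend on alpha: letting y -> infinity recovers F_X.
print("H(1.0, inf) =", H(1.0, 1e9, 0.7, F_X, F_Y), "  F_X(1.0) =", F_X(1.0))
```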