[Math] Covariance zero for two gaussian variables

probability, statistics

Say we have two random variables $X$ and $Y$, both of which have a Gaussian distribution. Further, we know that $\operatorname{cov}(X,Y) = 0$, where $\operatorname{cov}(X,Y)$ is the covariance of the two variables (i.e., $\operatorname{cov}(X,Y) = E[(X-E[X])(Y-E[Y])]$, where $E[X]$ is the mean (expectation) of the variable $X$).

Can we say that $X$ and $Y$ are independent variables? I know that, in general, $\operatorname{cov}(X,Y) = 0$ does not imply that $X$ and $Y$ are independent, but what about the case when $X$ and $Y$ have a Gaussian (normal) distribution? Can we take this as a theorem?

Best Answer

For jointly (per @Did) normal random variables, uncorrelated implies independent. In particular, when the covariance is zero the joint density function factors into the product of the two marginal density functions.
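To spell out the factorization: the standard bivariate normal density with means $\mu_X,\mu_Y$, standard deviations $\sigma_X,\sigma_Y$, and correlation $\rho$ is

$$f(x,y) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}}\exp\!\left(-\frac{1}{2(1-\rho^2)}\left[\frac{(x-\mu_X)^2}{\sigma_X^2} - \frac{2\rho(x-\mu_X)(y-\mu_Y)}{\sigma_X\sigma_Y} + \frac{(y-\mu_Y)^2}{\sigma_Y^2}\right]\right).$$

For a jointly normal pair, zero covariance means $\rho = 0$, and setting $\rho = 0$ above gives

$$f(x,y) = \frac{1}{\sigma_X\sqrt{2\pi}}e^{-(x-\mu_X)^2/2\sigma_X^2}\cdot\frac{1}{\sigma_Y\sqrt{2\pi}}e^{-(y-\mu_Y)^2/2\sigma_Y^2} = f_X(x)\,f_Y(y),$$

which is exactly the definition of independence for continuous random variables.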

Also, for normal data, the sample mean $\bar X$ and sample SD $S$ are independent. (Proof via linear algebra or moment generating functions.) Conversely, $\bar X$ and $S$ are independent only for normal data.

In the left panel below, $S$ is plotted against $\bar X$ for 30,000 randomly generated standard normal datasets of size $n = 5$. As a 'naturally occurring' instance where zero correlation and dependence coexist, the right panel shows the same for 30,000 samples of size $n = 5$ from $\mathrm{Beta}(.5, .5)$. For these beta data, $\bar X$ and $S$ are uncorrelated but not independent.

m = 30000;  n = 5                                  # 30,000 samples of size 5
x = rnorm(m*n);  NRM = matrix(x, nrow=m)           # standard normal data, one sample per row
ax = rowMeans(NRM);  sx = apply(NRM, 1, sd)        # sample means and SDs
cor(ax, sx)
## -0.001177232  # consistent with uncorrelated
y = rbeta(m*n, .5, .5);  BTA = matrix(y, nrow=m)   # Beta(.5, .5) data
ay = rowMeans(BTA);  sy = apply(BTA, 1, sd)
cor(ay, sy)
## -0.001677063  # consistent with uncorrelated
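The dependence in the beta case is easy to exhibit numerically. Because $\mathrm{Beta}(.5,.5)$ lives on $[0,1]$ with mean $.5$, a sample whose mean sits near either edge is forced to have a small SD, so $S$ is strongly (negatively) related to $|\bar X - .5|$ even though $\operatorname{cor}(\bar X, S) \approx 0$. A Python sketch mirroring the R simulation above (variable names are my own):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 30000, 5

# 30,000 Beta(.5, .5) samples of size 5, one per row
bta = rng.beta(0.5, 0.5, size=(m, n))
ay = bta.mean(axis=1)             # sample means
sy = bta.std(axis=1, ddof=1)      # sample SDs (denominator n - 1, as in R's sd)

# Correlation of X-bar and S: near zero, as in the R run
print(np.corrcoef(ay, sy)[0, 1])

# But S and |X-bar - .5| are strongly negatively correlated:
# means near 0 or 1 force all five values to one edge, hence small SD
print(np.corrcoef(np.abs(ay - 0.5), sy)[0, 1])
```

The second correlation being far from zero shows $(\bar X, S)$ cannot be independent, since independence would make $S$ uncorrelated with every function of $\bar X$.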

[Figure: scatterplots of $S$ against $\bar X$; left panel, standard normal samples; right panel, $\mathrm{Beta}(.5,.5)$ samples.]
