Independence of the linear combinations of normal random variables

normal-distribution, probability, probability-theory, random-variables, statistical-inference

Are linear combinations of independent normally distributed random variables also independent?
Suppose we have two independent standard normal variables $X\sim N(0,1)$, $Y\sim N(0,1)$. Let $U=X+Y$ and $V=X-Y$. Clearly $U$ and $V$ are also normally distributed: $U\sim N(0,2)$, $V\sim N(0,2)$.

We can show that their covariance is zero (since $E\{U\}=E\{V\}=0$, the covariance equals $E\{UV\}$): $E\{UV\}=E\{(X+Y)(X-Y)\}=E\{X^2\}-E\{Y^2\}=1-1=0.$ But are they also independent? Is there a quick way to check it?
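As a quick numerical sanity check (a minimal sketch using NumPy; the sample size and seed are arbitrary choices), one can simulate the pair and look not only at the covariance of $U$ and $V$ but also at the correlation between nonlinear transforms such as $U^2$ and $V^2$, which mere uncorrelatedness would not force to vanish:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.standard_normal(n)  # X ~ N(0,1)
y = rng.standard_normal(n)  # Y ~ N(0,1), independent of X

u = x + y
v = x - y

# Empirical covariance of U and V: should be near 0.
print(np.cov(u, v)[0, 1])

# A crude check beyond covariance: if U and V are independent,
# nonlinear transforms such as U^2 and V^2 are also uncorrelated.
print(np.corrcoef(u**2, v**2)[0, 1])
```

Both printed values come out near zero, consistent with independence, though of course a simulation is evidence rather than proof.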

Obviously, if we choose the coefficients in $U=aX+bY$, $V=cX+dY$ to be $a=b=c=d=1$, then $U=V=X+Y$ and the two variables are dependent; and for arbitrary variances, expectations, and coefficients the covariance need not be zero. It feels like this has some connection to linear algebra and the vectors $(a,b)$, $(c,d)$ formed by these coefficients, but I don't have enough background to elaborate on that. Is there some general result on linear combinations of normal random variables and their covariance and dependence?

Best Answer

Yes, for jointly normal random variables, zero covariance is necessary and sufficient for independence. This holds not only for pairs of variables but also in blocks: if $(X_1,\ldots,X_m)$ and $(Y_1,\ldots,Y_n)$ are jointly normal, and if all the covariances $\text{Cov}(X_i,Y_j)$ vanish, then the collection of $X_i$ is independent of the collection of $Y_j$.
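In your example this is easy to verify directly: $(U,V)$ is a linear image of the Gaussian vector $(X,Y)$, so it is jointly normal with mean zero and covariance matrix $2I_2$, and the joint density factors into the product of the marginals:

$$f_{U,V}(u,v)=\frac{1}{2\pi\sqrt{\det(2I_2)}}\exp\!\left(-\frac{u^2+v^2}{4}\right)=\underbrace{\frac{1}{\sqrt{4\pi}}\,e^{-u^2/4}}_{f_U(u)}\cdot\underbrace{\frac{1}{\sqrt{4\pi}}\,e^{-v^2/4}}_{f_V(v)}.$$

The same mechanism drives the general statement: a block-diagonal covariance matrix makes the joint normal density split into a product over the blocks.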

And your intuition that linear algebra has a lot to do with this is of course correct. Basically, all the action is in covariance matrices; these represent quadratic forms, and off you go. There is an old textbook Elements of continuous multivariate analysis by Arthur Dempster that explores this connection in depth, and I'm sure there are more recent books like that, too.
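To make the linear-algebra connection concrete: if $Z$ is a jointly normal vector with covariance matrix $\Sigma$, then for any coefficient vectors $a$ and $c$,

$$\operatorname{Cov}(a^{\mathsf T}Z,\; c^{\mathsf T}Z)=a^{\mathsf T}\Sigma\,c,$$

so the two linear combinations are independent precisely when this quadratic-form pairing vanishes. In the standard case $\Sigma=I$ that is just orthogonality of the coefficient vectors: for $U=X+Y$, $V=X-Y$ we get $(1,1)\cdot(1,-1)=0$, while $a=b=c=d=1$ gives $(1,1)\cdot(1,1)=2\neq 0$, matching the dependence you observed.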