Are two orthogonal linear transformations of the same random Gaussian vector independent

Tags: independence, orthogonality, probability


I want to prove/disprove the following claim:

If $e = (e_{1},\dots, e_{n})$ is a vector of independent random variables (each $e_{i}$ normally distributed), and $v_{1}, \dots, v_{n} \in \mathbb{R}^{n}$ are $n$ orthogonal vectors, then $\langle v_{1}, e \rangle, \dots, \langle v_{n}, e \rangle$ are independent.

And I'm a bit puzzled. On one hand, it seems like this question supports a positive answer to the claim. The covariance matrix $C$ of $e$ is scalar, and the vectors $v_{1},\dots, v_{n}$ define a matrix $A$ such that $A A^{T}$ is diagonal, so according to the explanation there $ACA^{T}$ is diagonal, and independence should follow.

On the other hand, the claim fails hard on a toy example, when the values of $e$ are drawn from a discrete uniform distribution. For example, take $e = (e_{1},e_{2})$ to be a vector of two independent uniform variables on $\{0,\dots,p\}$, and $v_{1} = (1, 1)$, $v_{2} = (1, -1)$. Clearly, if $\langle v_{1}, e \rangle = 2p$ then $\langle v_{2}, e \rangle = 0$, so the two inner products cannot be independent.
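A quick simulation confirms the dependence (a minimal sketch; the choice $p = 3$ is mine, for concreteness):

```python
import numpy as np

# Counterexample setup: e1, e2 uniform on {0, ..., p}, v1 = (1, 1), v2 = (1, -1).
# If <v1, e> = 2p, then <v2, e> must be 0, so the inner products are dependent.
rng = np.random.default_rng(0)
p = 3
e = rng.integers(0, p + 1, size=(100_000, 2))
s = e[:, 0] + e[:, 1]   # <v1, e>
d = e[:, 0] - e[:, 1]   # <v2, e>

# Conditional on s == 2p, d is always 0:
print(set(d[s == 2 * p]))   # {0}
# Unconditionally, d ranges over all of {-p, ..., p}:
print(sorted(set(d.tolist())))
```

Note that $s$ and $d$ are still *uncorrelated* here (the ranges are equal), which is exactly the gap between uncorrelatedness and independence.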

What am I missing? I can't see the problem with the proof for the normal distribution, nor find a reason the same proof wouldn't work for the uniform distribution (where the claim is obviously false, given my counterexample).

Thanks!

Best Answer

Warning: this answer is wrong, see correction in the "Update" below


In general: if ${\bf x}=(x_1,\dots, x_n)'$ is uncorrelated (meaning that the elements $x_i$ are pairwise uncorrelated), and if $A$ is an $n \times n$ orthogonal matrix, then it's easy to prove that the vector

$${\bf y} = A {\bf x}$$

is also uncorrelated. That is, if $C_{\bf x}$ (covariance matrix of ${\bf x}$) is diagonal, so is $C_{\bf y}$.

Now, if additionally the $x_i$ are Gaussian variables, then the components of ${\bf x}$ are not only uncorrelated but independent. The result above implies that ${\bf y} = A {\bf x}$ is uncorrelated; but, because ${\bf y}$ is also jointly Gaussian, its components are also independent.

Hence your claim is correct.

Now if instead the $x_i$ are merely independent (hence uncorrelated) but not Gaussian, then all we can say about ${\bf y} = A {\bf x}$ is that the $y_i$ are uncorrelated. They are not, in general, independent, as your example illustrates.


Update: this answer, as is, is blatantly wrong. To make it right, one needs an essential additional assumption: that the variables have the same variance, that is, that $C_{\bf x}$ is not only diagonal but also constant along the diagonal.
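Writing out the covariance computation shows where this assumption enters. For any matrix $A$,

$$C_{\bf y} = \operatorname{Cov}(A{\bf x}) = A\, C_{\bf x}\, A'.$$

If $C_{\bf x} = \sigma^2 I$, then $C_{\bf y} = \sigma^2 A A' = \sigma^2 I$ by orthogonality, so uncorrelatedness is preserved. But if, say, $C_{\bf x} = \operatorname{diag}(\sigma_1^2, \sigma_2^2)$ with $\sigma_1^2 \neq \sigma_2^2$ and $A$ is the $45^\circ$ rotation, then

$$A C_{\bf x} A' = \frac{1}{2}\begin{pmatrix} \sigma_1^2+\sigma_2^2 & \sigma_1^2-\sigma_2^2 \\ \sigma_1^2-\sigma_2^2 & \sigma_1^2+\sigma_2^2 \end{pmatrix},$$

whose off-diagonal entries are nonzero.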

Without this assumption, it's false that the map ${\bf x} \mapsto A {\bf x}$ preserves uncorrelatedness. This can be easily seen in the case of two uncorrelated jointly Gaussian variables (hence independent) with different variances: the orthogonal transformation amounts to a rotation, which produces ellipses (level curves of the density function) that are no longer aligned with the axes.
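A numeric check of this rotation picture (a sketch; the variances $1$ and $9$ and the $45^\circ$ angle are my choices for illustration):

```python
import numpy as np

# Two independent Gaussians with unequal variances (1 and 9), rotated by 45
# degrees: the rotated components have covariance (sigma1^2 - sigma2^2)/2 = -4.
rng = np.random.default_rng(0)
n = 1_000_000
x = np.stack([rng.normal(0, 1, n), rng.normal(0, 3, n)])  # C_x = diag(1, 9)

theta = np.pi / 4
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])           # orthogonal (rotation)
y = A @ x

print(np.cov(y)[0, 1])   # near (1 - 9)/2 = -4, so y0, y1 are correlated
```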

Incidentally, this also applies to the example in the question: if $X$ and $Y$ are independent uniform variables with different ranges, then $X+Y$ and $X-Y$ are correlated, since $\operatorname{Cov}(X+Y,\, X-Y) = \operatorname{Var}(X) - \operatorname{Var}(Y) \neq 0$.