Suppose that I have a vector $\mathbf{x}$ that contains $n$ independent and identically distributed (i.i.d.) zero-mean Gaussian random variables $x_i\sim\mathcal{N}(0,\sigma^2)$.
Also suppose I have a uniformly random rotation defined by a matrix $\mathbf{R}$ (that is, $\mathbf{R}$ drawn from the Haar measure on the rotation group of $\mathbb{R}^n$), which changes the angle of $\mathbf{x}$ with respect to each basis vector of $\mathbb{R}^n$ by an amount drawn uniformly at random from $[0,2\pi]$.
I am interested in the conditional distribution of $\mathbf{y}=\mathbf{Rx}$ given $\mathbf{x}$. It seems to me that $\mathbf{y}$ should contain i.i.d. zero-mean Gaussian random variables $y_i\sim\mathcal{N}(0,\sigma^2)$. Is that true? If so, how does one prove it?
I know that the distribution of a vector of i.i.d. zero-mean Gaussians is invariant under rotation. I also know that any given rotation is a linear transformation, so if we know $\mathbf{R}$, then, obviously, $\mathbf{y}$ given $\mathbf{x}$ is deterministic and not random. I am wondering about the case when $\mathbf{R}$ is random. This question arises out of a study of a very strange interference channel in information theory. I appreciate any hints.
Best Answer
If $\mathbf{R}$ is uniform (Haar-distributed), then, conditionally on $\mathbf{x}$, $\mathbf{y}=\mathbf{Rx}$ is uniformly distributed on the sphere centered at $\mathbf{0}$ with radius $\|\mathbf{x}\|$. In particular, the conjecture is false: given $\mathbf{x}$, the components of $\mathbf{y}$ cannot be i.i.d. Gaussian, because they are constrained to satisfy $\|\mathbf{y}\|=\|\mathbf{x}\|$.

To prove uniformity, note that for any fixed rotation $\mathbf{Q}$, the matrix $\mathbf{QR}$ has the same (Haar) distribution as $\mathbf{R}$, so $\mathbf{Qy}$ has the same conditional distribution given $\mathbf{x}$ as $\mathbf{y}$. The uniform distribution is the only distribution on the sphere that is invariant under every rotation.
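A quick numerical check of this claim, assuming NumPy: a standard way to sample a Haar-distributed orthogonal matrix is to QR-decompose a matrix of i.i.d. Gaussians and correct the column signs. (This samples from the full orthogonal group rather than only proper rotations, which makes no difference for the sphere statement.) For a fixed $\mathbf{x}$, every sample $\mathbf{Rx}$ should then have norm exactly $\|\mathbf{x}\|$, and the sample mean should be near $\mathbf{0}$, as uniformity on the sphere predicts. The helper name `haar_rotation` is mine, not from the thread.

```python
import numpy as np

def haar_rotation(n, rng):
    """Sample a Haar-distributed n x n orthogonal matrix.

    QR-decompose a Gaussian matrix, then flip column signs so that
    the diagonal of R is positive; without this correction the QR
    output is not exactly Haar-distributed.
    """
    A = rng.standard_normal((n, n))
    Q, R = np.linalg.qr(A)
    return Q * np.sign(np.diag(R))  # scales column j by sign(R[j, j])

rng = np.random.default_rng(0)
n, sigma = 5, 2.0
x = sigma * rng.standard_normal(n)  # one fixed realization of x

# Conditionally on x, every y = R x lies on the sphere of radius ||x|| ...
samples = np.array([haar_rotation(n, rng) @ x for _ in range(10_000)])
norms = np.linalg.norm(samples, axis=1)
assert np.allclose(norms, np.linalg.norm(x))  # rotations preserve length

# ... and the direction is uniform on that sphere, so the sample mean
# of y should be close to the zero vector.
assert np.abs(samples.mean(axis=0)).max() < 0.15
```

So conditionally on $\mathbf{x}$, the components of $\mathbf{y}$ are exchangeable but dependent (their squares sum to the constant $\|\mathbf{x}\|^2$); it is only marginally, after averaging over $\mathbf{x}$, that $\mathbf{y}$ is again i.i.d. Gaussian.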