Showing the independence of two random centred Gaussian vectors.

Tags: independence, normal distribution, probability, probability distributions

Let $X = (X_1,\ldots, X_d)$ be a centred Gaussian vector with i.i.d. components. I have two questions; the first is whether my approach is correct.

  1. I want to show that, $O$ being an orthogonal $d\times d$ matrix, $OX$ has the same law as $X$.

I did it as follows:

I say that a general Gaussian vector $X$ has the law $N(\mu_X, \Sigma_X)$. I want to show that $Y = OX$ has the same law as $X$. Since $X = O^{-1}Y$ and $|\det O| = 1$, the change of variables gives $p_Y(y) \propto p_X(O^{-1}y)$

(omitting the normalisation constant)

\begin{align}
&\propto \exp\left[-\frac12(O^{-1}y - \mu_X)^T \Sigma_X^{-1}(O^{-1}y-\mu_X)\right]
\\&= \exp\left[-\frac12(y - O\mu_X)^T O^{-T} \Sigma_X^{-1}O^{-1}(y-O\mu_X)\right]
\\&= \exp\left[-\frac12 (y-O \mu_X)^T (O\Sigma_X O^T)^{-1}(y-O\mu_X)\right]\,,
\end{align}

which is, up to normalisation, the density of $N(O\mu_X , O\Sigma_X O^T)$, so $Y \sim N(O\mu_X , O\Sigma_X O^T)$.

Therefore this has the same law as $X$. Is this correct? And if so, what is the last sentence of my argument?

  2. I want to show that when $a=(a_1,\ldots,a_d)$ and $b=(b_1,\ldots,b_d)$ are two orthogonal vectors in $\mathbb{R}^d$, then, by considering an orthogonal matrix $O$ whose first two columns coincide with $a$ and $b$, the sums $\sum_{i=1}^d a_i X_i$ and $\sum_{i=1}^d b_i X_i$ are independent. How is this done?

Best Answer

If the components of $X$ are centered and i.i.d., you can write $X\sim\mathcal{N}\left(0,\sigma^2 I\right)$ where $\sigma^2$ is their common variance. Then, because any linear transformation of a Gaussian vector is Gaussian, $OX$ is also Gaussian with mean $O\cdot 0 = 0$ and covariance matrix $O\left(\sigma^2 I\right)O^\top = \sigma^2 OO^\top = \sigma^2 I$.
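If you want an empirical sanity check of this rotation invariance (not a substitute for the argument above), the sketch below samples $X \sim \mathcal{N}(0,\sigma^2 I)$, applies a random orthogonal matrix, and compares sample covariances; the dimension, variance, and sample size are arbitrary illustrative choices, not from the question.

```python
# Empirical check only: compare sample covariances of X and OX.
import numpy as np

rng = np.random.default_rng(0)
d, sigma, n_samples = 3, 2.0, 200_000            # illustrative choices

# A random orthogonal matrix from the QR decomposition of a Gaussian matrix.
O, _ = np.linalg.qr(rng.standard_normal((d, d)))

X = sigma * rng.standard_normal((n_samples, d))  # rows are i.i.d. N(0, sigma^2 I) samples
Y = X @ O.T                                      # each row is O applied to the corresponding x

print(np.cov(X, rowvar=False).round(2))          # ~ sigma^2 * I
print(np.cov(Y, rowvar=False).round(2))          # also ~ sigma^2 * I
```

Both printed matrices should be close to $\sigma^2 I$, consistent with $OX\sim\mathcal{N}(0,\sigma^2 I)$.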

As for the second part, you can write
\begin{eqnarray*}
\operatorname{Cov}\left(\sum_i a_i X_i,\sum_j b_j X_j\right) &=& \sum_i \sum_j a_i b_j \mathbb{E}\left(X_i X_j\right)\\
&=& \sum_i a_ib_i \mathbb{E}\left(X^2_i\right)\\
&=& \sigma^2 \sum_i a_ib_i\\
&=& 0,
\end{eqnarray*}
where the first equality follows from the fact that the means of both random variables are zero together with the bilinearity of covariance, and the second line follows from the independence of $X_i$ and $X_j$ for $i\neq j$, which gives $\mathbb{E}(X_iX_j)=\mathbb{E}(X_i)\,\mathbb{E}(X_j)=0$. Two jointly Gaussian random variables are independent if and only if their covariance is zero; zero covariance does not imply independence in general, but here $\left(\sum_i a_iX_i,\sum_j b_jX_j\right)$ is jointly Gaussian because it is a linear transformation of the Gaussian vector $X$ (for instance, the first two components of $O^\top X$ when the first two columns of $O$ are $a$ and $b$).
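As a numerical illustration of this covariance computation (the vectors $a$, $b$, the variance, and the sample size below are made up for the demonstration), one can check that the sample covariance of $\sum_i a_i X_i$ and $\sum_i b_i X_i$ is close to zero:

```python
# Empirical check only: Cov(a.X, b.X) should be near sigma^2 * <a, b> = 0.
import numpy as np

rng = np.random.default_rng(1)
d, sigma, n_samples = 4, 1.5, 200_000            # illustrative choices

a = np.array([1.0, 1.0, 0.0, 0.0]) / np.sqrt(2)  # unit vector
b = np.array([1.0, -1.0, 0.0, 0.0]) / np.sqrt(2) # unit vector orthogonal to a
assert abs(a @ b) < 1e-12

X = sigma * rng.standard_normal((n_samples, d))  # i.i.d. N(0, sigma^2) components
U, V = X @ a, X @ b                              # U = sum_i a_i X_i,  V = sum_i b_i X_i

print("sample Cov(U, V):", np.cov(U, V)[0, 1])   # close to 0
print("sample Var(U):", U.var())                 # close to sigma^2 * |a|^2 = sigma^2
```

With larger sample sizes the estimates concentrate around $\sigma^2\langle a,b\rangle = 0$ and $\sigma^2\|a\|^2$ respectively.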