Solved – Mutual Independence in a Multivariate Normal with Identity Covariance

Tags: distributions, independence, multivariate-normal-distribution, normal-distribution, probability

Consider a random vector $X$ that follows a multivariate normal distribution with zero mean vector and identity covariance matrix:

$X\sim \mathcal{N}_n(\mathbf 0, \mathbf I)$

We can say that the individual variables $X_1, X_2, \ldots, X_n$ are pairwise independent, since for any pair of components of a multivariate normal, zero correlation implies independence (see [1]).

However, I am struggling to prove mutual independence of the individual variables. Note that pairwise independence does not generally imply mutual independence; the latter is a strictly stronger condition. A classical counterexample: if $X_1, X_2$ are independent fair coin flips and $X_3 = X_1 \oplus X_2$, then the three variables are pairwise independent but not mutually independent (see the sketch below). In other words, I want to prove not only that any pair from $X_1, X_2, \ldots, X_n$ is independent, but that each $X_i$ is independent of the joint behavior of all the remaining variables, i.e., that the joint density factors into the product of the $n$ marginal densities.
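A quick Python sketch of that counterexample, for concreteness (constructed here as an illustration, not from any reference):

```python
import itertools

# Bernstein-style example: X1, X2 are independent fair coin flips,
# X3 = X1 XOR X2. The three are pairwise but not mutually independent.
outcomes = [(x1, x2, x1 ^ x2) for x1, x2 in itertools.product([0, 1], repeat=2)]
p = 1 / len(outcomes)  # each of the 4 outcomes has probability 1/4

def prob(event):
    """Probability that every listed coordinate takes its listed value."""
    return sum(p for o in outcomes if all(o[i] == v for i, v in event))

# Every pair is independent: P(Xi=1, Xj=1) = 1/4 = P(Xi=1) * P(Xj=1).
for i, j in [(0, 1), (0, 2), (1, 2)]:
    assert prob([(i, 1), (j, 1)]) == prob([(i, 1)]) * prob([(j, 1)])

# But the triple is not: P(X1=1, X2=1, X3=1) = 0, not 1/8.
print(prob([(0, 1), (1, 1), (2, 1)]))                    # 0.0
print(prob([(0, 1)]) * prob([(1, 1)]) * prob([(2, 1)]))  # 0.125
```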

I also suspect that mutual independence holds for any diagonal covariance matrix.

[1] Robert V. Hogg, Joseph W. McKean, and Allen T. Craig. *Introduction to Mathematical Statistics*, 7th ed. Pearson, 2013, pp. 182–183. ISBN 978-0-321-84943-4.


Best Answer

Here is a partial answer. Suppose $D$ is a diagonal matrix whose diagonal entries $d_1, \ldots, d_n$ are positive, and let $\mathbf y' = (y_1, \ldots, y_n)$. Then
\begin{align}
\exp\left( -\frac{1}{2} \mathbf y' D^{-1} \mathbf y \right) &= \exp\left( -\frac{1}{2} \sum_{i=1}^n \frac{y_i^2}{d_i} \right) \\[8pt]
&= \prod_{i=1}^n \exp\left( -\frac{1}{2} \frac{y_i^2}{d_i} \right).
\end{align}
Since the normalizing constant $(2\pi)^{-n/2} (\det D)^{-1/2} = \prod_{i=1}^n (2\pi d_i)^{-1/2}$ factors the same way, the joint density is the product of the $n$ marginal $\mathcal N(0, d_i)$ densities, which is precisely mutual independence.

This stops short of treating the case in which
$$
D = \operatorname{var}(Y) = \operatorname{E}(YY') \in \mathbb{R}^{n \times n}
$$
is singular.
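A quick numerical check of this factorization in the nonsingular diagonal case (a sketch added for illustration, not part of the derivation above), using NumPy/SciPy:

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

# For a diagonal covariance D with positive entries, the joint N(0, D)
# density should equal the product of the marginal N(0, d_i) densities,
# which is exactly mutual independence.
rng = np.random.default_rng(0)
d = np.array([0.5, 2.0, 1.0])          # diagonal entries d_1, ..., d_n
D = np.diag(d)

y = rng.standard_normal(3)             # an arbitrary test point
joint = multivariate_normal(mean=np.zeros(3), cov=D).pdf(y)
product = np.prod(norm.pdf(y, loc=0.0, scale=np.sqrt(d)))

print(np.isclose(joint, product))      # True
```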
