Solved – Independence of multivariate normal distribution

independence, multivariate-analysis, normal-distribution, self-study

Consider the multivariate normally distributed random vector $X = [X_1, X_2, X_3]^T$ with

$$ \mu = \begin{pmatrix} -3 \\ 1 \\ 4\end{pmatrix} \quad \quad \Sigma = \begin{pmatrix} 4 & -1 & 0\\ -1 & 5 & 0\\ 0 & 0 & 2\end{pmatrix}$$

I need to find out whether the following random variables are independent:

  • $(X_1, X_2)$ and $X_3$
  • $X_1 - X_2$ and $X_1 + X_2 - X_3$

How can I determine that? I see that, e.g., $X_1$ and $X_3$ have covariance $0$. Normally this doesn't imply that they are independent, but (as I've learned from Wikipedia) since $X_1$ and $X_3$ are jointly normally distributed here, it does in fact follow. But how do I deal with the sums and differences?

Best Answer

Indeed, for jointly normally distributed variables, uncorrelatedness implies independence. For your first case, independence between the random vector $(X_1, X_2)$ and the scalar random variable $X_3$ can be shown formally by verifying that the mean and covariance matrix of $(X_1, X_2)$ conditional on $X_3$ are equal to the unconditional mean and covariance matrix. Using the notation of the Wikipedia article on the multivariate normal distribution,

$\Sigma$ is the $3\times 3$ covariance matrix of the joint distribution of your three variables. It is then partitioned into sub-matrices of unequal sizes. Denoting by $v_{ij}$ the elements of $\Sigma$, we have

$$\Sigma_{11} =\left[\begin{matrix} v_{11} &v_{12} \\ v_{21} & v_{22} \end{matrix}\right] = \left[\begin{matrix} 4 &-1 \\ -1 & 5 \end{matrix}\right] $$

$$\Sigma_{12} =\left[\begin{matrix} v_{13} \\v_{23} \end{matrix}\right] = \left[\begin{matrix} 0 \\0 \end{matrix}\right] \qquad \Sigma_{21} = \Sigma_{12}'$$

$$\Sigma_{22} = v_{33} = \sigma_3^2 = 2 $$

Then the conditional expectation of the vector $(X_1,X_2)$ given $X_3$ is

$$E\Big[(X_1,X_2)\mid X_3\Big] = \left[\begin{matrix} \mu_1 \\ \mu_2\end{matrix} \right] + \Sigma_{12}\Sigma^{-1}_{22}(X_3 - \mu_3) $$

which is equal to the unconditional mean vector since $\Sigma_{12} = \mathbf 0$. The analogous result holds for the covariance matrix.
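Explicitly, by the same Wikipedia formula, the conditional covariance matrix is the Schur complement

$$\operatorname{Var}\Big[(X_1,X_2)\mid X_3\Big] = \Sigma_{11} - \Sigma_{12}\Sigma^{-1}_{22}\Sigma_{21} = \Sigma_{11}$$

again because $\Sigma_{12} = \mathbf 0$, so the conditional and unconditional covariance matrices coincide as well, and independence follows.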

As for your second question, one can proceed as follows:

Define $Y_1 = X_1 - X_2$ and $Y_2 = X_1 + X_2 - X_3$. As linear combinations of the jointly normal vector $X$, the $Y$'s are not merely normally distributed but jointly normally distributed. So here too, independence is equivalent to uncorrelatedness, i.e. zero covariance. So all you have to do is calculate

$$\operatorname{Cov}(Y_1,Y_2) = E(Y_1Y_2) - E(Y_1)E(Y_2)$$

and see whether it equals zero (normally, it won't).
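Expanding by bilinearity with the given $\Sigma$ (the $\operatorname{Cov}(X_1,X_2)$ terms cancel),

$$\operatorname{Cov}(Y_1,Y_2) = \operatorname{Var}(X_1) - \operatorname{Var}(X_2) - \operatorname{Cov}(X_1,X_3) + \operatorname{Cov}(X_2,X_3) = 4 - 5 - 0 + 0 = -1 \neq 0$$

so $Y_1$ and $Y_2$ are indeed not independent. Equivalently, writing $Y = (Y_1, Y_2)'$ as the linear transformation

$$Y = AX, \qquad A = \left[\begin{matrix} 1 & -1 & 0 \\ 1 & 1 & -1 \end{matrix}\right]$$

the full covariance matrix of $Y$ is $A\Sigma A'$, whose off-diagonal entry is this same covariance.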
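If you want to verify the arithmetic numerically, here is a minimal numpy sketch (the variable names are mine, not from the question):

```python
import numpy as np

# Covariance matrix of X = (X_1, X_2, X_3) from the question
Sigma = np.array([[ 4., -1.,  0.],
                  [-1.,  5.,  0.],
                  [ 0.,  0.,  2.]])

# Rows encode the linear combinations Y_1 = X_1 - X_2 and Y_2 = X_1 + X_2 - X_3
A = np.array([[1., -1.,  0.],
              [1.,  1., -1.]])

cov_Y = A @ Sigma @ A.T
print(cov_Y)        # [[11. -1.], [-1.  9.]]
print(cov_Y[0, 1])  # -1.0: nonzero, so Y_1 and Y_2 are not independent
```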
