Say I have $X \sim N(\mu_1, \sigma_1^2)$ and $Y \sim N(\mu_2, \sigma_2^2)$, where $X$ and $Y$ are independent. Is the joint distribution of $X$ and $Y$ then multivariate normal? I.e.,
$$\begin{bmatrix} X \\ Y\end{bmatrix} \sim N\left(\begin{bmatrix} \mu_1 \\ \mu_2 \end{bmatrix}, \begin{bmatrix} \sigma_1^2 & 0 \\ 0 & \sigma_2^2 \end{bmatrix} \right) $$
If so, why?
Best Answer
One characterization of multivariate normality, often taken as the definition, is that the tuple $(X_1,\ldots,X_n)$ has a multivariate normal distribution if for every tuple $(c_1,\ldots,c_n)$ of constants (i.e. non-random scalars), the linear combination $c_1 X_1+\cdots+c_nX_n$ has a univariate normal distribution (with the usual convention that a constant counts as a degenerate normal).

Apply this with $n=2$. If $X\sim N(\mu_1,\sigma_1^2)$ and $a$ is a constant, then $aX\sim N(a\mu_1,a^2\sigma_1^2)$; likewise, if $b$ is a constant, then $bY\sim N(b\mu_2,b^2\sigma_2^2)$. So $aX+bY$ is the sum of two independent normally distributed random variables. Its expected value $a\mu_1+b\mu_2$ can be found without knowing that they are independent; its variance $a^2\sigma_1^2+b^2\sigma_2^2$ can be found if you know only that they are uncorrelated. But the fact that the sum is normally distributed,
$$aX+bY \sim N\left(a\mu_1+b\mu_2,\; a^2\sigma_1^2+b^2\sigma_2^2\right),$$
relies on the assumption that they are independent.

Since this holds for every choice of constants $a$ and $b$, the vector $(X,Y)$ is multivariate normal, with the mean vector and diagonal covariance matrix displayed in the question.
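As an informal sanity check of the claim above, here is a small simulation sketch (all parameter values are illustrative, not from the question): it draws independent $X$ and $Y$, forms $aX+bY$, and compares the sample mean and variance against the theoretical values $a\mu_1+b\mu_2$ and $a^2\sigma_1^2+b^2\sigma_2^2$.

```python
import random
import statistics

# Illustrative parameters (any values would do)
mu1, sigma1 = 2.0, 1.5
mu2, sigma2 = -1.0, 0.5
a, b = 3.0, -2.0

random.seed(0)
n = 200_000

# Independent draws of X and Y, then the linear combination aX + bY
samples = [a * random.gauss(mu1, sigma1) + b * random.gauss(mu2, sigma2)
           for _ in range(n)]

mean_hat = statistics.fmean(samples)
var_hat = statistics.pvariance(samples)

# Theory: E[aX + bY] = a*mu1 + b*mu2, Var(aX + bY) = a^2*sigma1^2 + b^2*sigma2^2
mean_theory = a * mu1 + b * mu2
var_theory = a**2 * sigma1**2 + b**2 * sigma2**2

print(mean_theory, var_theory)
print(round(mean_hat, 2), round(var_hat, 2))
```

Checking the sample histogram against the corresponding normal density (e.g. with a QQ-plot) would likewise show the normality of the sum, though the moments alone already illustrate the mean and variance formulas.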