[Math] Scaling of a multivariate normal

normal-distribution, probability

We know that if a variable $X$ is distributed as $N(\mu,\sigma^2)$,
then $X+b$ is distributed as $N(\mu+b,\sigma^2)$.

If we also scale $X$ by a factor $k$, the distribution of $kX+b$ is $N(k\mu+b,\,k^2\sigma^2)$.
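As a sanity check, a short numpy simulation (just an illustrative sketch, with arbitrary example values) agrees with these formulas:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.5, 2.0      # X ~ N(mu, sigma^2)
k, b = 3.0, -4.0          # transform Y = k*X + b

x = rng.normal(mu, sigma, size=1_000_000)
y = k * x + b

print("empirical mean / var :", y.mean(), y.var())
print("theoretical          :", k * mu + b, k**2 * sigma**2)
```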

Does the same principle apply to multivariate normal distributions?

What happens if the scaling factor is a matrix?

It's OK if you don't give a full answer; a pointer to a good reference would also be appreciated.

Thanks

Best Answer

Almost.

Suppose the variance of $X\in\mathbb R^{n\times 1}$ is the $n\times n$ matrix $\Sigma$.

Suppose $A$ is a $k\times n$ matrix.

Then the variance of $AX$ must obviously be a $k\times k$ matrix.

It is $A\Sigma A^T$.

The mean of $AX$ is $A\mu$ (and $A\mu + b$ if a shift $b$ is added), and the distribution is still multivariate normal.
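To illustrate, here is a quick numpy check (a sketch with made-up values, not part of the original answer): draw samples from a bivariate normal, apply a $3\times 2$ matrix $A$, and compare the empirical mean and covariance of $AX$ with $A\mu$ and $A\Sigma A^T$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Original multivariate normal: mean mu (n = 2), covariance Sigma
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])

# Linear map A: k x n with k = 3, n = 2
A = np.array([[1.0, 0.0],
              [2.0, -1.0],
              [0.5, 0.5]])

# Draw samples of X and transform them
X = rng.multivariate_normal(mu, Sigma, size=200_000)   # shape (N, 2)
Y = X @ A.T                                             # samples of AX, shape (N, 3)

# Theoretical mean and covariance of AX
mean_theory = A @ mu
cov_theory = A @ Sigma @ A.T

print("empirical mean :", Y.mean(axis=0))
print("theoretical    :", mean_theory)
print("empirical cov  :\n", np.cov(Y, rowvar=False))
print("theoretical    :\n", cov_theory)
```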