[Math] Are positive semi-definite matrices always covariance matrices

linear-algebra, matrices, probability

This may be trivial.

While covariance matrices of random vectors are positive semi-definite, does the converse also hold: is every positive semi-definite matrix a valid covariance matrix of some random vector?

Wikipedia says this is the case; however, I don't follow the argument:

… the covariance matrix of a multivariate probability distribution is always positive semi-definite. (…) Conversely, every positive semi-definite matrix is the covariance matrix of some multivariate distribution.

Grateful for any explanations.

Best Answer

If $X$ is a random vector of dimension $N$ and $A$ is an $N\times N$ matrix, then $Y=AX$ has covariance matrix $\mbox{cov}(Y)$ related to the covariance matrix $\mbox{cov}(X)$ of $X$ by $$ \mbox{cov}(Y)=A\;\mbox{cov}(X)\,A^{T}. $$ So if you start with $X$ having independent, unit-variance components, so that $\mbox{cov}(X)=I$, then $$ \mbox{cov}(Y)=AA^{T}. $$ Then, by arguing that any positive semidefinite matrix $M$ can be written as $AA^{T}$, you end up with a random vector $Y$ whose covariance matrix is $M$. In fact, you can write $M=A^{2}$ with $A=A^{T}$ positive semidefinite, which isn't too hard to show by choosing an orthonormal basis of eigenvectors for $M$ (one form of the spectral theorem) and taking the square roots of the eigenvalues.
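
Here is a minimal numerical sketch of this construction in NumPy. The specific matrix $M$, the seed, and the sample size are illustrative assumptions, not part of the answer: it builds a positive semidefinite $M$, takes its symmetric square root $A$ via the spectral theorem, and checks that $Y = AX$ has covariance (approximately) $M$.

```python
import numpy as np

# Example positive semi-definite matrix M (built as B B^T, so it is PSD by construction).
rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3))
M = B @ B.T

# Symmetric square root A of M via the spectral theorem: M = Q diag(w) Q^T,
# so A = Q diag(sqrt(w)) Q^T satisfies A = A^T and A A^T = A^2 = M.
# np.clip guards against tiny negative eigenvalues from round-off.
w, Q = np.linalg.eigh(M)
A = Q @ np.diag(np.sqrt(np.clip(w, 0, None))) @ Q.T

# Y = A X with X having independent standard-normal components (cov(X) = I),
# hence cov(Y) = A cov(X) A^T = A A^T = M.
X = rng.standard_normal((3, 100_000))
Y = A @ X

print(np.allclose(A @ A.T, M))   # exact: A A^T recovers M
print(np.round(np.cov(Y), 2))    # empirical covariance of the samples, approximately M
print(np.round(M, 2))
```

If $M$ is strictly positive definite, a Cholesky factorization $M = LL^{T}$ (np.linalg.cholesky) gives an alternative factor $A = L$; the eigendecomposition route above also covers the singular case.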