[Math] Correlation matrix from Covariance matrix

linear algebra, statistics

This is for a project for which I've been trying to find information on covariance and correlation matrices.

I understand that for an $n \times n$ data matrix $A$ whose rows have had their means subtracted, $\frac{1}{n-1}AA^T$ will give me the covariance matrix.
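As a quick sketch of that construction in NumPy (the data below is made up for illustration): centering the rows and forming $\frac{1}{n-1}AA^T$ reproduces what `np.cov` computes.

```python
import numpy as np

# Illustrative data: 3 variables (rows) x 200 observations (columns).
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 200))

# Subtract each row's mean, then form (1/(n-1)) A A^T.
A_centered = A - A.mean(axis=1, keepdims=True)
cov = A_centered @ A_centered.T / (A.shape[1] - 1)

# np.cov treats rows as variables by default and does the same thing.
assert np.allclose(cov, np.cov(A))
```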

Is there any relationship between the covariance and correlation matrix?

Sorry, maybe I wasn't clear.

I wanted to use the Cholesky decomposition to generate correlated variables from independent random variables. I know how to do it in MATLAB, and I understand how it works for 2 variables. But when I scale the matrix up from $2 \times 2$ to $n \times n$, I am not sure how it works out.
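A minimal sketch of the Cholesky construction in NumPy (the $3 \times 3$ correlation matrix below is a made-up example): factor the target matrix as $\varrho = LL^T$ and multiply independent standard normals by $L$.

```python
import numpy as np

# Hypothetical target correlation matrix (illustrative values).
rho = np.array([[1.0, 0.6, 0.3],
                [0.6, 1.0, 0.5],
                [0.3, 0.5, 1.0]])

L = np.linalg.cholesky(rho)            # lower triangular, rho = L @ L.T

rng = np.random.default_rng(0)
z = rng.standard_normal((3, 100_000))  # rows: independent standard normals
x = L @ z                              # rows of x have correlation ~ rho
```

Nothing in this construction is specific to the $2 \times 2$ case: the same two lines work for any $n \times n$ positive-definite correlation (or covariance) matrix.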

I would appreciate it if someone could provide more hints on the mathematics.

Best Answer

From a matrix algebra point of view, the answer is fairly simple. Assume your covariance matrix is $\Sigma$ and let

$$ D =\sqrt{ \text{diag}\left( {\Sigma} \right)} $$

then the correlation matrix is given by $$ \varrho = D^{-1}\Sigma D^{-1} $$

Edit: fixed to include square root
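A sketch of this formula in NumPy (the covariance matrix $\Sigma$ below is made up for illustration); dividing by the outer product of the standard deviations is the same as forming $D^{-1}\Sigma D^{-1}$.

```python
import numpy as np

# Made-up covariance matrix for illustration.
Sigma = np.array([[4.0, 2.4, 0.5],
                  [2.4, 9.0, 1.2],
                  [0.5, 1.2, 1.0]])

d = np.sqrt(np.diag(Sigma))      # standard deviations, so D = diag(d)
corr = Sigma / np.outer(d, d)    # elementwise form of D^{-1} @ Sigma @ D^{-1}
```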
