[Math] Cholesky Decomposition and Orthogonalization

eigenvalues-eigenvectors, linear-algebra, matrices

I recently came across a methodology for orthogonalizing collinear variables that uses the Cholesky decomposition, but I am not entirely grasping the intuition behind it.

Let's assume we have three variables/factors: $X_1$, $X_2$, and $X_3$.
Let $V$ be a full-rank matrix whose columns are $X_1$, $X_2$, and $X_3$. We wish to orthogonalize the columns of $V$.

Here is the methodology:

Since $V$ has full column rank, $V'V$ is Hermitian positive definite, so by the Cholesky decomposition we can write:

$V'V = LL'$

The lower triangular matrix $L$, having strictly positive diagonal entries, is invertible. Then the columns of the matrix

$U = V(L^{-1})'$

are orthonormal.
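For concreteness, here is a minimal NumPy sketch of the construction (the random data and variable names are my own illustration, not from any particular source): it builds a full-rank $V$, takes the Cholesky factor of $V'V$, and checks that $U = V(L^{-1})'$ has orthonormal columns.

```python
import numpy as np

rng = np.random.default_rng(0)
V = rng.standard_normal((100, 3))   # 100 observations of X1, X2, X3; full column rank almost surely

L = np.linalg.cholesky(V.T @ V)     # V'V = L L', with L lower triangular

# U = V (L^{-1})'; solve L X = V' for X = L^{-1} V' rather than inverting L explicitly
U = np.linalg.solve(L, V.T).T

print(np.allclose(U.T @ U, np.eye(3)))  # True: the columns of U are orthonormal
```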

Can someone please explain why this last result holds?

Thanks a lot!

Best Answer

What you described is a very indirect way of computing the QR decomposition (i.e., the Gram-Schmidt process). Write the QR decomposition as $V = QR$, where $R$ is upper triangular with positive diagonal. Then $V'V = R'Q'QR = R'R$, so by the uniqueness of the Cholesky factorization $R' = L$, and hence $R^{-1} = (L')^{-1} = (L^{-1})'$. Therefore, to recover the orthonormal $Q$, we can do $Q = VR^{-1} = V(L^{-1})'$, which is your last result. You can also verify it directly: $U'U = L^{-1}V'V(L^{-1})' = L^{-1}LL'(L^{-1})' = I$. In practice you're better off computing $V = QR$ directly.
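As a quick numerical check of this equivalence (a sketch on hypothetical random data): up to column signs, the $Q$ returned by np.linalg.qr coincides with $V(L^{-1})'$. The sign fix is needed because NumPy does not force the diagonal of $R$ to be positive, whereas the Cholesky factor always has a positive diagonal.

```python
import numpy as np

rng = np.random.default_rng(1)
V = rng.standard_normal((100, 3))

# Cholesky route: U = V (L^{-1})'
L = np.linalg.cholesky(V.T @ V)
U = np.linalg.solve(L, V.T).T

# Direct route: V = Q R
Q, R = np.linalg.qr(V)

# Flip each column of Q whose corresponding diagonal entry of R is negative
signs = np.sign(np.diag(R))
print(np.allclose(Q * signs, U))  # True: both routes yield the same orthonormal basis
```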