[Math] Transforming a matrix to a positive-definite matrix

linear algebra, matrices, positive definite

While reading this article, I came across the following:

$[C]$ is a symmetric matrix that is not positive definite. However, a positive definite matrix $[C']$ can be obtained by the equation:

$$[C'] = [\phi]\cdot[\lambda ']\cdot[\phi]^{T}$$

$[\phi]$ is the square matrix whose columns are the eigenvectors of $[C]$. $[\lambda ']$ is the diagonal matrix containing the eigenvalues of $[C]$, but with all negative eigenvalues set to zero.

The result is the matrix $[C']$, which is positive definite.

Why does this work?

Best Answer

In the following, I will use standard notation.

Let's say the matrix in question is $A$, with eigenvectors $v_1, v_2, \dots, v_n$ and corresponding eigenvalues $\lambda_1, \lambda_2, \dots, \lambda_n$. Then, by the definition of eigenvalues and eigenvectors, we can write for each $i \in \{1, \dots, n\}$:

$$ A\cdot v_i = \lambda_i\cdot v_i $$

(see https://en.wikipedia.org/wiki/Eigenvalue#Overview).

If we collect the $v_i$ as the columns of a matrix $V$, we can write this for all $i$ at once:

$$ A\cdot V = V\cdot diag(\lambda) $$

where $diag(\lambda)$ is the diagonal matrix, where all the $\lambda_i$ are on the diagonal.
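
For concreteness, here is a minimal NumPy sketch of this relation. The symmetric matrix $A$ below is an arbitrary example of my own choosing (it is not from the article), picked so that it has a negative eigenvalue:

```python
import numpy as np

# Arbitrary symmetric example matrix; it is indefinite (one eigenvalue is negative).
A = np.array([[ 2.0, -1.0,  3.0],
              [-1.0,  1.0,  0.0],
              [ 3.0,  0.0, -2.0]])

# eigh is intended for symmetric matrices: lam holds the eigenvalues,
# and the columns of V are the corresponding eigenvectors v_i.
lam, V = np.linalg.eigh(A)

print(np.allclose(A @ V, V @ np.diag(lam)))   # True: A·V = V·diag(λ)
```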

Because $A$ is symmetric, its eigenvectors can be chosen orthonormal, so $V$ is an orthogonal matrix: its columns (and rows) are mutually orthogonal unit vectors. Hence (with $I$ being the identity matrix)

\begin{align} V^{T}\cdot V &= I\\ V^{T} &= V^{-1} \end{align}

So we can write

\begin{align} A\cdot V &= V\cdot diag(\lambda)\\ A &= V\cdot diag(\lambda)\cdot V^{-1} = V\cdot diag(\lambda)\cdot V^{T} \end{align}
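
Both facts can be checked numerically with the same assumed example matrix as in the sketch above:

```python
import numpy as np

A = np.array([[ 2.0, -1.0,  3.0],
              [-1.0,  1.0,  0.0],
              [ 3.0,  0.0, -2.0]])
lam, V = np.linalg.eigh(A)

print(np.allclose(V.T @ V, np.eye(3)))         # True: V^T·V = I, so V^T = V^-1
print(np.allclose(V @ np.diag(lam) @ V.T, A))  # True: A = V·diag(λ)·V^T
```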

See also https://en.wikipedia.org/wiki/Symmetric_matrix#Decomposition, in particular the point that a real symmetric matrix can be diagonalized by an orthogonal matrix.

Here the authors modify $diag(\lambda)$ by setting every negative eigenvalue to zero and rebuild the whole thing as $A' = V\cdot diag(\lambda ')\cdot V^{T}$. Since all entries of $diag(\lambda ')$ are non-negative, we have $x^{T}A'x = \sum_i \lambda_i' (v_i^{T}x)^2 \ge 0$ for every vector $x$, so $A'$ is positive semi-definite (and positive definite as long as none of the remaining eigenvalues is zero).
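
A minimal sketch of the modification itself, again in NumPy with the same assumed example matrix: the negative eigenvalue is set to zero, the matrix is rebuilt, and all eigenvalues of the result are non-negative.

```python
import numpy as np

A = np.array([[ 2.0, -1.0,  3.0],
              [-1.0,  1.0,  0.0],
              [ 3.0,  0.0, -2.0]])

lam, V = np.linalg.eigh(A)             # A is symmetric but indefinite
print(lam)                             # one eigenvalue is negative

lam_clipped = np.clip(lam, 0.0, None)  # set negative eigenvalues to zero
A_prime = V @ np.diag(lam_clipped) @ V.T

print(np.linalg.eigvalsh(A_prime))      # all eigenvalues >= 0: positive semi-definite
print(np.allclose(A_prime, A_prime.T))  # A' is still symmetric
```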

An alternative approach is given at http://animalbiosciences.uoguelph.ca/~lrs/ELARES/PDforce.pdf
