Algorithm to find orthonormal eigenvectors of a symmetric matrix

linear algebra · numerical linear algebra · numerical methods

I have a symmetric matrix $S$ and I'm trying to implement the following algorithm to find its first $k$ orthonormal eigenvectors:

[Image: the algorithm's pseudocode]
Note: The picture is from http://www.wisdom.weizmann.ac.il/~harel/papers/highdimensionalGD.pdf
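The pseudocode image isn't reproduced here; as I understand it, the scheme is power iteration with a Gram–Schmidt step against the already-found eigenvectors. A minimal JavaScript sketch of my understanding (the helper names, starting vectors, `eps` and `maxIter` are my own choices, not from the paper):

```javascript
// Power iteration with Gram-Schmidt deflation (sketch; names are mine).
function dot(a, b) { return a.reduce((s, x, i) => s + x * b[i], 0); }
function scale(a, c) { return a.map(x => x * c); }
function sub(a, b) { return a.map((x, i) => x - b[i]); }
function matVec(S, v) { return S.map(row => dot(row, v)); }

function topEigenvectors(S, k, eps = 1e-9, maxIter = 1000) {
  const n = S.length;
  const vectors = [];
  for (let i = 0; i < k; i++) {
    // Deterministic starting vector, then normalize.
    let u = Array.from({ length: n }, (_, j) => (j === i ? 1 : 0.1));
    u = scale(u, 1 / Math.sqrt(dot(u, u)));
    for (let iter = 0; iter < maxIter; iter++) {
      let uHat = matVec(S, u);
      // Gram-Schmidt: project out the previously found eigenvectors.
      for (const v of vectors) uHat = sub(uHat, scale(v, dot(uHat, v)));
      uHat = scale(uHat, 1 / Math.sqrt(dot(uHat, uHat)));
      // Naive convergence test -- this is the step that never fires
      // when the corresponding eigenvalue is negative.
      const done = dot(u, uHat) > 1 - eps;
      u = uHat;
      if (done) break;
    }
    vectors.push(u);
  }
  return vectors;
}
```

The `maxIter` cap is only there so the sketch terminates; my actual code loops until the convergence test fires, which is where it gets stuck.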

I use a very simple $2×2$ matrix for tests:

$$
S =
\begin{pmatrix}
1 & 2 \\
2 & 3
\end{pmatrix}
$$

The code finds the first eigenvector without problems, but it gets stuck on the second eigenvector.

The Gram–Schmidt step makes the second candidate orthogonal to the previous eigenvector, but the matrix multiplication then "flips" the candidate around, and the two steps fight in a never-ending loop.

Here the gray line is the first eigenvector, and the thick red line is the next candidate $\hat{u}_i$:

[Image: first eigenvector (gray) and candidate $\hat{u}_i$ (red)]

I spent one night debugging it and can't spot anything obviously wrong. It must be something trivial, but I don't understand what. Can you please help me? What am I missing?

https://jsbin.com/zufejir/5/edit?js,output – the code; each click advances the algorithm to the next state.

Best Answer

Eigenvalues can be negative; for a real symmetric matrix they are always real, but in general they can even be complex.

eig([1,2;2,3]) in Octave gives $[-0.23..., 4.23...]$, so the smallest eigenvalue is indeed negative.

So what happens is that $Su$ points in the opposite direction to $u$; after normalization the candidate is flipped but has the same length, so the convergence test never fires. You can work around this by making the stop condition sign-insensitive, for example $(u_i^T \hat u_i)^2 < (1-\epsilon)^2$.

A better condition is probably to check $$\operatorname{var}(\hat u_i / u_i) < \epsilon$$ with element-wise division, since each point-wise quotient estimates (a multiple of) the eigenvalue $\lambda_i$, which can be negative or, for non-symmetric matrices, even complex.
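Both checks can be dropped into the questioner's JavaScript loop. A hedged sketch (function names are mine; `u` is the previous iterate, `uHat` the new normalized candidate):

```javascript
// Sign-insensitive convergence checks (sketch; names are mine).

// Fix 1: square the dot product so that u and -u compare as equal.
function convergedDot(u, uHat, eps) {
  const d = u.reduce((s, x, i) => s + x * uHat[i], 0);
  return d * d > (1 - eps) * (1 - eps);
}

// Fix 2: at convergence the element-wise ratio uHat[j] / u[j] is
// (nearly) constant -- +1 or -1 for normalized iterates -- so its
// variance tends to 0 regardless of the sign of the eigenvalue.
function convergedVar(u, uHat, eps) {
  const r = u.map((x, i) => uHat[i] / x);
  const mean = r.reduce((s, x) => s + x, 0) / r.length;
  const variance = r.reduce((s, x) => s + (x - mean) ** 2, 0) / r.length;
  return variance < eps;
}
```

The variance check assumes no entry of `u` is (numerically) zero; guarding those entries is left out of the sketch.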