[Math] Help with proving that vectors are conjugate.

Tags: eigenvalues-eigenvectors, linear-algebra

I've been looking through some of my lecture notes on optimization, and something confused me about conjugate gradient methods.

In the following definition we assume that the $n\times n$ matrix $A$ is positive definite.

Definition: Nonzero vectors $d_i \in \mathbb{R}^n$, $i=0,1,\dots,m$, are called conjugate if $d_i^TAd_j=0$ for all $i\neq j$.

Properties:

  1. $d_i^TAd_i>0$;

  2. Conjugate directions $d_0,…,d_m$ are linearly independent;

  3. If $d_0,d_1,…,d_{n-1}$ is a complete set of eigenvectors of $A$, then they are conjugate. (Exercise: prove it)
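This isn't in the notes, but a quick numerical sanity check of properties 1 and 3 may help build intuition. The sketch below (assuming NumPy; the construction of $A$ here is just one convenient way to get a symmetric positive definite test matrix) computes an orthonormal eigenbasis and verifies that the off-diagonal products $d_i^TAd_j$ vanish:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# One way to build a symmetric positive definite test matrix.
B = rng.standard_normal((n, n))
A = B.T @ B + n * np.eye(n)

# For a symmetric matrix, eigh returns real eigenvalues and an
# orthonormal set of eigenvectors (the columns of V).
eigvals, V = np.linalg.eigh(A)

# G[i, j] = d_i^T A d_j; conjugacy means the off-diagonal entries vanish.
G = V.T @ A @ V
off_diag = G - np.diag(np.diag(G))
print("max |d_i^T A d_j|, i != j:", np.abs(off_diag).max())  # ~ 1e-14
print("all d_i^T A d_i > 0:", bool((np.diag(G) > 0).all()))  # property 1
```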

What I don't really understand is how to prove property 3. Say we have a complete set of eigenvectors, and we take two distinct eigenvectors from it, $d_1$ and $d_2$, corresponding to eigenvalues $\lambda_1$ and $\lambda_2$, respectively. Since $A$ is positive definite, $\lambda_2 \neq 0$. Then we get:
$$d_1^TAd_2=d_1^T(\lambda_2 d_2)= \lambda_2(d_1^Td_2)$$

Now, according to the definition, we should get $d_1^Td_2=0$ if they're conjugate. But that would mean $d_1$ and $d_2$ are orthogonal, if I'm not mistaken. How can I be certain that they are? The matrix is only assumed positive definite, with no mention of it being symmetric or Hermitian, so how can I prove that there exists an orthonormal basis of eigenvectors of $A$, which I suppose is the only way to prove it?

Best Answer

Two eigenvectors belonging to two different eigenvalues of a symmetric matrix are always orthogonal. (In the conjugate gradient setting, "positive definite" conventionally means symmetric positive definite, so the symmetry you are missing is part of the standing assumption.) This is because

$$\lambda_1 v_1^T v_2 = (\lambda_1 v_1)^T v_2 = (Av_1)^T v_2 = v_1^T A^T v_2 = v_1^T A v_2 = v_1^T (Av_2) = v_1^T(\lambda_2 v_2) = \lambda_2 v_1^T v_2$$

Thus, writing $x = v_1^T v_2$, you have an equality $\lambda_1 x = \lambda_2 x$, which is only possible if either $\lambda_1 = \lambda_2$ or $x=0$.
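To finish the exercise, orthogonality gives conjugacy directly: for $i \neq j$ with $\lambda_i \neq \lambda_j$,

$$d_i^TAd_j = d_i^T(\lambda_j d_j) = \lambda_j\,(d_i^Td_j) = 0.$$

For a repeated eigenvalue, eigenvectors within the same eigenspace are not automatically orthogonal, but they can always be chosen orthogonal (e.g. by applying Gram–Schmidt within each eigenspace), so a complete set of eigenvectors can always be taken to be conjugate.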
