[Math] Eigenvalues from Symmetric Matrix

Tags: diagonalization, eigenvalues-eigenvectors, matrices, real-numbers

I found this problem on a past final exam.

Consider the real matrix

$
A =
\left[ {\begin{array}{ccc}
1 & 0 & 1 \\
0 & 1 & 0 \\
1 & 0 & 1 \\
\end{array} } \right]
$

(a) Explain what property the matrix $A$ has that assures you can diagonalize it without the help of complex matrices.

(b) Find a real matrix $S$ and a diagonal matrix $D$ such that

$A=SDS^T$

For part (a), I thought the answer is that the matrix is symmetric, but I am not sure how that helps me find the eigenvalues.

For part (b), how do I get the eigenvalues of $A$?

Best Answer

Recall that the columns of a transformation matrix are the images of the basis and that when you right-multiply a matrix by a vector, the result is a linear combination of the columns of the matrix with coefficients given by the components of the vector.

The second column of $A$ is $(0,1,0)^T$, so that standard basis vector is mapped to itself: it is an eigenvector with eigenvalue $1$. The sum of the first and third columns is $(2,0,2)^T=2(1,0,1)^T$, so $(1,0,1)^T$ is an eigenvector with eigenvalue $2$.

Since the sum of the eigenvalues equals the trace, you get the third eigenvalue for free: it is $1+1+1-1-2=0$. In fact, we already knew that $0$ is an eigenvalue because the matrix has two identical columns and therefore has a nontrivial null space. You can either compute a basis for this null space to find an eigenvector with eigenvalue $0$, or notice that because the first and third columns are identical, their difference, i.e., the product of the matrix with $(1,0,-1)^T$, is $0$.
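If you want to sanity-check these three eigenpairs numerically, a quick NumPy sketch (the variable names are mine, not part of the problem) confirms each one:

```python
import numpy as np

# The matrix from the problem.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0]])

# The three eigenvectors found by inspection above.
v1 = np.array([0.0, 1.0, 0.0])   # eigenvalue 1: second standard basis vector
v2 = np.array([1.0, 0.0, 1.0])   # eigenvalue 2: sum of first and third columns
v0 = np.array([1.0, 0.0, -1.0])  # eigenvalue 0: difference of identical columns

print(A @ v1)  # equals 1 * v1
print(A @ v2)  # equals 2 * v2
print(A @ v0)  # equals 0 * v0, i.e. the zero vector
```

Each product $Av$ comes out as the claimed scalar multiple of $v$, so no characteristic polynomial is needed here.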

Since the problem wants an orthogonal diagonalization ($SDS^T$ instead of $SDS^{-1}$), you'll need to orthonormalize the eigenvectors above. They are already mutually orthogonal (eigenvectors of a real symmetric matrix corresponding to distinct eigenvalues always are), so all you need to do is normalize them.
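Putting it together, a short sketch of part (b): normalize the eigenvectors, use them as columns of $S$ (with eigenvalues in matching order on the diagonal of $D$), and verify $A = SDS^T$ holds:

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0]])

# Normalize the mutually orthogonal eigenvectors; only v2 and v0 need scaling.
s1 = np.array([0.0, 1.0, 0.0])                # eigenvalue 1, already unit length
s2 = np.array([1.0, 0.0, 1.0]) / np.sqrt(2)  # eigenvalue 2
s0 = np.array([1.0, 0.0, -1.0]) / np.sqrt(2) # eigenvalue 0

# Columns of S must match the order of eigenvalues on the diagonal of D.
S = np.column_stack([s1, s2, s0])
D = np.diag([1.0, 2.0, 0.0])

print(np.allclose(S @ D @ S.T, A))      # A = S D S^T
print(np.allclose(S.T @ S, np.eye(3)))  # S is orthogonal, so S^T = S^{-1}
```

The second check is what makes $S^T$ a valid stand-in for $S^{-1}$ in the diagonalization.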