Linear Algebra – How to Decompose a Real Symmetric Matrix

linear algebra

Prove, without using induction, that a real symmetric matrix $A$ can be decomposed as $A = Q^T \Lambda Q$, where $Q$ is an orthogonal matrix and $\Lambda$ is a diagonal matrix whose diagonal entries are the eigenvalues of $A$.

I can see that all eigenvalues of $A$ are real and that eigenvectors corresponding to distinct eigenvalues are orthogonal, but I fail to see why, when all the (linearly independent) eigenvectors are put together, they form a basis of $\mathbb{R}^n$.
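For concreteness, the claimed decomposition can be checked numerically (this is of course an illustration, not a proof); `numpy.linalg.eigh` returns the eigenvalues and an orthonormal set of eigenvectors of a symmetric matrix:

```python
import numpy as np

# Build a random real symmetric matrix.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2  # symmetrize

# eigh gives A = v @ diag(w) @ v.T with v orthogonal.
w, v = np.linalg.eigh(A)
Lam, Q = np.diag(w), v.T  # in the question's notation: A = Q.T @ Lam @ Q

print(np.allclose(Q.T @ Lam @ Q, A))    # True: the decomposition holds
print(np.allclose(Q @ Q.T, np.eye(4)))  # True: Q is orthogonal
```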

Edit

The reason I asked this question is to show that a real symmetric matrix is diagonalizable, so let's not use that fact for a while. Other than that, any undergraduate level linear algebra can be used.

Edit 2

After reading Algebraic Pavel's answer, I feel like ruling out the Schur decomposition as well, but I can't keep ruling out theorems, so… if a proof is too obvious, it is probably not what I am looking for, though it may be a technically correct answer.

Thanks.

Best Answer

Provided that the Schur decomposition is an allowed tool:

Using the Schur decomposition, there exist an orthogonal $Q$ and an upper triangular $R$ such that $A = QRQ^T$. Since $A$ is symmetric, $R = Q^TAQ$ is symmetric as well, and a symmetric triangular matrix is necessarily diagonal.
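This argument can be observed numerically: `scipy.linalg.schur` applied to a symmetric matrix produces a triangular factor that is in fact diagonal (up to rounding).

```python
import numpy as np
from scipy.linalg import schur

# Random real symmetric matrix.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2

# Real Schur form: A = Q @ T @ Q.T with Q orthogonal, T upper (quasi-)triangular.
T, Q = schur(A)

# For symmetric A, the triangular factor T is diagonal.
off_diag = T - np.diag(np.diag(T))
print(np.allclose(off_diag, 0))       # True
print(np.allclose(Q @ T @ Q.T, A))    # True: A is reconstructed
```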


There is also a neat theory behind tridiagonal matrices, which can help:

It is easy to show that for any real $A$ there is an orthogonal matrix $Q$ such that $Q^TAQ = H$, where $H$ is upper Hessenberg. If $A$ is symmetric, it follows that $H$ is symmetric as well and hence tridiagonal.

Now, if the tridiagonal matrix $H$ is unreduced (none of its sub- and superdiagonal entries are zero), then the eigenvalues of $H$ (and therefore of $A$) are distinct. Equivalently, if $A$ has repeated eigenvalues, then $H$ is reduced: some symmetrically placed sub- and superdiagonal entries are zero, so $H$ is a block diagonal matrix with unreduced tridiagonal blocks. Consequently, the copies of a repeated eigenvalue must lie in different unreduced diagonal blocks of $H$. In each of these blocks we can find at least one eigenvector (in fact, exactly one), and it is almost trivial to show that eigenvectors taken from different blocks are linearly independent (and can therefore be orthogonalized).
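The first step of this argument is also easy to see numerically: `scipy.linalg.hessenberg` reduces a matrix to upper Hessenberg form by orthogonal similarity, and for a symmetric input the result is tridiagonal.

```python
import numpy as np
from scipy.linalg import hessenberg

# Random real symmetric matrix.
rng = np.random.default_rng(1)
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2

# Hessenberg reduction: A = Q @ H @ Q.T with Q orthogonal, H upper Hessenberg.
H, Q = hessenberg(A, calc_q=True)

# For symmetric A, H is symmetric Hessenberg, i.e. tridiagonal:
# all entries with |i - j| > 1 vanish (up to rounding).
idx = np.arange(5)
mask = np.abs(np.subtract.outer(idx, idx)) > 1
print(np.allclose(H[mask], 0))        # True
print(np.allclose(Q @ H @ Q.T, A))    # True: A is reconstructed
```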