Eigenvalue decomposition: why are the eigenvectors columns of $Q$, and not some other vectors

Tags: eigenvalues-eigenvectors, linear-algebra, matrices, matrix-decomposition, svd

Taken from Wikipedia, https://en.wikipedia.org/wiki/Eigendecomposition_of_a_matrix

Let $A$ be a square $n × n$ matrix with $n$ linearly independent eigenvectors $q_i$ (where $i = 1, …, n$). Then $A$ can be factorized as

$$A = Q \Lambda Q^{-1}$$
where $Q$ is the square $n \times n$ matrix whose $i$th column is the eigenvector $q_i$ of $A$, and $\Lambda$ is the diagonal matrix whose diagonal entries are the corresponding eigenvalues, $\Lambda_{ii} = \lambda_i$.
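As a quick sanity check, the factorization can be verified numerically. The sketch below (matrix chosen purely for illustration) uses NumPy's `np.linalg.eig`, which returns the eigenvalues and a matrix whose $i$th column is the eigenvector $q_i$:

```python
import numpy as np

# An illustrative 2x2 matrix with distinct eigenvalues (3 and 1).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eig returns (eigenvalues, Q), where the i-th column of Q is the
# eigenvector associated with the i-th eigenvalue.
eigvals, Q = np.linalg.eig(A)
Lam = np.diag(eigvals)

# Reconstruct A from the factorization A = Q Lam Q^{-1}.
A_reconstructed = Q @ Lam @ np.linalg.inv(Q)
print(np.allclose(A, A_reconstructed))  # True
```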


But why is $Q$ made up of the eigenvectors?

Recall that a general theorem (the spectral theorem) says: given any real symmetric matrix $A$, there exists some orthogonal matrix $Q$ such that $A = Q \Lambda Q^{-1}$. Emphasis on exists.
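For concreteness, this existence claim can be checked numerically on a random symmetric matrix (a sketch; the matrix here is arbitrary). NumPy's `np.linalg.eigh` is specialized for symmetric matrices and returns an orthogonal $Q$:

```python
import numpy as np

# Build a random real symmetric matrix (illustrative).
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2

# eigh handles symmetric matrices; the returned Q is orthogonal.
eigvals, Q = np.linalg.eigh(A)
print(np.allclose(Q.T @ Q, np.eye(4)))             # True: Q^{-1} = Q^T
print(np.allclose(A, Q @ np.diag(eigvals) @ Q.T))  # True: A = Q Lam Q^{-1}
```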

So why can't $Q$ be some other orthogonal matrix not associated with the eigenvectors of $A$?

Best Answer

Let us write $Q = (q_1 \, | \, \dots \, | \, q_n)$ where $q_i$ are the columns of $Q$. Since $Q$ is invertible, we have

$$ A = Q \Lambda Q^{-1} \iff AQ = Q \Lambda $$

so the columns of $AQ$ and $Q\Lambda$ must be identical. By the definition of matrix multiplication and the fact that $\Lambda$ is diagonal, the $i$-th column of $Q\Lambda$ is $\lambda_i q_i$, while the $i$-th column of $AQ$ is $Aq_i$. Hence $A = Q \Lambda Q^{-1}$ iff $Aq_i = \lambda_i q_i$ for all $1 \leq i \leq n$. So for the identity $A = Q \Lambda Q^{-1}$ to hold, the columns of $Q$ must be eigenvectors of $A$ (and the diagonal entries of $\Lambda$ must be eigenvalues of $A$).
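The column-by-column argument above can be demonstrated directly, and so can the converse point from the question: an orthogonal matrix whose columns are not eigenvectors of $A$ fails to diagonalize it. A sketch (the matrix and rotation angle are arbitrary choices):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, Q = np.linalg.eig(A)

# Column-by-column: A q_i = lambda_i q_i for each column of Q.
for i in range(A.shape[0]):
    assert np.allclose(A @ Q[:, i], eigvals[i] * Q[:, i])

# An orthogonal matrix NOT built from eigenvectors of A: a rotation by an
# angle that does not align with A's eigenvectors (which lie at 45 degrees).
theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
D = R.T @ A @ R  # R^{-1} A R, since R is orthogonal

# D is NOT diagonal: off-diagonal entries remain.
print(np.allclose(D, np.diag(np.diag(D))))  # False
```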
