Diagonalizable by Symmetric Matrices

Tags: diagonalization, matrices

Can all real diagonalizable matrices with real eigenvalues be diagonalized by a symmetric matrix? That is, if $A$ is a real diagonalizable matrix with real eigenvalues, can we write

$$ A = S D S^{-1} $$

for some real symmetric matrix $S$ and some real diagonal matrix $D$?

Best Answer

Let us consider the more general setting in which $A$ is an $n\times n$ diagonalisable matrix over a field $\mathbb F$. By assumption, $A$ admits a diagonalisation $A=VD_1V^{-1}$. We are asking whether there exist a symmetric matrix $S$ and a diagonal matrix $D$ over $\mathbb F$ such that $VD_1V^{-1} = SDS^{-1}$. Since the two sides are similar, they have the same spectrum with multiplicities, so $D_1$ and $D$ agree up to a reordering of their diagonal entries; that is, $D_1=PDP^{-1}=PDP^T$ for some permutation matrix $P$. Thus the equation can be rewritten as $$ (VP)D(VP)^{-1} = SDS^{-1}.\tag{1} $$ We now consider two cases:
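The reordering step can be checked numerically. Below is a minimal sketch over the reals with a hypothetical $3\times3$ example, assuming NumPy: conjugating a diagonal matrix by a permutation matrix merely permutes its diagonal entries.

```python
import numpy as np

# Hypothetical example: conjugating a diagonal matrix by a permutation
# matrix P yields another diagonal matrix with the same multiset of
# diagonal entries, i.e. D_1 = P D P^T as in the reduction above.
D = np.diag([3.0, 1.0, 2.0])
P = np.eye(3)[:, [2, 0, 1]]          # a permutation matrix (P^{-1} = P^T)

D1 = P @ D @ P.T

assert np.allclose(D1, np.diag(np.diag(D1)))       # D1 is still diagonal
assert sorted(np.diag(D1)) == sorted(np.diag(D))   # same diagonal entries, reordered
```

Since $P^{-1}=P^T$ for every permutation matrix, $PDP^{-1}$ and $PDP^T$ coincide, which is why the answer writes the transpose.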

  1. $n=2$. Then $(1)$ is always solvable. This is obvious if $A$ is a scalar matrix, for then $D=\lambda I$ and we may simply take $S=I$. Suppose instead that $A$ has two distinct eigenvalues, so that all eigenspaces of $A$ are one-dimensional. Since the diagonal entries of $D$ are distinct, the only invertible matrices commuting with $D$ are the nonsingular diagonal ones; hence $(1)$ is solvable if and only if $$ S = VP\Lambda\tag{2} $$ for some nonsingular diagonal matrix $\Lambda$. Since $V$ is nonsingular, it has at most two zero entries and no zero row or column. Therefore, there always exists a permutation $P$ such that $VP$ takes one of the following forms, where $a,b,c,d\ne0$: $$ \pmatrix{a&b\\ c&d}, \ \pmatrix{0&b\\ c&d}, \ \pmatrix{a&b\\ c&0}, \ \pmatrix{0&b\\ c&0}. $$ In each case $VP\Lambda$ is symmetric when $\Lambda=\operatorname{diag}\left(1,\frac{c}{b}\right)$, which is nonsingular because $b,c\ne0$.
  2. $n\ge3$ and $\mathbb F$ has at least $n$ elements. Then $(1)$ is not always solvable. First, we claim that there exists an invertible and entrywise nonzero matrix of every order $n$. We prove this by induction. The base case $n=1$ is trivial. For the inductive step, suppose $B\in GL_{n-1}(\mathbb F)$ is entrywise nonzero. Since $\mathbb F$ has at least $n\ge3$ elements, it contains two distinct nonzero elements, say $b_1$ and $b_2$. Now $$ \det\pmatrix{B&\mathbf1^T\\ \mathbf1&b}=b\det(B)+\text{constant} $$ is a polynomial of degree one in $b$ (its leading coefficient $\det(B)$ is nonzero), so it vanishes for at most one value of $b$ and is therefore nonzero for some $b\in\{b_1,b_2\}$. This proves the claim. Now consider a diagonalisable matrix $A$ with distinct eigenvalues (such an $A$ always exists because $\mathbb F$ has at least $n$ elements) whose eigenvectors are given by the columns of $$ V=\pmatrix{1&0\\ \mathbf1&B} $$ where $B$ is an invertible and entrywise nonzero matrix of order $n-1$. As $A$ has distinct eigenvalues, the problem again boils down to solving $(2)$ for $S$, $P$ and $\Lambda$. However, should $(2)$ be solvable, $VP$ must have a symmetric zero pattern: right-multiplication by the nonsingular diagonal matrix $\Lambda$ preserves the zero pattern, and a symmetric matrix has a symmetric zero pattern. Yet this is impossible, because the first row of $VP$ always contains an off-diagonal zero entry (it has $n-1$ zeros and $n\ge3$), while every row below it is entrywise nonzero.
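The $n=2$ construction in case 1 can be verified numerically. A minimal sketch over the reals, assuming NumPy and a hypothetical $V$ of the first listed form (all entries nonzero), with $P=I$:

```python
import numpy as np

# Hypothetical example of the n = 2 construction: A = V D V^{-1} with
# distinct eigenvalues and V entrywise nonzero (the first of the four forms).
V = np.array([[1.0, 2.0],
              [3.0, 4.0]])           # a, b, c, d all nonzero
D = np.diag([5.0, -1.0])             # distinct eigenvalues
A = V @ D @ np.linalg.inv(V)

# With P = I, choose Lambda = diag(1, c/b) so that S = V P Lambda is symmetric.
b, c = V[0, 1], V[1, 0]
Lam = np.diag([1.0, c / b])
S = V @ Lam                          # columns are rescaled eigenvectors of A

assert np.allclose(S, S.T)                        # S is symmetric
assert np.allclose(S @ D @ np.linalg.inv(S), A)   # and S D S^{-1} recovers A
```

Rescaling the columns of $V$ leaves them eigenvectors of $A$, which is why $SDS^{-1}=A$ still holds.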

The remaining case, where $n\ge3$ and $\mathbb F$ has fewer than $n$ elements, is more intricate: $A$ must then have some repeated eigenvalues, so the counterexample above does not apply.