[Math] 2×2 Matrix with real entries and a complex eigenvalue can’t be normal

linear algebra

Homework question.


Let $A$ be a $2 \times 2$ matrix with real entries. Suppose that $A$ has an eigenvalue $\lambda$ whose imaginary part is nonzero. Is there an orthonormal basis of $\mathbb{C}^2$ consisting entirely of eigenvectors of $A$? Explain.


So far I've got the following:

A self-adjoint matrix has only real eigenvalues. By the spectral theorem, a linear transformation $T$ from $V$ to $V$ (over $\mathbb{C}$) is normal iff $V$ has an orthonormal basis consisting of eigenvectors of $T$. A linear transformation is normal iff it commutes with its adjoint.

With these three facts, I can state that $A$ is not self-adjoint; now I need to show that it is not normal, assuming the proposition is false (which is my belief).

I've found a couple of simple counterexamples, but I can't figure out how to make the jump from not self-adjoint to not normal (or any other method of showing it's not normal).
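For concreteness, here is a minimal numerical sketch (using numpy; the particular matrix below is just an assumed example, since the original counterexamples aren't shown) of this kind of check: a real $2 \times 2$ matrix with a non-real eigenvalue that fails the commutation test $AA^T = A^T A$.

```python
import numpy as np

# Assumed sample matrix: real 2x2 with non-real eigenvalues, not normal.
A = np.array([[0.0, -2.0],
              [1.0,  0.0]])

print(np.linalg.eigvals(A))              # approx ±1.414j, i.e. non-real

# Normality test: does A commute with its adjoint? (A is real, so A* = A^T)
print(A @ A.T)                           # [[4, 0], [0, 1]]
print(A.T @ A)                           # [[1, 0], [0, 4]]
print(np.allclose(A @ A.T, A.T @ A))     # False: A is not normal
```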

Best Answer

A matrix whose entries are real has a characteristic polynomial whose coefficients are all real. Any non-real eigenvalues must then come in complex-conjugate pairs, and the complex conjugate of an eigenvector $v$ for $\lambda$ must be an eigenvector for $\overline{\lambda}$.
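Indeed, since $A$ is real, conjugating $Av = \lambda v$ gives $A\overline{v} = \overline{\lambda}\,\overline{v}$. A minimal numpy sketch of this fact, using an assumed sample matrix:

```python
import numpy as np

# Sample real matrix with a non-real eigenvalue (assumed for illustration).
A = np.array([[0.0, -2.0],
              [1.0,  0.0]])

lam, V = np.linalg.eig(A)
l, v = lam[0], V[:, 0]                   # one non-real eigenpair

print(np.allclose(A @ v, l * v))                            # True
print(np.allclose(A @ v.conj(), l.conjugate() * v.conj()))  # True: conj(v) is an
                                                            # eigenvector for conj(lambda)
print(np.isclose(l.conjugate(), lam[1]))                    # the other eigenvalue is conj(lambda)
```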
If the matrix $A$ is normal, $v$ and $\overline{v}$ must be orthogonal to each other (eigenvectors of a normal matrix for the distinct eigenvalues $\lambda$ and $\overline{\lambda}$ are orthogonal), and orthogonality of $v$ and $\overline{v}$ says exactly that $v^T v = 0$. So in the $2 \times 2$ case the eigenvectors must be of the form $\pmatrix{t\cr \pm i t\cr}$. If $\lambda = a + i b$ is any non-real complex number, the normal matrix with normalized eigenvectors $\pmatrix{1\cr i\cr}/\sqrt{2}$ for $\lambda$ and $\pmatrix{1\cr -i\cr}/\sqrt{2}$ for $\overline{\lambda} = a - i b$ is $$ A = \frac{a + i b}{2} \pmatrix{1\cr i\cr} (1, -i) + \frac{a - i b}{2} \pmatrix{1\cr -i\cr} (1, i) = \pmatrix{a & b\cr -b & a\cr}.$$ This matrix is real, normal, and has the non-real eigenvalues $a \pm i b$, so such an orthonormal basis does exist, but only when $A$ has this form.
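A minimal numpy sketch of this construction (the values of $a$ and $b$ below are arbitrary sample choices): it rebuilds $A$ from the rank-one spectral sum above and checks that $A$ is real, normal, and has $\pmatrix{1\cr \pm i\cr}/\sqrt{2}$ as an orthonormal basis of eigenvectors.

```python
import numpy as np

a, b = 1.5, 2.0                           # arbitrary sample values, for illustration
lam = a + 1j * b

u1 = np.array([1.0,  1j]) / np.sqrt(2)    # unit eigenvector for lambda
u2 = np.array([1.0, -1j]) / np.sqrt(2)    # unit eigenvector for conj(lambda)

# Rank-one spectral sum from the answer: lambda*u1*u1^* + conj(lambda)*u2*u2^*
A = lam * np.outer(u1, u1.conj()) + lam.conjugate() * np.outer(u2, u2.conj())
print(np.round(A, 10))                    # [[ a, b], [-b, a]], imaginary parts ~ 0

# A commutes with its conjugate transpose, so it is normal
print(np.allclose(A @ A.conj().T, A.conj().T @ A))          # True

# u1, u2 are an orthonormal pair of eigenvectors
print(np.isclose(np.vdot(u1, u2), 0))                       # orthogonal
print(np.allclose(A @ u1, lam * u1),
      np.allclose(A @ u2, lam.conjugate() * u2))            # True True
```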
