[Math] $AA^T=A^TA$ and orthogonality of eigenvectors

eigenvalues-eigenvectors, linear-algebra

In his Unit 3 Exam Review, Strang gives the theorem that a matrix $A$ has orthogonal eigenvectors if and only if $AA^T=A^TA$, and then lists symmetric, antisymmetric and orthogonal matrices as meeting this condition.
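For a concrete instance of the theorem, take the $90^\circ$ rotation matrix, which is both antisymmetric and orthogonal:

$$A=\begin{pmatrix}0&-1\\1&0\end{pmatrix},\qquad AA^T=A^TA=I.$$

Its eigenvalues are $\pm i$, with eigenvectors $v_1=(1,-i)^T$ and $v_2=(1,i)^T$, and these are orthogonal in the Hermitian inner product: $\langle v_1,v_2\rangle=1\cdot\bar 1+(-i)\cdot\bar i=1-1=0$.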

However, I know that in the symmetric case it's the eigenspaces that are guaranteed to be orthogonal, not the eigenvectors.

Is it correct to assume that he meant to say eigenspaces and not eigenvectors?

EDIT

In his lecture on Symmetric Matrices and Positive Definiteness, Strang says:

So — I have a — you should say "why?" and I'll at least answer why
for case one, maybe case two, the checking the Eigen — that the
eigenvectors are perpendicular, I'll leave to, the — to the book. But
let's just realize what — well, first I have to say, it — it could
happen, like for the identity matrix — there's a symmetric matrix.
Its eigenvalues are certainly all real, they're all one for the
identity matrix. What about the eigenvectors? Well, for the identity,
every vector is an eigenvector. So how can I say they're
perpendicular? What I really mean is the — they — this word are
should really be written can be chosen perpendicular. That is, if we
have — it's the usual case. If the eigenvalues are all different,
then each eigenvalue has one line of eigenvectors and those lines are
perpendicular here. But if an eigenvalue's repeated, then there's a
whole plane of eigenvectors and all I'm saying is that in that plane,
we can choose perpendicular ones.
So that's why it's a can be chosen
part, is — this is in the case of a repeated eigenvalue where there's
some real, substantial freedom. But the typical case is different
eigenvalues, all real, one-dimensional eigenvector spaces, eigenspaces,
and all perpendicular.
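To spell out the identity-matrix example he gives: for $I_2$ every nonzero vector is an eigenvector with eigenvalue $1$, so for instance

$$u=\begin{pmatrix}1\\0\end{pmatrix}\quad\text{and}\quad w=\begin{pmatrix}1\\1\end{pmatrix}$$

are both eigenvectors but are not perpendicular. Within that two-dimensional eigenspace we can instead choose the perpendicular pair $e_1=(1,0)^T$ and $e_2=(0,1)^T$, which is exactly the "can be chosen" in his statement.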

Best Answer

I'm assuming Strang is allowing complex eigenvectors. The statement "$A$ has orthogonal eigenvectors" is a bit imprecise; a better statement would be "you can choose a basis of pairwise orthogonal eigenvectors". If you take his statement literally, then it is correct but trivial, since any normal matrix ($AA^*=A^*A$) of size at least $2\times 2$ has some pair of orthogonal eigenvectors.
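Here is a minimal sketch of why a pairwise-orthogonal eigenbasis forces normality (the converse is the substantive half of the spectral theorem): normalize the basis vectors and place them in the columns of a unitary $U$, so that $A=U\Lambda U^*$ with $\Lambda$ diagonal. Then

$$AA^*=U\Lambda\Lambda^*U^*=U\Lambda^*\Lambda U^*=A^*A,$$

since diagonal matrices commute.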

The eigenspaces for distinct eigenvalues are orthogonal, but that's different from saying there exists an orthonormal basis consisting of eigenvectors. The eigenspaces might not sum to the whole space.
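A standard example of that failure mode is the Jordan block

$$N=\begin{pmatrix}1&1\\0&1\end{pmatrix},$$

whose only eigenvalue is $1$ with the one-dimensional eigenspace $\operatorname{span}\{e_1\}$: the eigenspaces are (vacuously) pairwise orthogonal, yet they span only a line, so no eigenbasis exists at all. Consistent with the theorem, $N$ is not normal, since $NN^T=\begin{pmatrix}2&1\\1&1\end{pmatrix}$ while $N^TN=\begin{pmatrix}1&1\\1&2\end{pmatrix}$.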