[Math] Linear Algebra, proof about eigenvalues

eigenvalues-eigenvectors, linear-algebra, matrices

I've been trying to complete a proof for a while now, but I can't. It would be great if someone could finish it for me so that I can at least learn from the solution.

The problem is stated like this:

Show that $\lambda$ is an eigenvalue of $A$ if and only if $\lambda$ is an eigenvalue of $A^T$.

There's a hint and it says:

For any $\lambda$, $(A-\lambda I)^T = A^T-\lambda I$. By a theorem (which one?), $A^T-\lambda I$ is invertible if and only if $A-\lambda I$ is invertible.
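The first part of the hint I can follow, since the transpose is linear and $I^T = I$:

$$(A-\lambda I)^T = A^T - (\lambda I)^T = A^T - \lambda I^T = A^T - \lambda I.$$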

I know that there is a theorem in this book, called the Invertible Matrix Theorem, which says that if $A$ is invertible then so is its transpose. But I don't see what that has to do with anything. If $\lambda$ is an eigenvalue of $A$, then the columns of $A-\lambda I$ are linearly dependent and $A-\lambda I$ is not even invertible! Is that what the hint is all about? Well, I still don't see what that says about the solution set. To complete this proof, I think we need to show that the solution set of the homogeneous equation $(A-\lambda I)\mathbf{x} = \mathbf{0}$ is the same as that of the corresponding equation for $A^T$.

Many thanks.

Best Answer

Use the definition of eigenvalue and the Invertible Matrix Theorem mentioned in your post: $\lambda$ is an eigenvalue of $A$ if and only if $A-\lambda I$ is not invertible, if and only if $A^T-\lambda I = (A-\lambda I)^T$ is not invertible (a matrix is invertible exactly when its transpose is), if and only if $\lambda$ is an eigenvalue of $A^T$.
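If determinants are available to you, the middle equivalence can also be seen from $\det(M^T) = \det(M)$, which shows that $A$ and $A^T$ have the same characteristic polynomial:

$$\det(A^T - \lambda I) = \det\big((A-\lambda I)^T\big) = \det(A-\lambda I),$$

so $\det(A-\lambda I) = 0$ exactly when $\det(A^T-\lambda I) = 0$. Note that this says nothing about the solution sets of the two homogeneous equations being equal; in general they are not, and the proof does not need them to be.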
