[Math] Prove that if $A$ has $n$ linearly independent eigenvectors, then so does $A^T$.

linear algebra

Prove that if $A$ has $n$ linearly independent eigenvectors, then so does $A^T$.

My approach:

Suppose $A$ has $n$ linearly independent eigenvectors. Then there exists an invertible matrix $P$ and a diagonal matrix $D$ such that $A=PDP^{-1}$. So $A^T=(PDP^{-1})^T=(P^{-1})^TD^TP^T=(P^T)^{-1}D^TP^T.$

Since $D$ is a diagonal matrix, $D^T=D$.
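As a quick numerical sanity check (an illustration, not a proof), we can verify the transposed factorization $A^T=(P^T)^{-1}DP^T$ on a concrete diagonalizable matrix; the example matrix here is arbitrary:

```python
import numpy as np

# A concrete diagonalizable matrix (distinct eigenvalues 2 and 3)
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

eigvals, P = np.linalg.eig(A)   # columns of P are eigenvectors of A
D = np.diag(eigvals)

# A = P D P^{-1}
assert np.allclose(A, P @ D @ np.linalg.inv(P))

# Transposing the factorization (and using D^T = D):
# A^T = (P^T)^{-1} D P^T
assert np.allclose(A.T, np.linalg.inv(P.T) @ D @ P.T)
```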

I'm having trouble proceeding from here: how can I show that the eigenvectors remain linearly independent after transposing?

I was thinking of referring to the Invertible Matrix Theorem:

Since the columns of $P$ are eigenvectors and $P$ is invertible, $P^T$ is invertible as well, and hence its columns (the transposed rows of $P$) are linearly independent.

Best Answer

You're done. The statement that $A = P D P^{-1}$ can be rewritten as $$ AP = PD, $$ which, read column by column, says that the $i$th column of $P$ is an eigenvector of $A$ with eigenvalue $d_{ii}$.

Now let's conjure up a corresponding statement for $A^T$, i.e., something of the form $$ A^T M = M E $$ where $E$ is diagonal, and we'll be done.

Letting $Q = P^{-1}$, you have \begin{align} A^T&=(P^{-1})^T D^T P^T \\ &=Q^T D^T (Q^{-1})^T \\ &=Q^T D (Q^T)^{-1} \end{align} (using $D^T = D$ and $(Q^{-1})^T = (Q^T)^{-1}$), so that $$ A^T Q^T = Q^T D. $$

Reading this one column at a time, it says that the $i$th column of $Q^T$ is an eigenvector of $A^T$ with eigenvalue $d_{ii}$. And since $Q^T$ is invertible, its $n$ columns are linearly independent, which is exactly what you needed.
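The conclusion can also be checked numerically (again an illustration on a sample matrix, not a proof): the columns of $Q^T = (P^{-1})^T$ are eigenvectors of $A^T$, and they are linearly independent because $Q^T$ is invertible:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

eigvals, P = np.linalg.eig(A)   # A = P D P^{-1}
Q = np.linalg.inv(P)

# A^T Q^T = Q^T D, read column by column: the i-th column of Q^T
# is an eigenvector of A^T with eigenvalue d_ii.
for i, lam in enumerate(eigvals):
    v = Q.T[:, i]
    assert np.allclose(A.T @ v, lam * v)

# The n columns of Q^T are linearly independent, since Q^T is invertible.
assert np.linalg.matrix_rank(Q.T) == A.shape[0]
```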