Is the proof that matrices are diagonalizable iff they have a basis of eigenvectors correct

linear algebra

Question: Show that an $n \times n$ matrix $A$ is diagonalizable if and only if $\mathbb R ^n$ has a basis of eigenvectors of $A$.

My try: If $A$ is diagonalizable, there exists a nonsingular matrix $P$ such that $P^{-1}AP=D$, where $D$ is a diagonal matrix containing the eigenvalues of $A$, with corresponding eigenvectors as the columns of $P$. Also, $P^{-1}AP=D$ means that $AP=PD$, that is, $A$ acts on the basis formed by the columns of $P$ as the diagonal matrix $D$. If, however, $\mathbb R^n$ did not have a basis of eigenvectors of $A$, it would not be possible to form an invertible $P$ whose columns are eigenvectors. Thus, $\mathbb R^n$ must have a basis of eigenvectors of $A$. Hence proved.

Best Answer

If $\mathbb R ^n$ has a basis $\{p_1,\dots,p_n\}$ of eigenvectors of $A$, you can store them as columns in the matrix $P\in\mathbb{R}^{n\times n}$. Since $\{p_1,\dots,p_n\}$ is a basis, the columns of $P$ are linearly independent, so $P$ is invertible. (Note that a basis of eigenvectors need not be orthonormal, so in general $P^T\ne P^{-1}$; invertibility is all we need here.)

Since each $p_i$ is an eigenvector of $A$, say with eigenvalue $\lambda_i$, it holds that $Ap_i=\lambda_ip_i$ for $i=1,\dots,n$. By the definition of matrix multiplication, this implies $AP=PD$, where $D$ is the diagonal matrix storing $\lambda_i$ at the $i$-th diagonal entry.

Multiplying $AP=PD$ from the left by $P^{-1}$ yields $P^{-1}AP=D$, so $A$ is diagonalizable.

Conversely, if $A$ is diagonalizable, i.e. $P^{-1}AP=D$ for some invertible $P$ and diagonal $D$, then $AP=PD$ read column by column gives $Ap_i=\lambda_ip_i$, so each column $p_i$ of $P$ is an eigenvector of $A$. Since $P$ is invertible, its columns are linearly independent and hence form a basis of $\mathbb R^n$.
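As a numerical sanity check, here is a small sketch with a made-up $2\times 2$ example (the matrices below are my own illustration, not from the question). Note the eigenvector basis chosen here is deliberately *not* orthonormal, yet $P^{-1}AP=D$ still holds:

```python
import numpy as np

# Hypothetical example: eigenvectors (1,0) and (1,1) with eigenvalues 2 and 3.
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])        # columns = basis of eigenvectors (not orthonormal)
D = np.diag([2.0, 3.0])           # eigenvalues on the diagonal
A = P @ D @ np.linalg.inv(P)      # build A from its eigendecomposition

# A p_i = lambda_i p_i column by column, i.e. AP = PD ...
assert np.allclose(A @ P, P @ D)
# ... and multiplying by P^{-1} on the left recovers D, so A is diagonalizable.
assert np.allclose(np.linalg.inv(P) @ A @ P, D)
```

Invertibility of $P$, not orthonormality, is what makes the final step $P^{-1}AP=D$ work.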