[Math] Finding the original matrix A from a given set of Eigenvalues and Eigenvectors

eigenvalues-eigenvectors, linear-algebra

Say you're given a set of eigenvalues and eigenvectors; is it always possible to determine the original matrix $A$?

I recognize that if a matrix $A$ is diagonalizable, the geometric multiplicity of each eigenvalue equals its algebraic multiplicity, so you can form the matrix of eigenvectors, call it $Q$, and the diagonal matrix $D$ (whose entries are the eigenvalues corresponding to the eigenvector columns of $Q$), which gives you the result

$A = QDQ^{-1}$

but assuming we aren't given the original matrix $A$, is it possible to "reverse" this process and find $A$? Thanks

Best Answer

Assuming $A$ is $n\times n$, if you are given $n$ linearly independent eigenvectors and the corresponding eigenvalues, then the $QDQ^{-1}$ formulation works.
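As a concrete check, here is a minimal sketch (assuming NumPy and made-up eigenpairs, not data from the question) that rebuilds $A$ from a full set of eigenvectors and eigenvalues and then verifies the result with `numpy.linalg.eig`:

```python
import numpy as np

# Hypothetical eigenpairs: eigenvalues 3 and 1 with eigenvectors (1, 1) and (1, -1).
eigenvalues = np.array([3.0, 1.0])
Q = np.array([[1.0,  1.0],
              [1.0, -1.0]])       # eigenvectors as columns of Q

# Reconstruct A = Q D Q^{-1}.
D = np.diag(eigenvalues)
A = Q @ D @ np.linalg.inv(Q)
print(A)                          # [[2. 1.]
                                  #  [1. 2.]]

# Sanity check: eig(A) should recover the same eigenvalues.
recovered_values, _ = np.linalg.eig(A)
print(np.sort(recovered_values))  # [1. 3.]
```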

If the matrix does not have such a spanning set of eigenvectors, then you need its generalized eigenvectors (cf. https://en.wikipedia.org/wiki/Generalized_eigenvector) to construct a matrix $P$ and the almost-diagonal Jordan normal form $J$ such that $A=PJP^{-1}$.

Jordan normal forms (cf. https://en.wikipedia.org/wiki/Jordan_normal_form) exist for all $n\times n$ matrices over the field $\mathbb{C}$, so the answer is "yes" if you are also given the generalized eigenvectors and "no" if you are only given the ordinary eigenvectors of a non-diagonalizable matrix.
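To illustrate the non-diagonalizable case, here is a small sketch (assuming SymPy and a made-up $2\times 2$ Jordan block; the matrices are hypothetical, not from the question) showing that the same product $PJP^{-1}$ recovers $A$ once the columns of $P$ form a chain of generalized eigenvectors:

```python
from sympy import Matrix

# Hypothetical example: a single Jordan block for the eigenvalue 2.
J = Matrix([[2, 1],
            [0, 2]])

# Columns of P: an eigenvector and a generalized eigenvector (chosen for illustration).
P = Matrix([[1, 0],
            [1, 1]])

# Reconstruct A from the generalized eigenvectors and the Jordan form.
A = P * J * P.inv()
print(A)             # Matrix([[1, 1], [-1, 3]])

# Sanity check: SymPy's jordan_form returns (P', J') with A == P' * J' * P'.inv().
P2, J2 = A.jordan_form()
print(J2)            # Matrix([[2, 1], [0, 2]])
```

The resulting $A$ has the single eigenvalue $2$ with only a one-dimensional eigenspace, so the plain eigenvectors alone would not be enough to rebuild it; the generalized eigenvector in the second column of $P$ supplies the missing information.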