[Math] Eigenvalues and eigenvectors – uniqueness

eigenvalues-eigenvectors, linear-algebra

Suppose I have an $n\times n$ matrix $A$ with $n$ linearly independent eigenvectors.

Clearly more than one matrix can share the same eigenvectors and eigenvalues.

However, I also know that I can diagonalize $A$ as $D = P^{-1}AP$, where $D$ is a diagonal matrix whose diagonal entries are the eigenvalues of $A$ and the columns of $P$ are the corresponding eigenvectors.
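As a concrete check (an illustrative example of mine, not from the question), take

$$A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}, \qquad A\begin{pmatrix}1\\1\end{pmatrix} = 3\begin{pmatrix}1\\1\end{pmatrix}, \qquad A\begin{pmatrix}1\\-1\end{pmatrix} = 1\cdot\begin{pmatrix}1\\-1\end{pmatrix},$$

so with

$$P = \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}, \qquad P^{-1} = \tfrac{1}{2}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}, \qquad D = P^{-1}AP = \begin{pmatrix} 3 & 0 \\ 0 & 1 \end{pmatrix}.$$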

In the other direction, if I know the eigenvalues and eigenvectors of a matrix $A$, then I can form the matrices $P$ and $D$ as above and recover $A = PDP^{-1}$. This seems to suggest that a full set of eigenvalues and eigenvectors determines a single matrix…
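To spell out the step the question is gesturing at (my own restatement of the algebra): multiplying $D = P^{-1}AP$ by $P$ on the left and $P^{-1}$ on the right gives

$$PDP^{-1} = P\left(P^{-1}AP\right)P^{-1} = \left(PP^{-1}\right)A\left(PP^{-1}\right) = A,$$

so once the $n$ eigenpairs $(\lambda_i, v_i)$ are fixed (with $v_1,\dots,v_n$ linearly independent), the matrix $A = PDP^{-1}$ is completely determined.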

So when are eigenvalues and eigenvectors unique, and when are they not? What am I missing?

Best Answer

Let me try a slightly more abstract answer.

If you know $n$ distinct eigenvalues of a linear transformation $T$ and an eigenvector for each (that is, the eigenvalue-eigenvector pairing, not just the set of eigenvectors), then those eigenvectors will be linearly independent and so form a basis. In that basis the matrix of the transformation will be diagonal.
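A minimal sketch of why distinct eigenvalues force independence (the standard argument, which the answer leaves implicit): for the two-eigenvalue case, suppose $Tv_1 = \lambda_1 v_1$, $Tv_2 = \lambda_2 v_2$ with $\lambda_1 \neq \lambda_2$, and $c_1 v_1 + c_2 v_2 = 0$. Applying $T - \lambda_2 I$ annihilates the $v_2$ term:

$$(T - \lambda_2 I)(c_1 v_1 + c_2 v_2) = c_1(\lambda_1 - \lambda_2)v_1 = 0,$$

and since $\lambda_1 \neq \lambda_2$ and $v_1 \neq 0$, this forces $c_1 = 0$, hence also $c_2 = 0$. The case of $n$ distinct eigenvalues follows by induction.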

If you change basis using $P$ as in your question, then in the new coordinate system $T$ is represented by the matrix $P^{-1}AP$. But it is still the same transformation, with the same eigenvalues and eigenvectors.
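To make "same eigenvectors" precise in coordinates (my own elaboration): if $Av = \lambda v$ and $B = P^{-1}AP$, then $w = P^{-1}v$, the coordinate vector of $v$ in the new basis, satisfies

$$Bw = \left(P^{-1}AP\right)P^{-1}v = P^{-1}Av = \lambda P^{-1}v = \lambda w,$$

so $B$ has the same eigenvalue $\lambda$, and its eigenvector $w$ is just $v$ written in the new coordinates.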

So matrices can have the same eigenvectors and eigenvalues when they represent the same linear transformation, just written in different coordinate systems. The particular coordinate system that writes it as a diagonal matrix is the one you'd rather use since it makes the calculations easier and more transparent.
