[Math] Existence and uniqueness of the eigen decomposition of a square matrix

eigenvalues-eigenvectors, linear-algebra, matrices

I'm confused about the sufficient conditions for the existence and uniqueness of the eigen decomposition of a square matrix.

Consider a matrix $A$ of dimension $m\times m$, a matrix $B$ of dimension $m\times m$, and a diagonal matrix $D$ of dimension $m\times m$.

Assumption 1: $B$ invertible

Assumption 2: The diagonal elements of $D$ are all distinct

Assumption 3: $A=BDB^{-1}$ where $B^{-1}$ exists by Assumption 1

Questions:

(1) Does Assumption 3 mean that $BDB^{-1}$ is the eigen decomposition of $A$? In other words, is Assumption 3 equivalent to saying that the columns of $B$ are the eigenvectors of $A$ and the diagonal elements of $D$ are the eigenvalues of $A$? Or do we need other assumptions to state that?

My doubt is this: if Assumption 3 means that the columns of $B$ are the eigenvectors of $A$ and the diagonal elements of $D$ are the eigenvalues of $A$, then, since $B$ is invertible, the eigenvectors of $A$ should be linearly independent, and hence, it seems, $A$ should be invertible (which is not among my assumptions).

(2) From what I have read in some sources, Assumptions 2 and 3 imply that the eigen decomposition of $A$ is unique [up to a left multiplication of $B$ by an invertible diagonal matrix and up to an ordering of the eigenvalues]. What exactly does "unique" mean? My thought was that it means that there are no other matrices $E, F$, with $F$ diagonal, such that $A = EFE^{-1}$. But if that is right, uniqueness would be necessary to guarantee that the columns of $B$ are the eigenvectors of $A$ and the diagonal elements of $D$ are the eigenvalues of $A$; in other words, it would be "embedded" in saying that $BDB^{-1}$ is the eigen decomposition of $A$. Could you clarify this point?

Best Answer

(1) Let $e_i$ be the column vector with $i$th entry $1$ and all other entries $0$. Then $$A(Be_i) = BDB^{-1}(Be_i) = B(De_i) = B(d_{ii}e_i) = d_{ii}(Be_i).$$ Since $B$ is invertible, $Be_i \neq 0$, so $Be_i$ is an eigenvector of $A$ with eigenvalue $d_{ii}$.
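Here is a minimal numerical sketch of this computation in NumPy, with a made-up invertible $B$ and diagonal $D$ chosen purely for illustration. Setting one diagonal entry of $D$ to $0$ also bears on the doubt in question (1): it makes $A$ singular even though $B$ is invertible, so Assumption 3 does not force $A$ to be invertible.

```python
import numpy as np

# Made-up example data (any invertible B and any diagonal D will do).
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])   # invertible: det(B) = 2
D = np.diag([3.0, -1.0, 0.0])     # distinct diagonal entries, one of them 0

A = B @ D @ np.linalg.inv(B)      # Assumption 3: A = B D B^{-1}

# Each column B e_i of B is an eigenvector of A with eigenvalue d_ii.
for i in range(3):
    v = B[:, i]                   # B e_i, nonzero since B is invertible
    assert np.allclose(A @ v, D[i, i] * v)

# A itself is singular here (it has eigenvalue 0) even though B is
# invertible, so A = B D B^{-1} does not require A to be invertible.
print(np.linalg.det(A))           # approximately 0
```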

(2) As is explained in the parenthetical, "unique" means

  1. In $A = EFE^{-1}$, the diagonal elements of $F$ will be the same as those of $D$, though not necessarily in the same order. (Since $F$ is assumed diagonal, its off-diagonal elements are all $0$, just like $D$'s.)
  2. If the diagonal elements of $F$ are arranged to match $D$ (i.e., if $F = D$), then $E = BQ$ for some invertible diagonal matrix $Q$ (i.e., a diagonal matrix none of whose diagonal elements is $0$). (And note that it is right-multiplication, not left; left-multiplication works on rows, not columns. See the sketch just after this list.)
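The right-multiplication point in item 2 can be seen concretely: since diagonal matrices commute, $(BQ)D(BQ)^{-1} = BQDQ^{-1}B^{-1} = BDB^{-1} = A$. A minimal sketch, reusing the made-up $B$ and $D$ from above:

```python
import numpy as np

B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
D = np.diag([3.0, -1.0, 0.0])
A = B @ D @ np.linalg.inv(B)

# Right-multiplying B by an invertible diagonal Q rescales its columns,
# i.e. rescales each eigenvector; the decomposition is unchanged.
Q = np.diag([2.0, -0.5, 7.0])
E = B @ Q
assert np.allclose(E @ D @ np.linalg.inv(E), A)

# Left-multiplication rescales rows instead and (generically) breaks it.
E_bad = Q @ B
print(np.allclose(E_bad @ D @ np.linalg.inv(E_bad), A))   # False here
```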

This does follow from (1), because the relation $A = EFE^{-1}$ implies that the diagonal elements of $F$ are also eigenvalues of $A$, with the column vectors of $E$ as eigenvectors. The fact that $E$ is invertible implies that $\{Ee_i\}$ spans the whole space. Therefore every eigenvalue of $A$ must be represented among the diagonal elements of $F$, just as in $D$. Since by Assumption 2 the diagonal elements of $D$ are all distinct, the diagonal elements of $F$ must be exactly the same as those of $D$.

Now if we arrange the eigenvalues in $F$ in the same order as in $D$, then $F = D$ and $A = EDE^{-1}$. Because the eigenvalues are distinct, all of the eigenspaces are one-dimensional. So the $i$th columns of $E$ and $B$ are both eigenvectors for the same eigenvalue, and hence must be parallel. Collecting the scalar multipliers together as the diagonal elements of a diagonal matrix $Q$ gives $E = BQ$.
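To make the whole argument concrete, here is a sketch that uses NumPy's `np.linalg.eig` merely as a source of some other decomposition $A = EFE^{-1}$, reorders the columns of $E$ so that $F = D$, and then checks that $Q = B^{-1}E$ is an invertible diagonal matrix. The matrices $B$ and $D$ are the same made-up examples as before.

```python
import numpy as np

B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
D = np.diag([3.0, -1.0, 0.0])
A = B @ D @ np.linalg.inv(B)

# Some other decomposition A = E F E^{-1}, computed independently.
eigvals, E = np.linalg.eig(A)

# Reorder the columns of E so that the eigenvalues line up with D;
# this is possible because the eigenvalues match up to order (point 1).
order = [int(np.argmin(np.abs(eigvals - d))) for d in np.diag(D)]
E = E[:, order]
assert np.allclose(eigvals[order], np.diag(D))   # now F = D

# Then Q = B^{-1} E is diagonal with nonzero diagonal, i.e. E = B Q.
Q = np.linalg.inv(B) @ E
assert np.allclose(Q, np.diag(np.diag(Q)))       # off-diagonal ~ 0
assert not np.isclose(np.diag(Q), 0.0).any()     # so Q is invertible
```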
