[Math] Is matrix diagonalization unique?

linear algebra

From the following statement, it seems that matrix diagonalization is just eigendecomposition.

Diagonalizing a matrix is also equivalent to finding the matrix's eigenvalues, which turn out to be precisely the entries of the diagonalized matrix. Similarly, the eigenvectors make up the new set of axes corresponding to the diagonal matrix.

http://mathworld.wolfram.com/MatrixDiagonalization.html
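Concretely, I take this to mean the standard eigendecomposition: assuming $A$ is an $n \times n$ matrix with $n$ linearly independent eigenvectors, diagonalizing it amounts to writing

$$A = P D P^{-1},$$

where the columns of $P$ are the eigenvectors and the diagonal entries of $D$ are the corresponding eigenvalues.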

However, from what I have learned, the Spectral Theorem is the result closest to this conclusion. How is the spectral theorem related to this statement, or is there some other theorem that grants it?

Spectral Theorem: Suppose that $V$ is a complex inner-product space and $T \in L(V)$. Then $V$ has an orthonormal basis consisting of eigenvectors of $T$ if and only if $T$ is normal.

Best Answer

Matrix diagonalization is more general than the spectral theorem. For instance, you may not be working in an inner product space, and it may still be helpful to diagonalize a matrix. Not every matrix can be diagonalized, though; for instance,

$$\left[\begin{matrix} 1 & 1 \\ 0 & 1 \end{matrix}\right]$$

has eigenvalues 1 and 1, but cannot be diagonalized.
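One way to see this: for the eigenvalue $\lambda = 1$, the eigenvector equation reduces to

$$\left(\left[\begin{matrix} 1 & 1 \\ 0 & 1 \end{matrix}\right] - I\right)\left[\begin{matrix} v_1 \\ v_2 \end{matrix}\right] = \left[\begin{matrix} v_2 \\ 0 \end{matrix}\right] = \left[\begin{matrix} 0 \\ 0 \end{matrix}\right],$$

so $v_2 = 0$ and every eigenvector is a multiple of $(1,0)^T$. With only a one-dimensional eigenspace there is no basis of eigenvectors, hence no diagonalization.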

The spectral theorem tells you that in a certain situation, you are guaranteed to be able to diagonalize. Even better, the eigenvectors have some extra structure: they are orthogonal to each other.
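As a small worked example (using the standard inner product on $\mathbb{R}^2$), the symmetric, hence normal, matrix

$$\left[\begin{matrix} 2 & 1 \\ 1 & 2 \end{matrix}\right]$$

has eigenvalues $3$ and $1$ with eigenvectors $(1,1)^T$ and $(1,-1)^T$, which are orthogonal; normalizing them gives an orthonormal eigenbasis, exactly as the spectral theorem promises.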

If a matrix can be diagonalized, its diagonal form is unique up to a permutation of the diagonal entries. This is because the diagonal entries must be exactly the eigenvalues, each appearing according to its multiplicity. For instance,

$$\left[\begin{matrix} 1 & 0 & 0\\ 0 & 2 & 0 \\ 0 & 0 & 1 \end{matrix}\right] \text{ and } \left[\begin{matrix} 1 & 0 & 0\\ 0 & 1 & 0 \\ 0 & 0 & 2 \end{matrix}\right]$$

are examples of two different ways to diagonalize the same matrix.
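The permutation freedom can be made explicit: reordering the eigenvector columns of $P$ permutes the diagonal entries of $D$. If $A = P D P^{-1}$ and $\Pi$ is any permutation matrix, then

$$A = (P\Pi)\left(\Pi^{T} D\, \Pi\right)(P\Pi)^{-1},$$

and $\Pi^{T} D\, \Pi$ is simply $D$ with its diagonal entries reordered, which is why both matrices above can arise as diagonal forms of the same matrix.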
