[Math] Spectral Decomposition Theorem for Symmetric Matrices Converse

eigenvalues-eigenvectors, linear-algebra, svd

In my class, it was stated that any $n \times n$ symmetric matrix $\mathbf{X}$ may be written as $$\mathbf{X} = \mathbf{P}\boldsymbol{\Lambda}\mathbf{P}^{\prime}$$
where $\mathbf{P}\mathbf{P}^{\prime} = \mathbf{P}^{\prime}\mathbf{P} = \mathbf{I}$ and $\boldsymbol{\Lambda} = \text{Diag}(\lambda_1, \dots, \lambda_n)$, with $\lambda_1, \dots, \lambda_n$ the eigenvalues of $\mathbf{X}$. It was also stated that the eigenvalues of a symmetric matrix $\mathbf{X}$ are non-negative if and only if $\mathbf{X}$ is non-negative definite.

My question is, though, if we have a matrix $$\mathbf{Y} = \mathbf{P}\boldsymbol{\Lambda}\mathbf{P}^{\prime}$$
such that $\boldsymbol{\Lambda}$ is a diagonal matrix with non-negative entries and $\mathbf{P}\mathbf{P}^{\prime} = \mathbf{P}^{\prime}\mathbf{P} = \mathbf{I}$, does it follow that $\mathbf{Y}$ has eigenvalues given by the diagonal entries of $\boldsymbol{\Lambda}$ and is thus symmetric and non-negative definite?

Note that I wasn't given a proof of the spectral decomposition theorem in class, and we're only to use the result. But I've thought about trying to prove this claim.

Symmetry isn't obvious, since $\mathbf{P}$ isn't necessarily equal to $\mathbf{P}^{\prime}$. As for identifying the eigenvalues, I'm not sure how to proceed either.

For my purposes, a yes-no answer will suffice, but proofs are appreciated.

Best Answer

Yes! Note that $P' = P^{-1}$. In general, the eigenvalues of $P \Lambda P^{-1}$ are the same as the eigenvalues of $\Lambda$, even if $\Lambda$ is not diagonal and $P$ is not orthogonal. To see this, note that their characteristic polynomials are the same:

$$\det(tI - P \Lambda P^{-1}) = \det (tPIP^{-1} - P\Lambda P^{-1}) = \det( P(tI-\Lambda)P^{-1}) = \det (P) \det (tI-\Lambda) \det(P^{-1}) = \det (tI-\Lambda).$$
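As an illustrative numerical sanity check (not part of the proof), one can verify with NumPy that a similarity transform preserves the characteristic polynomial, and hence the eigenvalues, even when $P$ is not orthogonal and $\Lambda$ is not diagonal; the matrix sizes and random seed below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# A generic (invertible, non-orthogonal) P and a non-diagonal Lambda.
P = rng.normal(size=(4, 4))
Lam = rng.normal(size=(4, 4))
Y = P @ Lam @ np.linalg.inv(P)

# Similar matrices share a characteristic polynomial,
# so their eigenvalues coincide.
print(np.allclose(np.poly(Y), np.poly(Lam)))  # True
```

Comparing characteristic-polynomial coefficients via `np.poly` sidesteps the need to match up individually computed (possibly complex) eigenvalues.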


For symmetry, all we actually need is $\Lambda' = \Lambda$, which holds because $\Lambda$ is diagonal (orthogonality of $P$ is not required for this step). Then, $$(P\Lambda P')' = (P')' \Lambda' P' = P \Lambda P'.$$
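Putting the two pieces together, here is a small NumPy check of the full claim (again only illustrative; the dimension and seed are arbitrary): build an orthogonal $P$ from a QR factorization, pick a non-negative diagonal $\Lambda$, and confirm that $Y = P\Lambda P'$ is symmetric with the entries of $\Lambda$ as its eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(1)

# Orthogonal P via QR; nonnegative diagonal entries for Lambda.
P, _ = np.linalg.qr(rng.normal(size=(5, 5)))
lam = rng.uniform(0, 10, size=5)
Y = P @ np.diag(lam) @ P.T

print(np.allclose(Y, Y.T))                    # True: Y is symmetric
# eigvalsh returns eigenvalues in ascending order.
print(np.allclose(np.linalg.eigvalsh(Y), np.sort(lam)))  # True
print(np.all(np.linalg.eigvalsh(Y) >= -1e-12))           # True: nonneg. definite
```

Because `Y` is symmetric, `eigvalsh` is the appropriate (and numerically stable) eigenvalue routine here.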
