Linear Algebra – Why Can a Hermitian Matrix $A$ Be Written as $\sum_{j=1}^{N} \lambda_j|u_j\rangle\langle u_j|$?

Tags: linear-algebra, quantum-mechanics

I've seen this being used in several quantum mechanics texts but I'm not sure of the reason:

Why are we able to write an $N\times N$ Hermitian matrix $A$
with eigenvalues $\lambda_j$ and eigenvectors $|u_j\rangle$ in the
form $A = \sum_{j = 1}^{N} \lambda_j|u_j\rangle\langle u_j|$ (I'm using Dirac notation)? Also, is this fact true only for Hermitian matrices?

Could someone point me towards a proof?

Best Answer

If you have learned that every Hermitian matrix $\boldsymbol A \in \mathrm M_N(\mathbb C)$ is unitarily similar to a diagonal matrix, then the conclusion follows quickly. Write $\boldsymbol A = \boldsymbol U \boldsymbol {DU}^*$, where $$\boldsymbol D = \mathrm{diag} (\lambda_j)_1^N, \quad \boldsymbol U = \begin{bmatrix} \boldsymbol v_1 & \cdots &\boldsymbol v_N \end{bmatrix} $$ and $(\boldsymbol v_j)_1^N$ are the eigenvectors corresponding to the $\lambda_j$, written as $N \times 1$ matrices. Then $$ \boldsymbol A = \begin{bmatrix} \boldsymbol v_1 &\boldsymbol v_2 & \cdots& \boldsymbol v_N \end{bmatrix} \begin{bmatrix} \lambda_1 & & &\\ & \lambda_2 &&\\ &&\ddots &\\ &&&\lambda_N \end{bmatrix} \begin{bmatrix} \boldsymbol v_1^* \\ \boldsymbol v_2^*\\ \vdots \\\boldsymbol v_N^* \end{bmatrix}, $$ and applying block matrix multiplication gives $$ \boldsymbol A = \sum_{j=1}^N \lambda_j \boldsymbol v_j \boldsymbol v_j^*, $$ which is exactly the claimed expansion in bra-ket notation: $\boldsymbol v_j = |u_j\rangle$ and $\boldsymbol v_j^* = \langle u_j|$.
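As a quick numerical sanity check (not part of the proof), here is a sketch using NumPy: build an arbitrary Hermitian matrix, diagonalize it with `numpy.linalg.eigh`, and rebuild it as the sum of rank-one terms $\lambda_j \boldsymbol v_j \boldsymbol v_j^*$. The specific matrix and seed are arbitrary choices for illustration.

```python
import numpy as np

# An arbitrary 4x4 Hermitian matrix (values chosen at random)
rng = np.random.default_rng(0)
M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
A = (M + M.conj().T) / 2  # Hermitian part of M

# eigh returns real eigenvalues and an orthonormal set of
# eigenvectors as the columns of U (so U is unitary)
eigvals, U = np.linalg.eigh(A)

# Rebuild A as sum_j lambda_j |u_j><u_j|, i.e. lambda_j v_j v_j^*
A_rebuilt = sum(lam * np.outer(U[:, j], U[:, j].conj())
                for j, lam in enumerate(eigvals))

print(np.allclose(A, A_rebuilt))  # True
```

Note that `np.outer(U[:, j], U[:, j].conj())` is precisely the outer product $|u_j\rangle\langle u_j|$ as an $N \times N$ matrix.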

Moreover, the decomposition is not exclusive to Hermitian matrices: whenever a matrix is unitarily diagonalizable, i.e. unitarily similar to some diagonal matrix, the same argument yields such a decomposition (with possibly complex $\lambda_j$). By the spectral theorem, the matrices with this property are exactly the normal matrices, those satisfying $\boldsymbol A \boldsymbol A^* = \boldsymbol A^* \boldsymbol A$.