[Math] Is there some sort of relationship between SVD decomposition and diagonalizability

linear-algebra, matrices

If $L$ is a linear operator acting on a Hilbert space $H$ of dimension $n$ (that is, $L: H \to H$), then I know the following:

  1. If $L$ is a normal operator, then in some orthonormal basis $B$ the matrix representation $[L]_B$ of $L$ is diagonal.
  2. If the sum of the dimensions of the eigenspaces of $L$ equals $n$, then in some (possibly non-orthonormal) basis the matrix representation of $L$ is diagonal.
  3. In general, for any $L$, if its matrix representation in a basis $B$ is $[L]_B$, then $[L]_B = UDV$, where $U,V$ are unitary matrices and $D$ is a diagonal matrix (the SVD).

I can see that points $1$ and $2$ coincide when $L$ is a normal operator. Is there any other relationship between the SVD decomposition and diagonalizability?
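As a concrete numerical illustration of point $3$ (a sketch, not part of the original question), even a matrix that is not diagonalizable at all, such as the $2\times 2$ Jordan block, still has an SVD:

```python
import numpy as np

# A Jordan block: not diagonalizable (only one eigenvector for eigenvalue 0),
# yet it still admits a singular value decomposition.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])

# numpy's convention: A = U @ diag(s) @ Vh, with U, Vh unitary
U, s, Vh = np.linalg.svd(A)

# Reconstruct A from its SVD factors
A_reconstructed = U @ np.diag(s) @ Vh
print(np.allclose(A, A_reconstructed))  # True
print(s)                                # singular values: [1. 0.]
```

So diagonalizability of $A$ is irrelevant to the existence of its SVD, which is exactly the point the answer below elaborates on.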

Best Answer

I think the best way to describe the relationship between the SVD of a matrix (I'll just use $A$) and diagonalizability, and what makes it possible for every matrix to have an SVD, is that the SVD is more closely related to the eigendecompositions of $AA^*$ and $A^*A$ (these matrices are positive semidefinite and are therefore always unitarily diagonalizable) than to $A$ itself. Notice that for $A=UDV$ we have \begin{equation} AA^*=UDVV^*D^*U^*=UD^2U^*\end{equation} and similarly $A^*A=V^*D^2V$. So the diagonal entries of $D$ are the square roots of the eigenvalues of $AA^*$ (equivalently, of $A^*A$). Furthermore, the columns of $U$ are eigenvectors of $AA^*$, and similarly the columns of $V^*$ are eigenvectors of $A^*A$.
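The two identities above can be checked numerically; the following sketch uses a random complex matrix (note that numpy's `svd` returns `Vh`, which plays the role of $V$ in the answer's $A=UDV$ convention):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

U, s, Vh = np.linalg.svd(A)  # A = U @ diag(s) @ Vh

# A A* and A* A are Hermitian positive semidefinite, so eigvalsh applies.
# Sort eigenvalues in descending order to match the singular-value ordering.
eig_AAstar = np.sort(np.linalg.eigvalsh(A @ A.conj().T))[::-1]
eig_AstarA = np.sort(np.linalg.eigvalsh(A.conj().T @ A))[::-1]

# The singular values are the square roots of those eigenvalues.
print(np.allclose(s**2, eig_AAstar))  # True
print(np.allclose(s**2, eig_AstarA))  # True

# Each column u_i of U satisfies (A A*) u_i = s_i^2 u_i,
# i.e. the columns of U are eigenvectors of A A*.
for i in range(4):
    print(np.allclose(A @ A.conj().T @ U[:, i], s[i]**2 * U[:, i]))  # True
```

Comparing $s_i^2$ to the eigenvalues (rather than $s_i$ to their square roots) avoids taking roots of tiny negative values that can appear from floating-point round-off.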