[Math] Is $U=V$ in the SVD of a symmetric positive semidefinite matrix

linear-algebra, matrices, positive-semidefinite, svd, symmetric-matrices

Consider the SVD of matrix $A$:

$$A = U \Sigma V^\top$$

If $A$ is a symmetric, positive semidefinite real matrix, is there a guarantee that $U = V$?

Second question (out of curiosity): what is the weakest condition on $A$ that guarantees $U = V$?

Best Answer

Here is an attempt to provide a clear answer, building upon Arash's answer.

Primer:

  • Any matrix $A$ can be decomposed with the Singular Value Decomposition (SVD) as $A = U \Sigma V^\top$, where $U$ and $V$ are unitary matrices (orthogonal in the real case). This decomposition is not unique: the singular-value part $\Sigma$ is unique, but the sign of a left singular vector and of its matching right singular vector can be flipped simultaneously. Moreover, when a singular value is zero or repeated, there are many possible choices for the corresponding singular vectors. The following hold:

    • the singular values are the square roots of the eigenvalues of $AA^\top$, or equivalently of $A^\top A$ (resp. $AA^*$ and $A^*A$ for complex matrices)
    • the right singular vectors (columns of $V$) are eigenvectors of $A^\top A$ (resp. $A^*A$)
    • the left singular vectors (columns of $U$) are eigenvectors of $AA^\top$ (resp. $AA^*$)
  • if $A$ is real symmetric then (spectral theorem) it is orthogonally diagonalizable and therefore has an eigendecomposition $A = Q \Lambda Q^{-1} = Q \Lambda Q^\top$ with $Q$ orthogonal (this post shows a complex symmetric matrix that is not diagonalizable). In general this decomposition is not unique either: the eigenvalue part $\Lambda$ is unique up to ordering, but $Q$ is only determined up to column signs, and a repeated eigenvalue leaves the freedom to pick any orthonormal basis of its eigenspace.

  • so, if $A$ is real symmetric

    • its singular values are the absolute values of its eigenvalues (their moduli, if complex).
    • both the right and left singular vectors (columns of $V$ and $U$) are eigenvectors of $A^\top A = AA^\top = A^2 = Q \Lambda^{2} Q^{-1}$. When the eigenvalues of $A$ have distinct absolute values, this forces each of them to be an eigenvector of $A$ itself; being unit vectors, they are then columns of $Q$ up to sign. (These relations are checked numerically in the sketch after this list.)
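
To make this primer concrete, here is a small numerical check (a sketch using NumPy; the random test matrix and the specific `np.linalg` calls are my own illustration, not part of the original argument):

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B + B.T                         # real symmetric (generally indefinite)

U, s, Vt = np.linalg.svd(A)         # A = U @ np.diag(s) @ Vt
eigvals, Q = np.linalg.eigh(A)      # A = Q @ np.diag(eigvals) @ Q.T

# the singular values are the square roots of the eigenvalues of A^T A
assert np.allclose(np.sort(s), np.sort(np.sqrt(np.linalg.eigvalsh(A.T @ A))))

# for a symmetric A they equal the absolute values of the eigenvalues of A
assert np.allclose(np.sort(s), np.sort(np.abs(eigvals)))

# each column u_i of U is an eigenvector of A with eigenvalue +/- s[i],
# hence |A u_i| = s[i] |u_i| entrywise (generic case: distinct |eigenvalues|)
assert np.allclose(np.abs(A @ U), np.abs(U) * s)
print("all checks passed")
```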

Now to translate this in an answer to your question:

  • if $A$ is real symmetric and positive definite (i.e. all of its eigenvalues are strictly positive), $\Sigma$ is a diagonal matrix containing the eigenvalues, and $U=V$ is guaranteed: each right singular vector satisfies $A v_i = \sigma_i v_i$, so $u_i = A v_i / \sigma_i = v_i$.

  • if $A$ is real symmetric and only positive semidefinite (i.e. all of its eigenvalues are nonnegative but some are zero), $\Sigma$ is a diagonal matrix containing the eigenvalues, but there is no guarantee that $U=V$: the columns of $U$ and $V$ corresponding to the zero singular values can each be any orthonormal basis of the null space of $A$, chosen independently on the two sides.

  • if $A$ is real symmetric but not positive semidefinite (i.e. some of its eigenvalues are negative), then $\Sigma$ is a diagonal matrix containing the absolute values of the eigenvalues, and there are two reasons why $U=V$ is not guaranteed. If there is a zero eigenvalue, see the previous bullet point. If there is a negative eigenvalue, the sign taken off that eigenvalue in $\Lambda$ to build the (nonnegative by definition) $\Sigma$ has to end up on either $U$ or $V$. For a concrete example consider a diagonal matrix with at least one negative entry, e.g. $\operatorname{diag}(1,-1) = I \cdot I \cdot \operatorname{diag}(1,-1)^\top$. All three cases are demonstrated in the sketch after this list.
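
Here is a small demonstration of the three bullets (a sketch assuming NumPy's convention `A = U @ np.diag(s) @ Vt` with `V = Vt.T`; the concrete matrices are my own illustrative choices):

```python
import numpy as np

# 1. symmetric positive definite: every SVD has U = V
A = np.array([[2.0, 1.0], [1.0, 2.0]])       # eigenvalues 3 and 1
U, s, Vt = np.linalg.svd(A)
print(np.allclose(U, Vt.T))                  # True

# 2. positive semidefinite with a zero eigenvalue: the null-space
# columns of U and V can be chosen independently, so U = V is not forced
A = np.array([[1.0, 0.0], [0.0, 0.0]])
s = np.array([1.0, 0.0])
U = np.eye(2)
V = np.diag([1.0, -1.0])                     # differs from U only in the null space
print(np.allclose(A, U @ np.diag(s) @ V.T))  # True: a valid SVD with U != V

# 3. symmetric indefinite: no SVD can have U = V
A = np.diag([1.0, -1.0])
U, s, Vt = np.linalg.svd(A)
print(np.allclose(U, Vt.T))                  # False: the sign sits on U or on V
```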

As noted by Arash, in all the statements above you can replace the words "real symmetric" with "normal" (using the moduli of the possibly complex eigenvalues).
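
As a quick check of the normal case (a sketch; the rotation matrix is my own example): a rotation matrix is normal but not symmetric, its eigenvalues $e^{\pm i\theta}$ have modulus 1, and its singular values match those moduli.

```python
import numpy as np

theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
assert np.allclose(R @ R.T, R.T @ R)         # R is normal (but not symmetric)

s = np.linalg.svd(R, compute_uv=False)       # singular values
lam = np.linalg.eigvals(R)                   # eigenvalues exp(+/- i*theta)
print(np.sort(s), np.sort(np.abs(lam)))      # both print [1. 1.]
```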

So to conclude, a sufficient condition for $U=V$ is for $A$ to be normal with strictly positive eigenvalues (in the real case: symmetric positive definite). Is it also necessary? Observe that if $U=V$ then $A = U \Sigma U^\top$ is automatically symmetric positive semidefinite. So although a non-normal matrix can have strictly positive eigenvalues (any triangular matrix with positive diagonal entries does), it can never have an SVD with $U=V$; see the sketch below.
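
A minimal sketch of that last point, with a triangular example of my own choosing:

```python
import numpy as np

A = np.array([[1.0, 1.0], [0.0, 2.0]])       # eigenvalues 1 and 2, both > 0
print(np.allclose(A @ A.T, A.T @ A))         # False: A is not normal

U, s, Vt = np.linalg.svd(A)
print(np.allclose(U, Vt.T))                  # False: U = V would make
                                             # A = U @ diag(s) @ U.T symmetric
```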