[Math] When is the singular value decomposition equal to the eigenvalue decomposition?

Tags: eigenvalues-eigenvectors, singular-values

I've read in my textbook that the right singular vector $v_i$ is the eigenvector of $A^TA$ with eigenvalue $\sigma_i^2$, and the left singular vector $u_i$ is the eigenvector of $AA^T$. So I guessed that if $A$ is symmetric, I would get $A = VDV^T$, but I found a counterexample with
$$\left[
\begin{matrix}
1&3\\
3&1
\end{matrix}\right]
$$

$$
\left[
\begin{matrix}
1&3\\
3&1
\end{matrix}\right] =
\left[
\begin{matrix}
-0.7071&-0.7071\\
-0.7071&0.7071
\end{matrix}\right]
\left[
\begin{matrix}
4&0\\
0&2
\end{matrix}\right]
\left[
\begin{matrix}
-0.7071&-0.7071\\
0.7071&-0.7071
\end{matrix}\right]
$$

I got this result from MATLAB, but it does not have $U = V$, i.e. $A = VDV^T$.
I have no idea why this happens. Must the matrix $A$ be positive semidefinite? Or are there many versions of the SVD? Note that the matrix I am discussing is special here, since its singular values are distinct. I think I may be missing some condition that makes the statement true; please help me find it!
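As a sanity check, here is a pure-Python sketch (not the MATLAB output; `matmul` and `transpose` are ad-hoc helpers for this exact $2\times2$ case) that builds both the eigendecomposition and an SVD of $A=\left[\begin{smallmatrix}1&3\\3&1\end{smallmatrix}\right]$ from its exact eigenpairs $\lambda_1=4$, $\lambda_2=-2$:

```python
import math

# Exact eigenpairs of A = [[1, 3], [3, 1]]: lambda = (4, -2), with
# orthonormal eigenvectors q1 = [1, 1]/sqrt(2), q2 = [1, -1]/sqrt(2).
s = 1 / math.sqrt(2)
Q = [[s, s], [s, -s]]          # columns are q1, q2
lam = [4.0, -2.0]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(X):
    return [[X[j][i] for j in range(2)] for i in range(2)]

# Eigendecomposition A = Q diag(lam) Q^T -- note the negative eigenvalue.
Lam = [[lam[0], 0.0], [0.0, lam[1]]]
A_eig = matmul(matmul(Q, Lam), transpose(Q))

# SVD built from the eigendecomposition: sigma_j = |lambda_j|,
# u_j = q_j, v_j = sign(lambda_j) * q_j.  The sign flip on v_2
# (because lambda_2 < 0) is exactly why U != V here.
Sigma = [[abs(lam[0]), 0.0], [0.0, abs(lam[1])]]
U = Q
V = [[Q[i][j] * (1.0 if lam[j] >= 0 else -1.0) for j in range(2)]
     for i in range(2)]
A_svd = matmul(matmul(U, Sigma), transpose(V))

print(A_eig)  # reproduces [[1, 3], [3, 1]]
print(A_svd)  # also reproduces [[1, 3], [3, 1]]
```

Both products reconstruct $A$; the two factorizations differ only in the sign of the second right singular vector.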

Best Answer

If $A$ is symmetric with an orthonormal eigendecomposition $A=U\Lambda U^T$, then an SVD of $A$ is $A=U\Sigma V^T,$ where $\sigma_j=|\lambda_j|$ and $v_j=\operatorname{sign}(\lambda_j)\, u_j$, with the convention $\operatorname{sign}(0)=1$ (after permuting the eigenpairs so that the $|\lambda_j|$ are in decreasing order). You can verify that this satisfies the following definition of the SVD:
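One way to see the construction in a single line (using that $\Lambda = |\Lambda|\operatorname{sign}(\Lambda)$ and that $\operatorname{sign}(\Lambda)$ is diagonal, hence symmetric, with $\operatorname{sign}(\Lambda)^2 = I$):

$$
A = U\Lambda U^T
  = U\,\underbrace{|\Lambda|}_{\Sigma}\,\underbrace{\operatorname{sign}(\Lambda)\,U^T}_{V^T},
\qquad V = U\operatorname{sign}(\Lambda),
$$

and $V$ is indeed orthogonal, since $V^TV = \operatorname{sign}(\Lambda)\,U^TU\,\operatorname{sign}(\Lambda) = \operatorname{sign}(\Lambda)^2 = I$. In your example $\lambda = (4,-2)$, so $\sigma = (4,2)$ and $v_2 = -u_2$, which is exactly why MATLAB returns $U \neq V$.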

Let $A$ be $m\times n$, with $m\geq n.$ Then, we can write $A=U\Sigma V^T,$ where $U$ is $m\times n$ and satisfies $U^TU=I,$ $V$ is $n\times n$ and satisfies $V^TV=VV^T=I$ and $\Sigma=\text{diag} (\sigma_j)$, where $\sigma_1\geq\cdots\geq\sigma_n\geq 0.$

Note that this is actually not a definition but a theorem (that such a decomposition exists).
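The conditions in the statement above can be checked numerically for the factors MATLAB reports for this example (a pure-Python sketch; `matmul`, `transpose`, and `close` are ad-hoc helpers, not library calls):

```python
import math

s = 1 / math.sqrt(2)
# Factors reported by MATLAB's svd([1 3; 3 1]) (entries written exactly):
U = [[-s, -s], [-s, s]]
Sigma = [[4.0, 0.0], [0.0, 2.0]]
V = [[-s, s], [-s, -s]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(X):
    return [[X[j][i] for j in range(2)] for i in range(2)]

def close(X, Y, tol=1e-12):
    return all(abs(X[i][j] - Y[i][j]) < tol for i in range(2) for j in range(2))

I2 = [[1.0, 0.0], [0.0, 1.0]]

print(close(matmul(transpose(U), U), I2))   # U^T U = I
print(close(matmul(transpose(V), V), I2))   # V^T V = I
print(close(matmul(V, transpose(V)), I2))   # V V^T = I
print(Sigma[0][0] >= Sigma[1][1] >= 0)      # decreasing, nonnegative sigma
print(close(matmul(matmul(U, Sigma), transpose(V)),
            [[1.0, 3.0], [3.0, 1.0]]))      # U Sigma V^T = A
```

All five checks pass, so the factors satisfy the theorem even though $U \neq V$.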