Note that even when $U == V$, the matrices $U$ and $V$ are not symmetric. The definition of the SVD is $A = USV'$, where the prime denotes the Hermitian (conjugate transpose) of the matrix. So this is probably your mistake: you forgot to apply the transpose, or your math language may have already applied it for you.
In many languages with a built-in SVD (e.g. Matlab, Python, ...), be careful: the SVD function may return $U, S, V'$ (the Hermitian of $V$) rather than $U, S, V$.
For example, in Python (NumPy) you would have

```python
import numpy as np

# full_matrices=False gives the thin SVD, so the shapes conform for rectangular A
U, S, VH = np.linalg.svd(A, full_matrices=False)
```

This produces the same $U$, $S$, $V$ as you have above, except that it returns the Hermitian of $V$. Putting it all back together (in NumPy, `dot` performs matrix multiplication):

```python
B = np.dot(U, np.dot(np.diag(S), VH))
```

yielding `abs(B - A).max() = 9.992e-16`, somewhere around machine epsilon.
Your question provides an opportunity to clear up a common confusion about the singular value decomposition:
$$
\mathbf{A} =
\mathbf{U} \, \Sigma \, \mathbf{V}^{*}
\tag{1}
$$
The specific example will use
$$
\mathbf{A} =
\left[
\begin{array}{rcc}
3 & 1 & 2 \\
-2 & 1 & 3 \\
\end{array}
\right]
\tag{2}
$$
# Computing the SVD
Consider the general matrix $\mathbf{A} \in \mathbb{C}^{m \times n}_{\rho}$.
The steps for computing the SVD are:
- Resolve the eigensystem of a product matrix $\mathbf{W}$
- Compute the remaining component matrix from $(1)$

## Recipes
The choice is whether to first resolve the row space or the column space. The table below compares the two parallel paths. The notation
$$
\lambda \left( \mathbf{W} \right)
$$
represents the eigenvalue spectrum for the matrix $\mathbf{W}$,
while
$$
\tilde{\lambda} \left( \mathbf{W} \right)
$$
represents the ordered eigenvalue spectrum with the $0$ elements removed. There will be $\rho$ nonzero eigenvalues.
$$
\begin{array}{lll}
%
\text{Operation} &
\text{Row space first} & \text{Column space first} \\\hline
%
\text{1. Construct product matrix} &
\mathbf{W} = \mathbf{A}^{*} \mathbf{A} &
\mathbf{W} = \mathbf{A} \, \mathbf{A}^{*} \\
%
\text{2. Solve for eigenvalues} &
\sigma = \sqrt{\tilde{\lambda} \left( \mathbf{W} \right)} &
\sigma = \sqrt{\tilde{\lambda} \left( \mathbf{W} \right)} \\
%
\color{blue}{\text{3. Solve for eigenvectors }} w_{k},\ k=1, \rho &
\left( \mathbf{W} - \lambda_{k} \mathbf{I}_{n} \right) w_{k} = \mathbf{0} &
\left( \mathbf{W} - \lambda_{k} \mathbf{I}_{m} \right) w_{k} = \mathbf{0} \\
%
\text{4. Assemble domain matrix} &
\mathbf{V}_{k} = \frac{w_{k}}{\lVert w_{k} \rVert_{2}} &
\mathbf{U}_{k} = \frac{w_{k}}{\lVert w_{k} \rVert_{2}} \\
%
\text{5. Compute complementary domain matrix} &
\mathbf{U}_{k} = \sigma_{k}^{-1} \mathbf{A} \mathbf{V}_{k} &
\mathbf{V}_{k} = \sigma_{k}^{-1} \mathbf{A}^{*} \mathbf{U}_{k} \\
%
\end{array}
$$
The step highlighted in $\color{blue}{\text{blue}}$ is the step where we must pick a sign (or phase); this is where the ambiguity enters.
The following example computes the SVD by 1) resolving the row space first and then 2) resolving the column space first. The decompositions are expressed in terms of complex phase factors, and so represent the most general case of sign conventions.
Notice that life is easier when we work with the smaller of the two product matrices $\mathbf{A}\mathbf{A}^{*}$ and $\mathbf{A}^{*}\mathbf{A}$; here that is the $2 \times 2$ matrix $\mathbf{A}\mathbf{A}^{*}$.
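Before working the examples by hand, the row-space-first recipe can be sketched numerically. This is a minimal sketch assuming NumPy is available; the variable names are mine, not part of any standard API:

```python
import numpy as np

A = np.array([[3.0, 1.0, 2.0],
              [-2.0, 1.0, 3.0]])

W = A.T @ A                        # step 1: product matrix (A* A; A is real here)
lam, vecs = np.linalg.eigh(W)      # step 3: eigensystem of the symmetric W
order = np.argsort(lam)[::-1]      # order eigenvalues largest first
lam, vecs = lam[order], vecs[:, order]

rho = 2                            # rank of A
sigma = np.sqrt(lam[:rho])         # step 2: singular values from nonzero eigenvalues
V = vecs                           # step 4: domain matrix (columns are unit eigenvectors)
U = A @ V[:, :rho] / sigma         # step 5: complementary domain matrix, U_k = A V_k / sigma_k

Sigma = np.zeros_like(A)           # the 2x3 "sabot" matrix [S | 0]
Sigma[:rho, :rho] = np.diag(sigma)
B = U @ Sigma @ V.T                # reconstruction; equals A for any sign choices
```

The reconstruction holds regardless of the sign (phase) choices `eigh` makes for the eigenvectors, which is precisely the ambiguity discussed below.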
## Example I: Row space first
- Construct product matrix
$$
\mathbf{W} = \mathbf{A}^{*} \, \mathbf{A} =
\left[
\begin{array}{ccc}
13 & 1 & 0 \\
1 & 2 & 5 \\
0 & 5 & 13 \\
\end{array}
\right]
$$
- Solve for eigenvalues
The characteristic polynomial is computed using
$$
p(\lambda) = \det \left( \mathbf{W} - \lambda \mathbf{I}_{3}\right)
=
\det
\left[
\begin{array}{ccc}
13-\lambda & 1 & 0 \\
1 & 2-\lambda & 5 \\
0 & 5 & 13-\lambda \\
\end{array}
\right]
$$
Solve for the determinant by computing the minors:
$$
\left|
\begin{array}{ccc}
13-\lambda & 1 & 0 \\
1 & 2-\lambda & 5 \\
0 & 5 & 13-\lambda \\
\end{array}
\right|
=
%
\boxed{13-\lambda}
\left|
\begin{array}{cc}
2-\lambda & 5 \\
5 & 13-\lambda \\
\end{array}
\right|
%
-\boxed{1}
\left|
\begin{array}{cc}
1 & 5 \\
0 & 13-\lambda \\
\end{array}
\right|
%
+\boxed{0}
\left|
\begin{array}{cc}
1 & 2-\lambda \\
0 & 5 \\
\end{array}
\right|
$$
The characteristic polynomial is
$$
p \left( \lambda \right)
= -\lambda ^3+28 \lambda ^2-195 \lambda
= -\lambda \left( \lambda - 13 \right) \left( \lambda - 15 \right)
$$
The eigenvalue spectrum is
$$
\lambda\left( \mathbf{W} \right) = \left\{ 0, 13, 15
\right\}
$$
The truncated, ordered eigenvalue spectrum,
$$
\tilde{\lambda} \left( \mathbf{W} \right) = \left\{ 15, 13 \right\},
$$
is the foundation for the singular values:
$$
\sigma = \sqrt{\tilde{\lambda}}
$$
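A quick plain-Python check that the eigenvalues above really are the roots of the characteristic polynomial:

```python
# p(l) = -l^3 + 28 l^2 - 195 l = -l (l - 13)(l - 15)
def p(l):
    return -l**3 + 28 * l**2 - 195 * l

roots = [0, 13, 15]                 # the eigenvalue spectrum lambda(W)
values = [p(l) for l in roots]      # each should be 0

# singular values: square roots of the ordered nonzero eigenvalues
sigma = [15 ** 0.5, 13 ** 0.5]
```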
The matrix of singular values,
$$
\mathbf{S} =
\left[
\begin{array}{cc}
\sqrt{15} & 0 \\
0 & \sqrt{13} \\
\end{array}
\right],
$$
is embedded in the sabot matrix:
$$
\Sigma =
%
\left[
\begin{array}{cc}
\mathbf{S} & \mathbf{0}
\end{array}
\right]
%
=
%
\left[
\begin{array}{cc|c}
\sqrt{15} & 0 & 0 \\
0 & \sqrt{13} & 0 \\
\end{array}
\right]
%
$$
- $\color{blue}{\text{Solve for eigenvectors}}$
First eigenvector
$$
\begin{align}
\left( \mathbf{W} - \lambda_{1} \mathbf{I}_{3} \right) w_{1} &= \mathbf{0} \\
%
\left[
\begin{array}{rrr}
-2 & 1 & 0 \\
1 & -13 & 5 \\
0 & 5 & -2 \\
\end{array}
\right]
%
\left[
\begin{array}{rr}
w_{x} \\
w_{y} \\
w_{z} \\
\end{array}
\right]
%
&= %
\left[
\begin{array}{c}
0 \\
0 \\
0 \\
\end{array}
\right]
%
\end{align}
$$
The general solution is
$$
\left[
\begin{array}{c}
w_{x} \\
w_{y} \\
w_{z} \\
\end{array}
\right]
=
e^{i \theta_{1}}
\left[
\begin{array}{c}
1 \\
2 \\
5 \\
\end{array}
\right]
$$
with $0 \le \theta_{k} < 2\pi$; this is the general phase angle.
The normalized vector is the first column vector in the domain matrix
$$
\mathbf{V}_{1} = \frac{e^{i \theta_{1}}}{\sqrt{30}}
\left[
\begin{array}{c}
1 \\
2 \\
5 \\
\end{array}
\right]
$$
Second eigenvector
$$
\begin{align}
\left( \mathbf{W} - \lambda_{2} \mathbf{I}_{3} \right) w_{2} &= \mathbf{0} \\
%
\left[
\begin{array}{crr}
0 & 1 & 0 \\
1 & -11 & 5 \\
0 & 5 & 0 \\
\end{array}
\right]
%
\left[
\begin{array}{rr}
w_{x} \\
w_{y} \\
w_{z} \\
\end{array}
\right]
%
&= %
\left[
\begin{array}{c}
0 \\
0 \\
0 \\
\end{array}
\right]
%
\end{align}
$$
The general solution is
$$
\left[
\begin{array}{c}
w_{x} \\
w_{y} \\
w_{z} \\
\end{array}
\right]
=
e^{i \theta_{2}}
\left[
\begin{array}{r}
-5 \\
0 \\
1 \\
\end{array}
\right]
$$
The normalized vector is the second column vector in the domain matrix
$$
\mathbf{V}_{2} = \frac{e^{i \theta_{2}}}{\sqrt{26}}
\left[
\begin{array}{r}
-5 \\
0 \\
1 \\
\end{array}
\right]
$$
For the purposes of the thin SVD we are done, as we have found the $\rho=2$ eigenvectors. However, we can complete the domain matrix by solving for the eigenvector of the $0$ eigenvalue, which spans the null space.
Third eigenvector
$$
\begin{align}
\left( \mathbf{W} - \lambda_{3} \mathbf{I}_{3} \right) w_{3} &= \mathbf{0} \\
%
\left[
\begin{array}{ccc}
13 & 1 & 0 \\
1 & 2 & 5 \\
0 & 5 & 13 \\
\end{array}
\right]
%
\left[
\begin{array}{rr}
w_{x} \\
w_{y} \\
w_{z} \\
\end{array}
\right]
%
&= %
\left[
\begin{array}{c}
0 \\
0 \\
0 \\
\end{array}
\right]
%
\end{align}
$$
The general solution is
$$
\left[
\begin{array}{c}
w_{x} \\
w_{y} \\
w_{z} \\
\end{array}
\right]
=
e^{i \theta_{3}}
\left[
\begin{array}{r}
1 \\
-13 \\
5 \\
\end{array}
\right]
$$
This normalized vector is the third and final column vector in the domain matrix
$$
\mathbf{V}_{3} = \frac{e^{i \theta_{3}}}{\sqrt{195}}
\color{gray}{\left[
\begin{array}{r}
1 \\
-13 \\
5 \\
\end{array}
\right]}
$$
Null space vectors are shaded in gray.
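The three eigenpairs can be verified directly with plain Python (using the unnormalized vectors, so all arithmetic is exact integer arithmetic):

```python
# Verify W w_k = lambda_k w_k for the three eigenvectors of W = A* A.
W = [[13, 1, 0],
     [1, 2, 5],
     [0, 5, 13]]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

pairs = [(15, [1, 2, 5]),       # eigenvector for sigma_1 = sqrt(15)
         (13, [-5, 0, 1]),      # eigenvector for sigma_2 = sqrt(13)
         (0, [1, -13, 5])]      # null-space eigenvector
checks = [matvec(W, w) == [lam * wi for wi in w] for lam, w in pairs]
```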
- Assemble domain matrix
$$
\mathbf{V} =
\left[
\begin{array}{cc}
% c1
\frac{e^{i \theta_{1}}}{\sqrt{30}}
\left[
\begin{array}{c}
1 \\
2 \\
5 \\
\end{array}
\right]
% c2
\frac{e^{i \theta_{2}}}{\sqrt{26}}
\left[
\begin{array}{r}
-5 \\
0 \\
1 \\
\end{array}
\right]
% c3
\frac{e^{i \theta_{3}}}{\sqrt{195}}
\color{gray}{\left[
\begin{array}{r}
1 \\
-13 \\
5 \\
\end{array}
\right]}
%
\end{array}
\right]
$$
- Compute columns of $\mathbf{U}$
$$
\mathbf{U}_{1} = \sigma^{-1}_{1} \mathbf{A} \mathbf{V}_{1} =
\frac{e^{i \theta_{1}}}
{\sqrt{2}}
\left[
\begin{array}{r}
1 \\
1 \\
\end{array}
\right]
$$
$$
\mathbf{U}_{2} = \sigma^{-1}_{2} \mathbf{A} \mathbf{V}_{2} =
\frac{e^{i \theta_{2}}}
{\sqrt{2}}
\left[
\begin{array}{r}
-1 \\
1 \\
\end{array}
\right]
$$
We're done. The singular value decomposition is
$$
\begin{align}
\mathbf{A} &= \mathbf{U} \, \Sigma \, \mathbf{V}^{*} \\
\\ & =
% U
\underbrace{
\frac{1}{\sqrt{2}}
\left[
\begin{array}{cc}
% c1
e^{i \theta_{1}}
\left[
\begin{array}{c}
1\\
1\\
\end{array}
\right]
% c2
e^{i \theta_{2}}
\left[
\begin{array}{r}
-1\\
1\\
\end{array}
\right]
%
\end{array}
\right]}_{\mathbf{U}}
% Sigma
\underbrace{
\left[
\begin{array}{cc|c}
\sqrt{15} & 0 & 0 \\
0 & \sqrt{13} & 0 \\
\end{array}
\right]}_{\Sigma}
% V*
\underbrace{
\left[
\begin{array}{cc}
% c1
\frac{e^{i \theta_{1}}}
{\sqrt{30}}
\left[
\begin{array}{r}
1 \\
2 \\
5 \\
\end{array}
\right]
% c2
\frac{e^{i \theta_{2}}}
{\sqrt{26}}
\left[
\begin{array}{r}
-5 \\
0 \\
1 \\
\end{array}
\right]
\color{gray}{
\frac{e^{i \theta_{3}}}{\sqrt{195}}
\left[
\begin{array}{r}
1 \\
-13 \\
5 \\
\end{array}
\right]}
%
\end{array}
\right]
^{*}
}_{\mathbf{V}}
% A
\\[3pt] & =
\left[
\begin{array}{rcc}
3 & 1 & 2 \\
-2 & 1 & 3 \\
\end{array}
\right]
%
\end{align}
$$
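One can confirm numerically that the arbitrary phases cancel in the product $\mathbf{U} \, \Sigma \, \mathbf{V}^{*}$. This sketch assumes NumPy and draws random values for $\theta_{1}, \theta_{2}, \theta_{3}$:

```python
import numpy as np

rng = np.random.default_rng(0)
t1, t2, t3 = rng.uniform(0, 2 * np.pi, 3)   # arbitrary phase angles theta_k

# Columns of U and V carry matching phase factors e^{i theta_k}.
U = np.column_stack([np.exp(1j * t1) * np.array([1, 1]) / np.sqrt(2),
                     np.exp(1j * t2) * np.array([-1, 1]) / np.sqrt(2)])
Sigma = np.array([[np.sqrt(15), 0, 0],
                  [0, np.sqrt(13), 0]])
V = np.column_stack([np.exp(1j * t1) * np.array([1, 2, 5]) / np.sqrt(30),
                     np.exp(1j * t2) * np.array([-5, 0, 1]) / np.sqrt(26),
                     np.exp(1j * t3) * np.array([1, -13, 5]) / np.sqrt(195)])

A = np.array([[3, 1, 2], [-2, 1, 3]])
B = U @ Sigma @ V.conj().T    # e^{i theta_k} in U cancels e^{-i theta_k} in V*
```

For every choice of phases, `B` reproduces `A`, and `U` and `V` remain unitary.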
## Example II: Column space first
- Construct product matrix
$$
\mathbf{W} = \mathbf{A} \, \mathbf{A}^{*} =
\left[
\begin{array}{cc}
14 & 1 \\
1 & 14 \\
\end{array}
\right]
$$
- Solve for eigenvalues
The eigenvalues are the roots of the characteristic polynomial
$$
p \left( \lambda \right) = \lambda^{2} - \lambda \, \text{tr }\mathbf{W} + \det \mathbf{W}
$$
The trace and determinant are
$$
\text{tr }\mathbf{W} = 28, \qquad \det \mathbf{W} = 195
$$
The eigenvalue spectrum is
$$
\lambda \left( \mathbf{W} \right) = \tilde{\lambda} \left( \mathbf{W} \right) = \left\{ 15, 13
\right\},
$$
already in a form suitable to compute singular values:
$$
\sigma = \sqrt{\tilde{\lambda}}
$$
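In plain Python, the quadratic formula recovers this spectrum from the trace and determinant:

```python
# Roots of p(lambda) = lambda^2 - lambda tr(W) + det(W):
# lambda = (tr +/- sqrt(tr^2 - 4 det)) / 2
tr, det = 28, 195
disc = (tr ** 2 - 4 * det) ** 0.5          # sqrt(784 - 780) = 2
eigenvalues = [(tr + disc) / 2, (tr - disc) / 2]
```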
The matrix of singular values is
$$
\mathbf{S} =
\left[
\begin{array}{cc}
\sqrt{15} & 0 \\
0 & \sqrt{13} \\
\end{array}
\right],
$$
and is embedded in the sabot matrix like so:
$$
\Sigma =
%
\left[
\begin{array}{cc}
\mathbf{S} & \mathbf{0}
\end{array}
\right]
%
=
%
\left[
\begin{array}{cc|c}
\sqrt{15} & 0 & 0 \\
0 & \sqrt{13} & 0 \\
\end{array}
\right]
%
$$
- $\color{blue}{\text{Solve for eigenvectors}}$
First eigenvector
$$
\begin{align}
\left( \mathbf{W} - \lambda_{1} \mathbf{I}_{2} \right) w_{1} &= \mathbf{0} \\
%
\left[
\begin{array}{rr}
-1 & 1 \\
1 & -1 \\
\end{array}
\right]
%
\left[
\begin{array}{rr}
w_{x} \\
w_{y} \\
\end{array}
\right]
%
&= %
\left[
\begin{array}{rr}
0 \\
0 \\
\end{array}
\right]
%
\end{align}
$$
The general solution is
$$
\left[
\begin{array}{rr}
w_{x} \\
w_{y} \\
\end{array}
\right]
=
e^{i \phi_{1}}
\left[
\begin{array}{rr}
1 \\
1 \\
\end{array}
\right]
$$
with $0 \le \phi_{k} < 2\pi$. This is the juncture where one would pick a sign; instead, we retain the general phase.
The normalized vector is the first column vector in the domain matrix
$$
\mathbf{U}_{1} = \frac{e^{i \phi_{1}}}{\sqrt{2}}
\left[
\begin{array}{rr}
1 \\
1 \\
\end{array}
\right]
$$
Second eigenvector
$$
\begin{align}
\left( \mathbf{W} - \lambda_{2} \mathbf{I}_{2} \right) w_{2} &= \mathbf{0} \\
%
\left[
\begin{array}{cc}
1 & 1 \\
1 & 1 \\
\end{array}
\right]
%
\left[
\begin{array}{rr}
w_{x} \\
w_{y} \\
\end{array}
\right]
%
&= %
\left[
\begin{array}{rr}
0 \\
0 \\
\end{array}
\right]
%
\end{align}
$$
The general solution is
$$
\left[
\begin{array}{rr}
w_{x} \\
w_{y} \\
\end{array}
\right]
=
e^{i \phi_{2}}
\left[
\begin{array}{rr}
-1 \\
1 \\
\end{array}
\right]
$$
The minus sign could go in the top entry as shown or the bottom entry.
The normalized vector is the second column vector in the domain matrix
$$
\mathbf{U}_{2} = \frac{e^{i \phi_{2}}}{\sqrt{2}}
\left[
\begin{array}{r}
-1 \\
1 \\
\end{array}
\right]
$$
- Assemble domain matrix
$$
\mathbf{U} =
\frac{1}{\sqrt{2}}
\left[
\begin{array}{cc}
% c1
e^{i \phi_{1}}
\left[
\begin{array}{c}
1\\
1\\
\end{array}
\right]
% c2
e^{i \phi_{2}}
\left[
\begin{array}{r}
-1\\
1\\
\end{array}
\right]
%
\end{array}
\right]
$$
- Compute columns of $\mathbf{V}$
$$
\mathbf{V}_{1} = \sigma^{-1}_{1} \mathbf{A}^{*} \mathbf{U}_{1} =
\frac{e^{i \phi_{1}}}
{\sqrt{30}}
\left[
\begin{array}{r}
1 \\
2 \\
5 \\
\end{array}
\right]
$$
$$
\mathbf{V}_{2} = \sigma^{-1}_{2} \mathbf{A}^{*} \mathbf{U}_{2} =
\frac{e^{i \phi_{2}}}
{\sqrt{26}}
\left[
\begin{array}{r}
-5 \\
0 \\
1 \\
\end{array}
\right]
$$
The third and final column for $\mathbf{V}$ is in the null space $\mathcal{N}(\mathbf{A})$. One way to compute this vector is to use the cross product
$$
\left[
\begin{array}{c}
1 \\
2 \\
5 \\
\end{array}
\right]
\times
\left[
\begin{array}{r}
-5 \\
0 \\
1 \\
\end{array}
\right]
= 2\left[
\begin{array}{r}
1 \\
-13 \\
5 \\
\end{array}
\right]
$$
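A plain-Python check of the cross product, including the fact that the result lies in $\mathcal{N}(\mathbf{A})$:

```python
def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

n = cross([1, 2, 5], [-5, 0, 1])           # expect 2 * (1, -13, 5)
# A n = 0, i.e. n is orthogonal to both rows of A
in_null_space = dot([3, 1, 2], n) == 0 and dot([-2, 1, 3], n) == 0
```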
The third and final vector is the normalized version
$$
\color{gray}{\mathbf{V}_{3}} =
\color{gray}{
\frac{1}{\sqrt{195}}
\left[
\begin{array}{r}
1 \\
-13 \\
5 \\
\end{array}
\right]}
$$
The lighter shading reminds us that this vector lives in the null space $\mathcal{N}(\mathbf{A})$.
The singular value decomposition is
$$
\begin{align}
\mathbf{A} &=
\mathbf{U} \, \Sigma \, \mathbf{V}^{*}
\\ & =
% U
\underbrace{
\frac{1}{\sqrt{2}}
\left[
\begin{array}{cc}
% c1
e^{i \phi_{1}}
\left[
\begin{array}{c}
1\\
1\\
\end{array}
\right]
% c2
e^{i \phi_{2}}
\left[
\begin{array}{r}
-1\\
1\\
\end{array}
\right]
%
\end{array}
\right]}_{\mathbf{U}}
% Sigma
\underbrace{
\left[
\begin{array}{cc|c}
\sqrt{15} & 0 & 0 \\
0 & \sqrt{13} & 0 \\
\end{array}
\right]}_{\Sigma}
% V*
\underbrace{
\left[
\begin{array}{cc}
% c1
\frac{e^{i \phi_{1}}}
{\sqrt{30}}
\left[
\begin{array}{r}
1 \\
2 \\
5 \\
\end{array}
\right]
% c2
\frac{e^{i \phi_{2}}}
{\sqrt{26}}
\left[
\begin{array}{r}
-5 \\
0 \\
1 \\
\end{array}
\right]
\color{gray}{
\frac{1}{\sqrt{195}}
\left[
\begin{array}{r}
1 \\
-13 \\
5 \\
\end{array}
\right]}
%
\end{array}
\right]
^{*}
}_{\mathbf{V}}
% A
\\[3pt] & =
\left[
\begin{array}{rcc}
3 & 1 & 2 \\
-2 & 1 & 3 \\
\end{array}
\right]
\end{align}
$$
## Error in post
The null space vector is unique up to the general phase.
The vector cited in the question does not work.
For example, when $a=1$
$$
\mathbf{U} \, \Sigma \,
\left[
\begin{array}{cc}
% c1
\frac{e^{i \phi_{1}}}
{\sqrt{30}}
\left[
\begin{array}{r}
1 \\
2 \\
5 \\
\end{array}
\right]
% c2
\frac{e^{i \phi_{2}}}
{\sqrt{26}}
\left[
\begin{array}{r}
-5 \\
0 \\
1 \\
\end{array}
\right]
\color{gray}{
\frac{1}{\sqrt{611}}
\left[
\begin{array}{r}
21 \\
-13 \\
1 \\
\end{array}
\right]}
%
\end{array}
\right]^{*}
=
\left[
\begin{array}{rcc}
3 & 1 & 21 \sqrt{\frac{15}{1222}}+\frac{13}{\sqrt{94}} \\
-2 & 1 & 21 \sqrt{\frac{15}{1222}}-\frac{13}{\sqrt{94}} \\
\end{array}
\right]
\ne \mathbf{A}
$$
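A quick way to see why the cited vector fails: it is orthogonal to the direction of $\mathbf{V}_{1}$ but not to that of $\mathbf{V}_{2}$, so the assembled $\mathbf{V}$ cannot be unitary. A plain-Python check:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

d1 = dot([21, -13, 1], [1, 2, 5])    # vs the V_1 direction: 21 - 26 + 5 = 0
d2 = dot([21, -13, 1], [-5, 0, 1])   # vs the V_2 direction: -105 + 0 + 1 != 0
```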
## Best Answer
One does not need to separately compute the eigenvectors of $A A^T$ and also $A^T A$ in order to get an SVD (even in hand calculations). Given an orthonormal eigenbasis for $A^T A$ (resp. $A A^T$), this gives you the right (resp. left) singular vectors. The eigenvalues give you the singular values upon taking square roots. The defining equation for the SVD tells you
$$Av_i=\sigma_i u_i \\ A^T u_i=\sigma_i v_i.$$
This just follows by matrix multiplication:
$$A v_i=U \Sigma V^T v_i = U \Sigma e_i = U \sigma_i e_i = \sigma_i u_i.$$
As an aside, the above pair of equations characterize the SVD through a symmetric eigenproblem not involving $A A^T$ or $A^T A$, which is a crucial step toward developing a numerically stable algorithm for the SVD.
Anyway, if $\sigma_i \neq 0$, to get $u_i$ (resp. $v_i$) it is enough to apply $A$ (resp. $A^T$) to $v_i$ (resp. $u_i$) and divide by $\sigma_i$. If $\sigma_i$ is zero and you want a full SVD then you have some arbitrary choices to make in order to "fill out" $V$ and/or $U$ (specifically you must select any orthonormal basis for the null space of $A$ and/or $A^T$ and add it to the bona fide singular vectors).
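These defining identities are easy to confirm numerically. A sketch assuming NumPy (note that `np.linalg.svd` returns $V^{T}$, not $V$):

```python
import numpy as np

A = np.array([[3.0, 1.0, 2.0],
              [-2.0, 1.0, 3.0]])
U, s, VT = np.linalg.svd(A)    # s holds the singular values, descending
V = VT.T

# A v_i = sigma_i u_i  and  A^T u_i = sigma_i v_i for each nonzero sigma_i
checks = [np.allclose(A @ V[:, i], s[i] * U[:, i]) and
          np.allclose(A.T @ U[:, i], s[i] * V[:, i])
          for i in range(len(s))]
```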