Compute the $3$rd column of the $U$ matrix in the SVD of $A=\left[\begin{smallmatrix}1&2\\2&2\\2&1\end{smallmatrix}\right]=U\Sigma V^T$


Find the SVD of $$A= \begin{bmatrix}
1 & 2\\
2 & 2 \\
2 & 1
\end{bmatrix}$$
which has the form $A = U \Sigma V^T$.

1. $A^TA=\begin{bmatrix} 9 & 8 \\ 8 & 9 \end{bmatrix}$

2. The eigenvalues are $\lambda_1=17$, $\lambda_2=1$.
   The corresponding eigenvectors are

$ \vec{v_1}=\begin{bmatrix}
1 \\
1
\end{bmatrix}$

and $\vec{v_2}=\begin{bmatrix}1 \\ -1 \end{bmatrix}$

After normalization, $V= \begin{bmatrix} \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} \end{bmatrix}$
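
For a quick numerical cross-check of this step, here is a minimal NumPy sketch (NumPy and the variable names are my own illustration, not part of the original solution; `eigh` may return the eigenvectors with flipped signs, which is harmless for the SVD):

```python
import numpy as np

A = np.array([[1, 2],
              [2, 2],
              [2, 1]], dtype=float)

# Eigen-decomposition of the symmetric matrix A^T A
eigvals, eigvecs = np.linalg.eigh(A.T @ A)
print(eigvals)             # [ 1. 17.]  (eigh returns eigenvalues in ascending order)

# Reorder so the largest eigenvalue comes first, matching lambda_1 = 17
order = np.argsort(eigvals)[::-1]
V = eigvecs[:, order]      # columns are v_1, v_2 (possibly flipped in sign)
print(V)
```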

3. $\sigma_1=\sqrt{17}$ and $\sigma_2=1$, so $\Sigma= \begin{bmatrix} \sqrt{17} & 0 \\ 0 & 1 \\ 0 & 0 \end{bmatrix}$

4. The first two columns of $U$ can be computed as

$u_1=\frac{1}{\sqrt{17}}Av_1=\frac{1}{\sqrt{17}} \frac{1}{\sqrt{2}} \begin{bmatrix}
1 & 2\\
2 & 2 \\
2 & 1
\end{bmatrix} \begin{bmatrix}
1 \\
1 \\
\end{bmatrix}=\frac{1}{\sqrt{34}} \begin{bmatrix}
3 \\
4 \\
3
\end{bmatrix} .$

$u_2 = \frac{1}{1} Av_2= \frac{1}{\sqrt{2}} \begin{bmatrix}
1 & 2 \\
2 & 2 \\
2& 1
\end{bmatrix} \begin{bmatrix}
1 \\
-1 \\
\end{bmatrix} = \frac{1}{\sqrt{2}} \begin{bmatrix}
-1 \\
0 \\
1
\end{bmatrix} $
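
These two columns can also be verified numerically; a small sketch under the same assumptions as above (NumPy, purely illustrative variable names):

```python
import numpy as np

A = np.array([[1, 2],
              [2, 2],
              [2, 1]], dtype=float)

# Right singular vectors and singular values from the steps above
v1 = np.array([1.0, 1.0]) / np.sqrt(2)
v2 = np.array([1.0, -1.0]) / np.sqrt(2)
sigma1, sigma2 = np.sqrt(17), 1.0

u1 = A @ v1 / sigma1    # = [3, 4, 3] / sqrt(34)
u2 = A @ v2 / sigma2    # = [-1, 0, 1] / sqrt(2)
print(u1)               # approx [ 0.5145, 0.6860, 0.5145]
print(u2)               # approx [-0.7071, 0.0,    0.7071]
```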

$U=
\begin{bmatrix}
\frac{3}{\sqrt{34}} & \frac{-1}{\sqrt{2}} & u_3(1) \\
\frac{4}{\sqrt{34}} & 0 & u_3(2) \\
\frac{3}{\sqrt{34}} & \frac{1}{\sqrt{2}} & u_3(3)
\end{bmatrix}$

In order to determine the entries $u_3(i)$, $i \in \{1,2,3\}$, the vector $u_3$ needs to satisfy $u_j^* u_3 = \delta_{j3}$ for $j=1,2,3$.

Such a column vector $u_3$ is $\frac{1}{\sqrt{17}} \begin{bmatrix}
2\\
-3\\
2
\end{bmatrix}$
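
One concrete way to obtain such a $u_3$ is to take a vector orthogonal to both $u_1$ and $u_2$ (in $\mathbb{R}^3$ the cross product gives exactly this) and normalize it. A minimal NumPy sketch, assuming the $u_1$, $u_2$, $\Sigma$, and $V$ computed above:

```python
import numpy as np

A = np.array([[1, 2],
              [2, 2],
              [2, 1]], dtype=float)

u1 = np.array([3.0, 4.0, 3.0]) / np.sqrt(34)
u2 = np.array([-1.0, 0.0, 1.0]) / np.sqrt(2)

# u3 must be orthogonal to u1 and u2 and have unit norm;
# the cross product of u1 and u2 is such a vector (up to normalization).
u3 = np.cross(u1, u2)
u3 /= np.linalg.norm(u3)
print(u3)                                   # = [2, -3, 2] / sqrt(17)

# Check: U is orthogonal and A = U Sigma V^T
U = np.column_stack([u1, u2, u3])
Sigma = np.array([[np.sqrt(17), 0.0],
                  [0.0,         1.0],
                  [0.0,         0.0]])
V = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2)
print(np.allclose(U.T @ U, np.eye(3)))      # True
print(np.allclose(U @ Sigma @ V.T, A))      # True
```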

How was $u_3$ calculated? What exactly is $\delta_{j3}$?

References

Fass, 2006, p. 30.
http://www.math.iit.edu/~fass/477577_Chapter_2.pdf

Best Answer

The thing is that the $U$ matrix is not always unique. In your example there are two possible $U$ matrices, because the condition $u^{*}_j u_3 = \delta_{j3}$ can be satisfied by two vectors. Here $\delta_{j3}$ is the Kronecker delta, which equals $1$ when $j=3$ and $0$ otherwise, so the condition just says that $u_3$ must be orthogonal to $u_1$ and $u_2$ and have unit norm. You can find a vector orthogonal to the plane spanned by $u_1$ and $u_2$ and then normalize it; this gives the first solution $x$. The second solution is $-x$.

In fact, if you tried to decompose a matrix $B \in \mathbb{R}^{3 \times 1}$, you would have infinitely many possible $U$ matrices, because you would need to find two vectors $u_2$, $u_3$ that are orthogonal to $u_1$, orthogonal to each other, and of unit norm. That is a very weak condition. That is why people usually compute a compact SVD of low-rank matrices: it discards the unnecessary parts of $U$, which are not uniquely identifiable and are not needed to restore the original matrix.
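
If it helps, here is a small NumPy sketch of that non-uniqueness (illustrative only; a library SVD may return the third column of $U$ with either sign, and the compact form drops it altogether):

```python
import numpy as np

A = np.array([[1, 2],
              [2, 2],
              [2, 1]], dtype=float)

# Full SVD: U is 3x3, and its third column is only determined up to sign
U_full, s, Vt = np.linalg.svd(A, full_matrices=True)
print(s)                 # approx [4.1231, 1.0] = [sqrt(17), 1]
print(U_full[:, 2])      # approx +/- [2, -3, 2] / sqrt(17)

# Compact (thin) SVD: the ambiguous third column is simply dropped
U_thin, s, Vt = np.linalg.svd(A, full_matrices=False)
print(U_thin.shape)                       # (3, 2)
print(np.allclose((U_thin * s) @ Vt, A))  # True -- u3 is not needed to rebuild A
```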