Finding the SVD of a particular rank-$1$, $4 \times 3$ matrix

abstract-algebra, linear-algebra, matrices, numerical-methods, svd

I am preparing for a Numerical Analysis exam, and one of the practice questions is the following:

Find both a reduced and full SVD of
$$A = \begin{bmatrix}
1 & 0 & 4\\
1 & 0 & 4\\
1 & 0 & 4\\
1 & 0 & 4\\
\end{bmatrix}$$

I would only have about 10-15 minutes to work on this problem. I have a partial solution below, but I am wondering if there is any faster way to do it, since my method seems long and unreliable (after finding a reduced SVD, I found the full SVD by trial and error).

My Solution: $A$ has rank $1$, so a reduced SVD is of the form
$$\begin{bmatrix}
1 & 0 & 4\\
1 & 0 & 4\\
1 & 0 & 4\\
1 & 0 & 4\\
\end{bmatrix} = \begin{bmatrix}u_1 \\u_2\\u_3\\u_4 \end{bmatrix} [\sigma] \begin{bmatrix}\overline{v_1} & \overline{v_2} & \overline{v_3} \end{bmatrix} = u \sigma v^*.$$

We know that $\{ u \}$ is a basis for $Col(A) = \langle \begin{bmatrix}1 & 1 & 1 & 1 \end{bmatrix}^T\rangle$, so $u$ is of the form $u = \begin{bmatrix}x & x & x & x \end{bmatrix}^T$. Since $||u||_2 = 1$, we must have $u = \pm \begin{bmatrix}\frac 12 & \frac 12 & \frac 12 & \frac 12 \end{bmatrix}^T$. We will choose the positive sign.

Similarly, $\{v \}$ is a basis for $Col(A^*) = \langle \begin{bmatrix} 1 & 0 & 4 \end{bmatrix}^T \rangle$, and using $||v||_2 = 1$, we must have $v = \pm \begin{bmatrix} \frac{1}{\sqrt{17}} & 0 & \frac{4}{\sqrt{17}} \end{bmatrix}^T$. We will choose the positive sign. Finally, to find $\sigma$, look at the big SVD factorization of $A$ above. Equating entries $(1, 1)$ on both sides gives $\frac 12 \sigma \frac {1}{\sqrt{17}} = 1$, which gives $\sigma = 2\sqrt{17}.$ Therefore a reduced SVD of $A$ is
$$A = \begin{bmatrix} 1/2 \\ 1/2\\ 1/2\\ 1/2 \end{bmatrix} [2\sqrt{17}] \begin{bmatrix}\frac {1}{\sqrt{17}} & 0 & \frac {4}{\sqrt{17}} \end{bmatrix} $$
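As a quick sanity check (not something you would do in the exam), the reduced factorization can be verified numerically; the snippet below assumes NumPy and simply rebuilds $A$ from $u$, $\sigma = 2\sqrt{17}$, and $v$:

```python
import numpy as np

# Rank-1 reduced SVD check: A should equal u * sigma * v^T.
A = np.tile([1.0, 0.0, 4.0], (4, 1))           # the 4x3 matrix from the problem

u = np.full((4, 1), 0.5)                        # unit basis vector of Col(A)
v = np.array([[1.0, 0.0, 4.0]]) / np.sqrt(17)   # unit basis (row) vector of Row(A)
sigma = 2 * np.sqrt(17)                         # the single nonzero singular value

# (4,1) * scalar * (1,3) broadcasts to the 4x3 outer product
assert np.allclose(u * sigma * v, A)
```

Note that $\sigma = 2\sqrt{17}$ also equals $\|A\|_F$, since a rank-$1$ matrix has exactly one nonzero singular value.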

To find a full SVD, we just have to extend $u$ and $v$ to orthonormal bases of $\Bbb R^4$ and $\Bbb R^3$, respectively. I did this by basically guessing. An orthonormal extension of $u$ is

$$\frac 12 \begin{bmatrix} 1 & -1 & -1 & 1\\
1 & -1 & 1 & -1\\
1 & 1 & -1 & -1\\
1 & 1 & 1 & 1
\end{bmatrix}$$

and an orthonormal extension of $v$ is
$$\begin{bmatrix}
1/\sqrt{17} & 0 & 4/\sqrt{17}\\
0 & 1 & 0\\
4/\sqrt{17} & 0 & -1/\sqrt{17}
\end{bmatrix}$$

Therefore, a full SVD of $A$ is
$$A = \begin{bmatrix} \frac 12 & -\frac 12 & -\frac 12 & \frac 12\\
\frac 12 & -\frac 12 & \frac 12 & -\frac 12 \\
\frac 12 & \frac 12 & -\frac 12 & -\frac 12 \\
\frac 12 & \frac 12 & \frac 12 & \frac 12
\end{bmatrix}\begin{bmatrix}2 \sqrt {17} & 0 & 0\\
0 & 0 & 0\\
0 & 0 & 0\\
0 & 0 & 0 \end{bmatrix}
\begin{bmatrix}
1/\sqrt{17} & 0 & 4/\sqrt{17}\\
0 & 1 & 0\\
4/\sqrt{17} & 0 & -1/\sqrt{17}
\end{bmatrix}
$$
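If you want to double-check the full factorization as well (again assuming NumPy; the matrices below are the derived factors written out numerically):

```python
import numpy as np

A = np.tile([1.0, 0.0, 4.0], (4, 1))

# Full SVD factors: orthogonal U, rectangular "diagonal" Sigma, orthogonal V^T.
U = 0.5 * np.array([[1, -1, -1, 1],
                    [1, -1, 1, -1],
                    [1, 1, -1, -1],
                    [1, 1, 1, 1]], dtype=float)
S = np.zeros((4, 3))
S[0, 0] = 2 * np.sqrt(17)
Vt = np.array([[1.0, 0.0, 4.0],
               [0.0, np.sqrt(17), 0.0],
               [4.0, 0.0, -1.0]]) / np.sqrt(17)

assert np.allclose(U @ U.T, np.eye(4))   # U is orthogonal
assert np.allclose(Vt @ Vt.T, np.eye(3)) # V is orthogonal
assert np.allclose(U @ S @ Vt, A)        # the product reconstructs A
```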

Best Answer

Your method seems fairly quick. The only thing that I would consider optimizing is the extension to an orthonormal basis.

First of all, an approach to be considered is writing something like

To find the full SVD, it suffices to extend $u$ and $v$ to orthonormal bases $(u,u_2,u_3,u_4)$ and $(v,v_2,v_3)$. With that, we can see that $A$ has full SVD $A = U \Sigma V^T$, with $$ U = \pmatrix{u & u_2 & u_3 & u_4}, \quad \Sigma = \pmatrix{2\sqrt{17}&0&0\\ 0&0&0\\ 0&0 &0\\ 0&0&0}, \quad V = \pmatrix{v&v_2&v_3} $$

and simply giving this as the "full answer". I would suspect that this would get you most of the partial credit.

That said, if you must compute an orthonormal basis, then the standard approach is the Gram-Schmidt process. For instance, we can get the extension to $U$ by applying the procedure to the list $(u,e_1,e_2,e_3)$, where $e_1,\dots,e_4$ denote the standard basis vectors, i.e. the columns of the size-$4$ identity matrix.
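If it helps to see the procedure concretely, here is a minimal Gram-Schmidt sketch in NumPy (the helper name `extend_to_orthonormal` is my own, not a library routine):

```python
import numpy as np

def extend_to_orthonormal(u, extras):
    """Gram-Schmidt on (u, *extras), discarding near-zero remainders."""
    basis = [u / np.linalg.norm(u)]
    for w in extras:
        for q in basis:
            w = w - (q @ w) * q              # subtract projection onto q
        if np.linalg.norm(w) > 1e-10:        # keep only independent directions
            basis.append(w / np.linalg.norm(w))
    return np.column_stack(basis)

u = np.full(4, 0.5)
e = np.eye(4)
U = extend_to_orthonormal(u, [e[0], e[1], e[2]])

assert np.allclose(U.T @ U, np.eye(4))  # orthonormal columns
assert np.allclose(U[:, 0], u)          # first column is still u
```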

However, we can indeed take advantage of the fact that $u$ is a vector with all-equal entries. In particular, we can always extend such a vector to an orthonormal basis by using the columns of the (normalized) DFT matrix, whose first column is constant. Alternatively, since the dimension of the space is a power of $2$, we could do the same with the columns of a scaled Hadamard matrix, which has the advantage of being real.
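For instance, the $4 \times 4$ Hadamard matrix can be built with a Kronecker product (Sylvester's construction), and scaling it by $1/2$ gives an orthogonal matrix whose first column is exactly $u$:

```python
import numpy as np

H2 = np.array([[1.0, 1.0],
               [1.0, -1.0]])
H4 = np.kron(H2, H2)   # Sylvester construction of a 4x4 Hadamard matrix
U = H4 / 2.0           # scaling by 1/sqrt(4) makes the columns orthonormal

assert np.allclose(U @ U.T, np.eye(4))  # orthogonal
assert np.allclose(U[:, 0], 0.5)        # first column is u = (1/2, ..., 1/2)^T
```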

We can also extend $v$ to such a basis in a nice fashion using the fact that $v \in \Bbb R^3$. In particular, for $v_2$ we can select any unit vector orthogonal to $v$, then take the cross product $v \times v_2$ as the third vector. However, I prefer your method, which simply takes advantage of the zero entry in $v$.
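A small sketch of that cross-product construction (assuming NumPy):

```python
import numpy as np

v = np.array([1.0, 0.0, 4.0]) / np.sqrt(17)  # the unit right singular vector

v2 = np.array([0.0, 1.0, 0.0])  # any unit vector orthogonal to v
v3 = np.cross(v, v2)            # orthogonal to both, and automatically unit length
                                # since v and v2 are orthonormal

V = np.column_stack([v, v2, v3])
assert np.allclose(V.T @ V, np.eye(3))  # orthonormal basis of R^3
```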