Linear Algebra – How to Compute the SVD of 2×2 Matrices

Tags: linear-algebra, matrices, svd

What's an efficient algorithm to get the SVD of $2\times 2$ matrices?

I've found papers about doing SVD on $2\times 2$ triangular matrices, and I've seen the analytic formula to get the singular values of a $2\times 2$ matrix. But how to use either of these to get the SVD of an arbitrary $2\times 2$ matrix?

Are the general algorithms built on these, or are these just some special cases?

Best Answer

There's a ridiculously easy way to leverage the method you already know, starting from your matrix $\mathbf A$. Consider the QR decomposition

$$\mathbf A=\mathbf Q\mathbf R$$

where $\mathbf Q$ is orthogonal and $\mathbf R$ is upper triangular. You say you already know how to take the SVD of a triangular matrix:

$$\mathbf R=\mathbf W\mathbf \Sigma\mathbf V^\top$$

where both $\mathbf W$ and $\mathbf V^\top$ are orthogonal and $\mathbf \Sigma$ is diagonal. You know (or at least are supposed to know) that the product of two orthogonal matrices is again an orthogonal matrix, so letting $\mathbf U=\mathbf Q\mathbf W$, you should be able to obtain your desired singular value decomposition $\mathbf U\mathbf \Sigma\mathbf V^\top$.
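A minimal NumPy sketch of this composition, using `np.linalg.svd` as a stand-in for whatever triangular-SVD formula you already have (the example matrix is arbitrary):

```python
import numpy as np

# An arbitrary 2x2 example matrix.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])

# Step 1: QR decomposition, A = Q R, with Q orthogonal and R upper triangular.
Q, R = np.linalg.qr(A)

# Step 2: SVD of the triangular factor, R = W Sigma V^T.
# (np.linalg.svd stands in here for the triangular-SVD step you already know.)
W, sigma, Vt = np.linalg.svd(R)

# Step 3: fold Q into the left factor: U = Q W, so A = U Sigma V^T.
U = Q @ W

# Sanity checks: the product reconstructs A, and U is orthogonal.
assert np.allclose(U @ np.diag(sigma) @ Vt, A)
assert np.allclose(U.T @ U, np.eye(2))
```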

In general, preprocessing your matrix with a QR decomposition makes the SVD computation easier. For a general $m\times n$ matrix, one can do a "thin QR decomposition" (with column pivoting if needed), where $\mathbf Q$ has the same dimensions as the original matrix and $\mathbf R$ is square and triangular. Take the SVD of $\mathbf R$ and multiply out the orthogonal matrices (and the permutation matrices, if you did pivoting; remember that permutation matrices are also orthogonal!) to obtain the singular value decomposition of your original matrix.
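A sketch of the same idea for a tall rectangular matrix, assuming NumPy's reduced-mode QR (no column pivoting here, for simplicity):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))  # a tall m x n matrix, m >= n

# Thin QR: Q is m x n with orthonormal columns, R is n x n upper triangular.
Q, R = np.linalg.qr(A, mode='reduced')

# SVD of the small square triangular factor: R = W Sigma V^T.
W, sigma, Vt = np.linalg.svd(R)

# Compose the orthogonal factors: A = (Q W) Sigma V^T is a thin SVD of A.
U = Q @ W

assert np.allclose(U @ np.diag(sigma) @ Vt, A)
assert np.allclose(U.T @ U, np.eye(3))
```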


FWIW, in the $2\times 2$ case, you're charmed, since an appropriately constructed Givens rotation matrix serves for both the QR and SVD stages. The details should be in, e.g., Golub and Van Loan's Matrix Computations.
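A minimal pure-Python sketch of the Givens construction for the QR stage: choose the rotation angle so that the $(2,1)$ entry of the product is annihilated (the SVD stage uses the same kind of rotation with a different angle):

```python
from math import hypot

def givens(a, c):
    """Return (cs, sn) such that [[cs, sn], [-sn, cs]] maps (a, c) to (r, 0)."""
    r = hypot(a, c)
    if r == 0.0:
        return 1.0, 0.0  # degenerate column: use the identity rotation
    return a / r, c / r

# Zero out the (2,1) entry of a 2x2 matrix A = [[a, b], [c, d]].
a, b, c, d = 3.0, 1.0, 4.0, 2.0
cs, sn = givens(a, c)

# G = [[cs, sn], [-sn, cs]] is orthogonal, and R = G A is upper triangular,
# so A = G^T R is a QR decomposition.
R = [[cs * a + sn * c, cs * b + sn * d],
     [-sn * a + cs * c, -sn * b + cs * d]]

assert abs(R[1][0]) < 1e-12  # the (2,1) entry has been annihilated
```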