SVD of A with orthogonal columns


I am trying to solve the following linear algebra problem:

Suppose that $A \in \mathbb{R}^{m\times n}$ has orthogonal columns $w_1, \dots, w_n$ where $\| w_i\|_2 = \alpha_i > 0$. Find the matrices $U, \Sigma, V$ of the SVD of $A$.

What I could do was the following:

Since $\Sigma$ is an $m \times n$ matrix with $0$ everywhere except in the first $\operatorname{rank}(A)$ places of the diagonal, where $(\Sigma)_{ii} = \sqrt{\lambda_i}$ with $\lambda_i$ an eigenvalue of $A^tA$, I calculated $A^tA$ and got

$$ A^tA = D, $$

where $D$ is the diagonal matrix with $d_{ii} = \alpha_i^2$. So in its $n$ diagonal entries $\Sigma$ has $\sqrt{\alpha_i^2} = \alpha_i$, the norms of the column vectors of $A$.
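
Spelling out why $A^tA$ is diagonal, using that the columns are orthogonal with $\|w_i\|_2 = \alpha_i$:

$$ (A^tA)_{ij} = w_i^t w_j = \begin{cases} \alpha_i^2, & i = j, \\ 0, & i \neq j. \end{cases} $$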

This was all that I could do; I don't know how to compute $U$ and $V$, so I would really appreciate some help with them. Thanks in advance!

Best Answer

Here is one way to do it. First, notice that because $w_1, \dots, w_n$ are orthogonal and nonzero (each $\alpha_i > 0$), they are linearly independent vectors in $\mathbb{R}^m$, which implies $m \geq n$. Thus, we can extend $(w_1 / \alpha_1, \dots, w_n / \alpha_n)$ to an orthonormal basis of $\mathbb{R}^m$, say $(w_1 / \alpha_1, \dots, w_n / \alpha_n, y_1, \dots, y_{m - n})$.
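
If a computational illustration of this extension step helps, here is a minimal sketch assuming NumPy: a full ("complete") QR factorization of the matrix with columns $w_i / \alpha_i$ supplies the extra vectors $y_1, \dots, y_{m-n}$. The example matrix and the norms `alpha` below are made up for illustration, not part of the problem.

```python
import numpy as np

# Minimal sketch: extend the orthonormal vectors w_i / alpha_i to an
# orthonormal basis of R^m via a full ("complete") QR factorization.
m, n = 5, 2
rng = np.random.default_rng(0)

# Fabricate an example A with orthogonal columns of norms alpha_i > 0.
Q0, _ = np.linalg.qr(rng.standard_normal((m, n)))  # orthonormal columns
alpha = np.array([2.0, 0.5])                       # chosen column norms
A = Q0 * alpha                                     # column i is alpha_i * q_i

W_hat = A / alpha                                  # the vectors w_i / alpha_i
U, R = np.linalg.qr(W_hat, mode="complete")        # U is m x m orthogonal
U[:, :n] *= np.sign(np.diag(R))                    # fix signs so U[:, :n] == W_hat

print(np.allclose(U[:, :n], W_hat))                # True
print(np.allclose(U.T @ U, np.eye(m)))             # True: orthonormal basis of R^m
```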

Let $U \in \mathbb{R}^{m \times m}$ be the matrix with these vectors as its columns: $U = \begin{pmatrix} \vert & & \vert & \vert & & \vert \\ w_1 / \alpha_1 & \dots & w_n / \alpha_n & y_1 & \dots & y_{m - n} \\ \vert & & \vert & \vert & & \vert \end{pmatrix}$.

Since the basis is orthonormal, $U$ is orthogonal.

Set $\Sigma \in \mathbb{R}^{m \times n}$ as $\Sigma = \begin{pmatrix} \alpha_1 & \dots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \dots & \alpha_n \\ 0 & \dots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \dots & 0 \end{pmatrix}$.

You can check that $A = U\Sigma$, so if $V = I$ (the $n\times n$ identity matrix) then $(U, \Sigma, V)$ is an SVD of $A$.
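
To spell the check out column by column: for $j = 1, \dots, n$, the $j$-th column of $\Sigma$ is $\alpha_j e_j \in \mathbb{R}^m$, so the $j$-th column of $U\Sigma$ is

$$ U(\alpha_j e_j) = \alpha_j \, \frac{w_j}{\alpha_j} = w_j, $$

which is exactly the $j$-th column of $A$. Since $V = I$, this gives $A = U\Sigma = U\Sigma V^t$, with $U$ and $V$ orthogonal and the diagonal entries $\alpha_i$ of $\Sigma$ positive, as an SVD requires.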
