Summation notation of singular value decomposition

matrices, notation, summation, svd

In Boyd's optimization book, he explains that the singular value decomposition of a matrix of rank $r$ can be written like this:
$$A \in \mathbb{R}^{m \times n}, \quad A = U\,\text{diag}(\sigma)\,V^T = \sum_{i=1}^r \sigma_i u_i v_i^T,\\
\text{where } u_i \in \mathbb{R}^m \text{ are left singular vectors,}\\
v_i \in \mathbb{R}^n \text{ are right singular vectors.}
$$

I don't understand this, because if $v_i$ are right (row) vectors, then shouldn't $v_i^T$ be left (column) vectors?

So doesn't this notation say we are multiplying a vector in $\mathbb{R}^{m \times 1}$ with a vector in $\mathbb{R}^{n \times 1}$?

I think the $\sigma_i$'s are scalars.

Best Answer

Here $U \in \mathbb{R}^{m \times r}$ and $V \in \mathbb{R}^{n \times r}$; each $v_i$ is a column vector, so each $v_i^T$ is a row vector. \begin{align} U \Sigma V^T = \begin{bmatrix} u_1 & \ldots & u_r\end{bmatrix} \begin{bmatrix} \sigma_1 v_1^T \\ \vdots \\ \sigma_r v_r^T\end{bmatrix} = \sum_{i=1}^r \sigma_i u_i v_i^T. \end{align} Each term $u_i v_i^T$ is the product of an $m \times 1$ column with a $1 \times n$ row, i.e. an $m \times n$ rank-one matrix, so every term in the sum has the same shape as $A$.
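A quick numerical check can make this concrete. The sketch below (using NumPy, with a random $5 \times 3$ matrix standing in for $A$; the shapes and the use of `numpy.linalg.svd` are illustrative choices, not from the book) rebuilds $A$ as the sum of the rank-one outer products $\sigma_i u_i v_i^T$:

```python
import numpy as np

# Illustrative example: a random 5 x 3 matrix stands in for A.
m, n = 5, 3
rng = np.random.default_rng(0)
A = rng.standard_normal((m, n))

# Thin SVD: U is m x r, s holds the singular values, Vt is r x n (r = min(m, n)).
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Rebuild A one rank-one term at a time: each np.outer(u_i, v_i) is an m x n matrix.
A_rebuilt = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(len(s)))

print(np.allclose(A, A_rebuilt))  # True
```

Note that `Vt[i, :]` is already the row vector $v_i^T$, which is exactly the point of the question: the outer product of a column $u_i$ with a row $v_i^T$ gives a full $m \times n$ matrix, not a scalar.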
