[Math] Singular Value Decomposition in terms of Change of Basis

linear algebra

I do see how the SVD can be understood in terms of rotations and scalings, but it is hard for me to understand the SVD in terms of a change of basis.

So, let me start from diagonalization to clarify my question. When $A$ is symmetric and $B=\{v_1, \dots, v_n\}$ is an orthonormal basis of eigenvectors, we can write $A=PDP^T$, where the columns of $P$ are the $v_i$ (so $P^{-1}=P^T$). Then $PDP^Tx$ means that $x$ is first expressed in the basis $B$ (that is, $P^Tx = [x]_B$), the scaling $D$ acts on $[x]_B$, and finally $P$ transforms the result back into the original coordinates.
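For concreteness, here is a minimal numpy sketch of that pipeline (the symmetric matrix and the vector below are arbitrary examples I made up for illustration):

```python
import numpy as np

# An arbitrary symmetric matrix, so it has an orthonormal eigenvector basis.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# eigh returns eigenvalues and orthonormal eigenvectors (as columns) for symmetric matrices.
eigvals, P = np.linalg.eigh(A)
D = np.diag(eigvals)

x = np.array([1.0, -2.0])

x_B = P.T @ x      # coordinates of x in the eigenvector basis B
scaled = D @ x_B   # scaling acts in the basis B
y = P @ scaled     # transform back to the original coordinates

print(np.allclose(y, A @ x))   # True: P D P^T x == A x
```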

On the other hand, understanding the SVD in terms of a change of basis is really hard for me. Say $A=U\Sigma V^T$. Then, given $U\Sigma V^Tx$, $V^T$ expresses $x$ in the coordinate system whose basis is the columns of $V$. The scaling $\Sigma$ acts in that coordinate system. But after that, how can $U$ transform the result from the coordinate system governed by $V$ back into the original coordinate system? $U$ is different from $V$!

Thank you!

Best Answer

I posted this to a previous related question:

Let's play with this a bit.

Let $ A \in \mathbb{R}^{m \times n} $. Then we know it has a singular value decomposition $ A = U \Sigma V^T $ where $ U \in \mathbb{R}^{m \times m} $ and $ V \in \mathbb{R}^{n \times n} $ are orthogonal ($ U^T U = U U^T = I $ and $ V^T V = V V^T = I $) and $ \Sigma $ has nonnegative diagonal elements, ordered from largest to smallest.
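To make the setup concrete, here is a small numpy sketch (the matrix below is a random example; nothing about it is special):

```python
import numpy as np

# An arbitrary rectangular example matrix.
A = np.random.default_rng(0).normal(size=(3, 2))
m, n = A.shape
k = min(m, n)

U, s, Vt = np.linalg.svd(A)    # full SVD: U is m x m, Vt is n x n
Sigma = np.zeros((m, n))
Sigma[:k, :k] = np.diag(s)     # singular values on the diagonal, largest first

print(np.allclose(U.T @ U, np.eye(m)))    # U is orthogonal
print(np.allclose(Vt @ Vt.T, np.eye(n)))  # V is orthogonal
print(np.allclose(U @ Sigma @ Vt, A))     # A = U Sigma V^T
```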

Now consider $ y = A x $.

$ x = I x = V V^T x $. So, $ V^T x $ gives you the coefficients of $ x $ when using the columns of $ V $ as the basis.

$ y = I y = U U^T y $. So, $ U^T y $ gives you the coefficients of $ y $ when using the columns of $ U $ as the basis.
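Here is a quick numerical illustration of those two statements (again with an arbitrary example matrix and vector; this is only a sketch):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 2))   # arbitrary example matrix
x = rng.normal(size=2)        # arbitrary vector in the domain
y = A @ x

U, s, Vt = np.linalg.svd(A)
V = Vt.T

# V^T x are the coefficients of x in the basis given by the columns of V:
coeffs_x = V.T @ x
print(np.allclose(x, V @ coeffs_x))   # True: x = V (V^T x)

# U^T y are the coefficients of y in the basis given by the columns of U:
coeffs_y = U.T @ y
print(np.allclose(y, U @ coeffs_y))   # True: y = U (U^T y)
```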

So, $ U U^T y = A V V^T x $. Equivalently, $ (U^T y) = (U^T A V) (V^T x) = \Sigma (V^T x) $.
So, if you view $ x $ in the right orthonormal basis (given by the columns of $ V $) and $ y $ in the right orthonormal basis (given by the columns of $ U $), then the (in general non-square) matrix that relates them becomes diagonal.
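And the punchline, checked numerically (same caveat: just a sketch with random data):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(3, 2))   # arbitrary example matrix
x = rng.normal(size=2)        # arbitrary input vector

U, s, Vt = np.linalg.svd(A)
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)

y = A @ x
# In the V basis for x and the U basis for y, the map is just the diagonal Sigma:
print(np.allclose(U.T @ y, Sigma @ (Vt @ x)))   # U^T y = Sigma (V^T x)
print(np.allclose(U.T @ A @ Vt.T, Sigma))       # U^T A V = Sigma
```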

A truly beautiful result.