How to prove this SVD inequality

linear-algebra, matrices, matrix-decomposition, svd

Suppose that $A \in \mathbb C^{m \times n}$ has the SVD $U\Sigma V^\star$, where
\begin{align}
U &= \begin{bmatrix}u_1 & \cdots & u_m\end{bmatrix} \in \mathbb C^{m \times m} \\
\Sigma &= \begin{bmatrix} \text{diag}(\sigma_1,\dots,\sigma_r) & \mathbf 0 \\ \mathbf 0 & \mathbf 0\end{bmatrix} \in \mathbb C^{m \times n} \\
V &= \begin{bmatrix}v_1 & \cdots & v_n\end{bmatrix} \in \mathbb C^{n \times n}
\end{align}

where $r = \text{rank}(A)$. If $x \in \text{span}\{v_{k+1},\dots,v_n\}$, where $1 \leq k < n$ (interpreting $\sigma_i = 0$ for $i > r$), how can I show that
$$ \| Ax \| \leq \sigma_{k+1}\|x\| $$
where $\| \cdot \|$ is the Euclidean norm?
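For what it's worth, the inequality holds in a quick numerical experiment. Below is a minimal NumPy sketch; the sizes $m$, $n$ and the cutoff $k$ are arbitrary choices of mine, and the 0-based entry `s[k]` corresponds to $\sigma_{k+1}$:

```python
import numpy as np

# Quick numerical check of ||Ax|| <= sigma_{k+1} ||x|| for x in span{v_{k+1},...,v_n}.
rng = np.random.default_rng(0)
m, n, k = 6, 5, 2

A = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
U, s, Vh = np.linalg.svd(A)        # A = U @ diag(s) @ Vh, s sorted in descending order
V = Vh.conj().T

c = rng.standard_normal(n - k) + 1j * rng.standard_normal(n - k)
x = V[:, k:] @ c                   # a random vector in span{v_{k+1}, ..., v_n}

print(np.linalg.norm(A @ x) <= s[k] * np.linalg.norm(x) + 1e-12)   # True
```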


My attempt so far is as follows:
\begin{align}
\| Ax \| &\leq \|A\|\|x\| &&\quad \text{(Consistency of vector and matrix norms)} \\
&= \|U \Sigma V^\star\|\|x\| \\
&= \left\|\sum_{i=1}^r \sigma_i u_i v_i^\star\right\| \|x\| \\
&\leq \|x\| \cdot \sum_{i=1}^r \sigma_i \left\|u_i v_i^\star\right\| &&\quad \text{(Triangle inequality)} \\
&= \|x\| \cdot \sum_{i=1}^r \sigma_i &&\quad \text{($\left\|u_i v_i^\star\right\| = \|u_i\|\|v_i\| = 1$, since $u_i$ and $v_i$ are unit vectors)}
\end{align}

However, I am not sure how to proceed from here.
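In fact, a quick experiment suggests this route cannot succeed: the bound $\sum_{i=1}^r \sigma_i$ does not involve $k$ at all, and it is typically much larger than $\sigma_{k+1}$. (The sizes in the snippet below are arbitrary choices.)

```python
import numpy as np

# The attempt's bound (sum of all singular values) vs. the target sigma_{k+1}:
rng = np.random.default_rng(1)
s = np.linalg.svd(rng.standard_normal((6, 5)), compute_uv=False)
k = 2
print("bound from the attempt:", s.sum())   # at least sigma_1
print("target constant:", s[k])             # sigma_{k+1} (0-based index k), much smaller
```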

Best Answer

Let $r = \text{rank}(A)$. First suppose $r \leq k$. Since $\text{null}(A) = \text{span}\{v_{r+1},\dots,v_n\} \supseteq \text{span}\{v_{k+1},\dots,v_n\}$, any such $x$ lies in the null space of $A$, so $\|Ax\| = 0$. As $\sigma_{k+1}$ and $\|x\|$ are both non-negative, $\sigma_{k+1}\|x\| \geq 0 = \|Ax\|$, as required.

Now consider the more interesting case $r > k$. Since $x \in \text{span}\{v_{k+1},\dots,v_n\}$, there exist $c_{k+1},\dots,c_n \in \mathbb C$ such that
$$ x = c_{k+1}v_{k+1} + \cdots + c_nv_n. $$
Because $v_{r+1},\dots,v_n$ span the null space of $A$, the terms beyond $v_r$ are annihilated by $A$, so
$$ Ax = A(c_{k+1}v_{k+1} + \cdots + c_rv_r) = AV_2c, $$
where
\begin{align}
V_2 &= \begin{bmatrix}v_{k+1} & \cdots & v_r\end{bmatrix} \in \mathbb C^{n \times (r - k)} \\
c &= \begin{bmatrix}c_{k+1} \\ \vdots \\ c_r\end{bmatrix} \in \mathbb C^{r - k}
\end{align}
Note that $A = U\Sigma V^\star$ implies $AV = U\Sigma$; keeping only the first $r$ columns of this identity gives
$$ A\begin{bmatrix} V_1 & V_2\end{bmatrix} = \begin{bmatrix} U_1 & U_2\end{bmatrix} \begin{bmatrix} \Sigma_1 & \mathbf 0 \\ \mathbf 0 & \Sigma_2\end{bmatrix} $$
where $V_2$ is defined as above, and
\begin{align}
V_1 &= \begin{bmatrix}v_1 & \cdots & v_k\end{bmatrix} \in \mathbb C^{n \times k} \\
U_1 &= \begin{bmatrix}u_1 & \cdots & u_k\end{bmatrix} \in \mathbb C^{m \times k} \\
U_2 &= \begin{bmatrix}u_{k+1} & \cdots & u_r\end{bmatrix} \in \mathbb C^{m \times (r - k)} \\
\Sigma_1 &= \text{diag}(\sigma_1,\dots,\sigma_k) \in \mathbb C^{k \times k} \\
\Sigma_2 &= \text{diag}(\sigma_{k+1},\dots,\sigma_r) \in \mathbb C^{(r-k) \times (r-k)}
\end{align}
Therefore,
\begin{align}
AV_1 &= U_1\Sigma_1 \\
AV_2 &= U_2\Sigma_2
\end{align}
Finally, since $U_2$ has orthonormal columns ($U_2^\star U_2 = I$), $\Sigma_2$ is real and diagonal, and $\sigma_{k+1} \geq \cdots \geq \sigma_r$,
\begin{align}
\| Ax \|^2 &= (Ax)^\star Ax \\
&= (AV_2 c)^\star AV_2 c \\
&= (U_2\Sigma_2 c)^\star U_2 \Sigma_2 c \\
&= c^\star \Sigma_2^\star U_2^\star U_2 \Sigma_2 c \\
&= c^\star \Sigma_2^2 c \\
&= |c_{k+1}|^2 \sigma_{k+1}^2 + \cdots + |c_{r}|^2 \sigma_{r}^2 \\
&\leq \sigma_{k+1}^2\left(|c_{k+1}|^2 + \cdots + |c_{r}|^2\right)
\end{align}
Because the $v_i$ are orthonormal,
$$ |c_{k+1}|^2 + \cdots + |c_r|^2 \leq |c_{k+1}|^2 + \cdots + |c_n|^2 = \|x\|^2. $$
Taking the positive square root of both sides, which preserves the inequality since the square root is monotone increasing on $[0, \infty)$, we conclude
$$ \| Ax \| \leq \sqrt{\sigma_{k+1}^2 \|x\|^2} = \sigma_{k+1}\| x \|. $$
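As a sanity check (separate from the proof), the blockwise identity $AV_2 = U_2\Sigma_2$ can be verified numerically. In the NumPy sketch below the sizes and $k$ are arbitrary choices, and the 0-based columns $k,\dots,r-1$ correspond to $v_{k+1},\dots,v_r$:

```python
import numpy as np

# Numerical sanity check of A @ V2 == U2 @ Sigma2 on a random example.
rng = np.random.default_rng(2)
m, n, k = 6, 5, 2

A = rng.standard_normal((m, n))
U, s, Vh = np.linalg.svd(A)              # full SVD: A = U @ diag(s) @ Vh
V = Vh.conj().T
r = np.linalg.matrix_rank(A)             # a random A has full rank r = min(m, n)

U2, V2 = U[:, k:r], V[:, k:r]            # columns k+1, ..., r in 1-based notation
Sigma2 = np.diag(s[k:r])
print(np.allclose(A @ V2, U2 @ Sigma2))  # True
```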
