Bound of $\lVert \mathbf{Ax} \rVert$ with $\mathbf{AA}^\top=\mathbf{I}$

Tags: linear-algebra, matrices, svd, upper-lower-bounds, vectors

The original question (informal):

Consider the matrix $\mathbf{A}\in\mathbb{R}^{M\times N}~(M\le N)$ such that $\mathbf{AA}^\top=\mathbf{I}_M$. I want to determine whether $\lVert \mathbf{Ax} \rVert$ is bounded for every $\mathbf{x}\in\mathbb{R}^N$, and how to prove it.


Some of my efforts:

My guess is that $0\le \lVert \mathbf{Ax} \rVert \le \lVert \mathbf{x} \rVert$ holds. My intuition is that the vector $\mathbf{x}$ is first rotated by an orthonormal matrix (which leaves the norm unchanged), and then its last $(N-M)$ elements are masked by zeros (which keeps or reduces the norm).
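Here is a quick numerical sanity check of this guess (a minimal sketch in NumPy; building $\mathbf{A}$ from the QR factorization of a random matrix is just one convenient way to obtain orthonormal rows, and the sizes $M=3$, $N=5$ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
M, N = 3, 5

# One way to build a matrix A with orthonormal rows: take the (reduced) Q factor
# of a random N x M matrix and transpose it, so that A A^T = I_M.
Q, _ = np.linalg.qr(rng.standard_normal((N, M)))  # Q is N x M with orthonormal columns
A = Q.T                                           # A is M x N with orthonormal rows
assert np.allclose(A @ A.T, np.eye(M))

# Check 0 <= ||Ax|| <= ||x|| on many random vectors x.
for _ in range(1000):
    x = rng.standard_normal(N)
    assert np.linalg.norm(A @ x) <= np.linalg.norm(x) + 1e-12
```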

Here are some derivations and calculations I have done:

  • $\mathbf{A}$ has an SVD of $\mathbf{A}=\mathbf{U\Sigma V}^\top$, where $\mathbf{U}_{M\times M}$ and $\mathbf{V}_{N\times N}$ are orthonormal matrices, and $\mathbf{\Sigma}\in\mathbb{R}^{M\times N}$ contains the singular values $\sigma_i=\mathbf{\Sigma}_{ii}~(i=1,\cdots,M)$.

  • $\mathbf{AA}^\top=(\mathbf{U\Sigma V^\top})(\mathbf{U\Sigma V^\top})^\top=\mathbf{U\Sigma \Sigma^\top U^\top}=\mathbf{I}_M$.

  • Left-multiplying $\mathbf{U\Sigma \Sigma^\top U^\top}=\mathbf{I}_M$ by $\mathbf{U}^{-1}$ gives $\mathbf{\Sigma \Sigma^\top U^\top}=\mathbf{U}^{-1}$; right-multiplying by $\mathbf{U}$ then gives $\mathbf{\Sigma \Sigma^\top U^\top U}=\mathbf{\Sigma \Sigma^\top}=\mathbf{U}^{-1}\mathbf{U}=\mathbf{I}_M$.

Based on the above calculations, it seems that all the singular values $\sigma_i$ should equal one. However, according to this post, there might be a lower bound on the norm of the matrix-vector product $\mathbf{Ax}$, specifically $\lVert \mathbf{Ax} \rVert \ge \sigma_{\min} \lVert \mathbf{x} \rVert$. I am not sure whether $\sigma_{\min}=1$ in my case or whether it is possible that $\sigma_{\min}=0$.
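The claim about the singular values can also be checked numerically (a sketch in NumPy, reusing the same QR-based construction of a row-orthonormal $\mathbf{A}$; the sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
M, N = 3, 5

# A with orthonormal rows, built as before.
Q, _ = np.linalg.qr(rng.standard_normal((N, M)))
A = Q.T

# Since A A^T = U (Sigma Sigma^T) U^T = I_M forces Sigma Sigma^T = I_M,
# all M singular values of A should be (numerically) equal to 1.
sigmas = np.linalg.svd(A, compute_uv=False)
print(sigmas)                    # approximately [1. 1. 1.]
assert np.allclose(sigmas, 1.0)
```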

I am still thinking$\ldots$


Some other related and important questions:

  1. Is it correct to say that all the singular values of $\mathbf{A}$ are equal to one?

  2. Since $\mathbf{A}$ is rectangular with full row rank $M$, the solution space of the equation $\mathbf{Ax}=\mathbf{0}$ should be $(N-M)$-dimensional. (Is this actually the “nullspace” of $\mathbf{A}$? Can we define $\mathcal{N}(\mathbf{A})=\{\mathbf{v}\mid\mathbf{Av}=\mathbf{0}\}$?) Therefore, I guess that the lower bound of $\lVert \mathbf{Ax} \rVert$ should be zero when $M<N$, since there exist nonzero $\mathbf{x}$ satisfying $\mathbf{Ax}=\mathbf{0}$ (see the sketch after this list).

  3. I also want to ask: can we efficiently find a “complementary” matrix $\mathbf{B}\in\mathbb{R}^{(N-M)\times N}$ such that the stacked matrix $\left[ \begin{array}{c} \mathbf{A}\\ \mathbf{B} \end{array} \right]_{N\times N}$ is an orthonormal matrix? Does a fast algorithm for this exist? (See the sketch after this list.)
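Regarding questions 2 and 3, here is one standard construction, sketched in NumPy (the full QR factorization of $\mathbf{A}^\top$ is only one possible choice; an SVD of $\mathbf{A}$ would serve equally well, and the sizes are arbitrary): the last $(N-M)$ columns of the full Q factor of $\mathbf{A}^\top$ span $\mathcal{N}(\mathbf{A})$, and stacking their transposes below $\mathbf{A}$ yields an orthonormal $N\times N$ matrix.

```python
import numpy as np

rng = np.random.default_rng(2)
M, N = 3, 5

# A with orthonormal rows, built as before.
Q, _ = np.linalg.qr(rng.standard_normal((N, M)))
A = Q.T

# Full QR of A^T: the first M columns of Qfull span the row space of A,
# the remaining N - M columns span the nullspace N(A) = {v : Av = 0}.
Qfull, _ = np.linalg.qr(A.T, mode='complete')   # Qfull is N x N orthogonal
B = Qfull[:, M:].T                              # (N-M) x N with orthonormal rows

# Question 2: any vector in the span of those last columns is mapped to zero,
# so the lower bound of ||Ax|| is indeed 0 (attained by nonzero x) when M < N.
v = Qfull[:, M:] @ rng.standard_normal(N - M)
print(np.linalg.norm(A @ v))                    # ~ 0 up to round-off

# Question 3: stacking A on top of B gives an N x N orthonormal matrix.
C = np.vstack([A, B])
assert np.allclose(C @ C.T, np.eye(N))
```

The cost is essentially that of one QR (or SVD) factorization of $\mathbf{A}^\top$, which standard libraries compute efficiently, so in practice this is a fast way to obtain such a complementary $\mathbf{B}$.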

I am still confused and seeking clarification.


Thanks for reading my post!

Best Answer

If $a_i^\top$ is the $i^\text{th}$ row of $A$, then the identity $AA^\top=I$ tells you that $1=a_i^\top a_i=\|a_i\|^2$ and that $a_i^\top a_j=0$ for $i\ne j$. Extend $\{a_1,a_2,\dots,a_M\}$ to an orthonormal basis $\{a_1,a_2,\dots,a_N\}$ of $\mathbb{R}^{N\times1}$ and let $x=\sum_{i=1}^N\xi_ia_i$, so that $a_i^\top x=\xi_i$ by orthonormality. Then
\begin{align}
\|Ax\|^2&=\sum_{i=1}^M\big(a_i^\top x\big)^2\\
&=\sum_{i=1}^M\xi_i^2\\
&\le\sum_{i=1}^N\xi_i^2=\|x\|^2.
\end{align}
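A short numerical illustration of this argument (a sketch in NumPy; the extension of the rows of $A$ to an orthonormal basis is obtained here from a full QR factorization of $A^\top$, which is only one possible choice, and the sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
M, N = 3, 5

# A with orthonormal rows a_1^T, ..., a_M^T.
Q, _ = np.linalg.qr(rng.standard_normal((N, M)))
A = Q.T

# Extend {a_1, ..., a_M} to an orthonormal basis {a_1, ..., a_N} of R^N:
# the last N - M columns of the full Q factor of A^T provide such an extension.
Qfull, _ = np.linalg.qr(A.T, mode='complete')
basis = np.hstack([A.T, Qfull[:, M:]])          # columns are a_1, ..., a_N
assert np.allclose(basis.T @ basis, np.eye(N))

x = rng.standard_normal(N)
xi = basis.T @ x                                # coordinates xi_i = a_i^T x

# ||Ax||^2 equals the sum of the first M squared coordinates,
# which cannot exceed ||x||^2, the sum of all N squared coordinates.
assert np.isclose(np.linalg.norm(A @ x) ** 2, np.sum(xi[:M] ** 2))
assert np.linalg.norm(A @ x) <= np.linalg.norm(x)
```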