Finding good bases to represent any rectangular matrix as a block matrix with identity submatrix

diagonalization, linear-algebra, linear-transformations, matrices, matrix-decomposition

This question is a generalization of Finding bases such that the matrix representation is a block matrix where one submatrix is the identity matrix.

Question

For any linear map $L:\mathbb{R}^n \to \mathbb{R}^m$ where $n\neq m$,
given its matrix representation $[L]^{\mathcal{E}_n}_{\mathcal{E}_m}$, say $\begin{pmatrix}a_{1,1} & \dots & a_{1,n} \\ \vdots & \ddots & \vdots \\ a_{m,1} & \dots & a_{m,n}\end{pmatrix}$, with respect to the standard basis $\mathcal{E}_n$ of $\mathbb{R}^n$ and $\mathcal{E}_m$ of $\mathbb{R}^m$,
must we be able to find a basis $\alpha$ for $\mathbb{R}^n$ and a basis $\beta$ for $\mathbb{R}^m$ such that
$[L]^{\alpha}_{\beta} = \begin{pmatrix}\mathbf{I}_{r} & \mathbf{O} \\ \mathbf{O}& \mathbf{O} \end{pmatrix}$,
where $\mathbf{I}_{r}$ is the $r\times r$ identity matrix with $r=\text{Rank}(L)$, and the $\mathbf{O}$'s are zero matrices of the appropriate sizes?
If yes, what are the systematic ways (if any) to find such bases?

Thoughts

My professor casually said that it is true and left it as an exercise, with hints along the lines of "do row / column operations to get the change of basis matrices".
It was used in subsequent proofs in the class, so presumably it really is true.

The closest results I know of / can find (which are more sophisticated than "just" row / column operations) are

  1. Diagonalization, which works only for $n=m$ (and only for diagonalizable matrices), with eigenvalues as the diagonal entries, and
  2. Singular Value Decomposition, which allows $n\neq m$ but gives only $\begin{pmatrix}\mathbf{D} & \mathbf{O} \\ \mathbf{O}& \mathbf{O} \end{pmatrix}$, where $\mathbf{D}$ is a diagonal matrix of singular values.

$\begin{pmatrix}\mathbf{I}_{r} & \mathbf{O} \\ \mathbf{O}& \mathbf{O} \end{pmatrix}$ sounds too good to be true… (but I am still a beginner in linear algebra.)
I wonder whether some additional conditions are needed?

I also tried a bunch of keywords on Google but could not find anything.
(Is there a name for "a block matrix with an identity submatrix"?)
I apologize if my question is not phrased in the standard way.
I would appreciate any pointers.

Thank you in advance.

Best Answer

  1. Gaussian elimination.

Each Gaussian 'row move' can be represented by an elementary matrix multiplying on the left; similarly, each 'column move' is an elementary matrix multiplying on the right. Hence applying Gaussian row and column operations amounts to $$E_rAE_c = \begin{pmatrix}I&O\\O&O\end{pmatrix}=:I'$$ where $E_r=E_k\cdots E_1$ is the product of the elementary matrices of the row operations applied to $A$ (the first operation applied sits rightmost), and $E_c$ is the analogous product for the column operations. Taking inverses gives $A=E_r^{-1}I'E_c^{-1}$, as required: the columns of $E_c$ form the basis $\alpha$ and the columns of $E_r^{-1}$ form the basis $\beta$. (The matrix $I'$ is sometimes called the rank normal form of $A$.)
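For concreteness, here is a minimal numerical sketch of this procedure (the function name `rank_normal_form` and the NumPy implementation are my own illustration, not part of the original answer): it row-reduces $A$ to reduced row echelon form while accumulating $E_r$, then column-reduces while accumulating $E_c$.

```python
import numpy as np

def rank_normal_form(A, tol=1e-12):
    """Reduce A to I' = [[I_r, 0], [0, 0]] by row and column operations.

    Returns (Er, Ec, r) with Er @ A @ Ec equal to I'.  The columns of Ec
    form the basis alpha; the columns of np.linalg.inv(Er) form beta.
    """
    m, n = A.shape
    B = A.astype(float).copy()
    Er, Ec = np.eye(m), np.eye(n)

    # Row phase: bring B to reduced row echelon form, mirroring every
    # operation on Er, so the invariant B == Er @ A holds throughout.
    r = 0
    for j in range(n):
        if r == m:
            break
        p = r + int(np.argmax(np.abs(B[r:, j])))        # partial pivoting
        if abs(B[p, j]) < tol:
            continue                                    # no pivot in this column
        B[[r, p]] = B[[p, r]]; Er[[r, p]] = Er[[p, r]]  # swap rows r and p
        Er[r] /= B[r, j]; B[r] /= B[r, j]               # scale pivot to 1
        for i in range(m):
            if i != r:
                f = B[i, j]
                B[i] -= f * B[r]; Er[i] -= f * Er[r]    # clear the rest of column j
        r += 1

    # Column phase: swap each pivot column into place, then clear the rest
    # of its row, mirroring every operation on Ec, so B == Er @ A @ Ec.
    for i in range(r):
        j = int(np.argmax(np.abs(B[i]) > tol))          # pivot column of row i
        B[:, [i, j]] = B[:, [j, i]]; Ec[:, [i, j]] = Ec[:, [j, i]]
        for k in range(i + 1, n):
            f = B[i, k]
            B[:, k] -= f * B[:, i]; Ec[:, k] -= f * Ec[:, i]

    return Er, Ec, r

A = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [1., 0., 1.]])
Er, Ec, r = rank_normal_form(A)
print(r)                             # 2
print(np.round(Er @ A @ Ec, 12))     # [[1 0 0], [0 1 0], [0 0 0]]
alpha, beta = Ec, np.linalg.inv(Er)  # the sought bases, as columns
```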

  2. Singular value decomposition.

If an SVD of $A$ is known, that is, $A=UDV^\top$ (with $U,V$ square and $D$ of size $m\times n$), then write $$D=\begin{pmatrix}P&O\\O&O\end{pmatrix}=\begin{pmatrix}R&O\\O&I\end{pmatrix}\begin{pmatrix}I&O\\O&O\end{pmatrix}\begin{pmatrix}R&O\\O&I\end{pmatrix}=R'I'R'$$ where $P$ is the diagonal matrix of the strictly positive singular values $\sigma>0$ and $R$ is the diagonal matrix of their square roots $\sqrt{\sigma}$; the two outer factors $R'$ have sizes $m\times m$ and $n\times n$ respectively. Then $$A=(UR')\,I'\,(VR')^\top.$$
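The same construction in code (again a sketch of my own, assuming NumPy's full SVD `np.linalg.svd`): the square roots of the singular values are absorbed into the two singular-vector matrices, exactly as in the factorization $D=R'I'R'$ above. Note that $UR'$ is in general no longer orthogonal; that is the price of replacing $\mathbf{D}$ by $\mathbf{I}_r$.

```python
import numpy as np

def bases_via_svd(A, tol=1e-12):
    """Factor A = B @ Ip @ Cinv with Ip = [[I_r, 0], [0, 0]], via SVD.

    Columns of B give the basis beta; columns of np.linalg.inv(Cinv)
    give the basis alpha.
    """
    m, n = A.shape
    U, s, Vt = np.linalg.svd(A)                   # full SVD: U is m x m, Vt is n x n
    r = int(np.sum(s > tol))                      # numerical rank
    root = np.sqrt(s[:r])
    Rm = np.eye(m); Rm[:r, :r] = np.diag(root)    # left factor R'  (m x m)
    Rn = np.eye(n); Rn[:r, :r] = np.diag(root)    # right factor R' (n x n)
    return U @ Rm, Rn @ Vt, r                     # B = U R',  Cinv = (V R')^T

A = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [1., 0., 1.]])
B, Cinv, r = bases_via_svd(A)
Ip = np.zeros(A.shape); Ip[:r, :r] = np.eye(r)
assert np.allclose(B @ Ip @ Cinv, A)              # A = B I' C^{-1}
```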

Between the two, however, the first requires far fewer steps than SVD. Using SVD to find the called-for bases is like using a golden hammer on a rusty nail.
