Construction of QR decomposition for a singular matrix

Tags: matrices, matrix-decomposition, unitary-matrices

Consider a matrix $A \in \mathbb{C}^{n \times n}$ and suppose it is non-singular. Then its columns $a_1, a_2, \dots, a_n$ form a basis of $\mathbb{C}^n$.

Let's apply the Gram–Schmidt process to them. We obtain an orthonormal basis $q_1, q_2, \dots, q_n$. The transition matrix $S$ from the basis $a$ to the basis $q$ is upper-triangular (by construction).
Now consider the matrix $Q=\big[ q_1 \,|\, q_2 \,|\, \dots \,|\, q_n \big]$. Its columns are orthonormal $\implies$ $Q$ is unitary.

And now we know that the matrix $A$ can be represented as:
$$
AS=Q \\
A=QS^{-1}
$$

$S$ is invertible since it is a transition matrix. $S$ is upper-triangular $\implies$ $S^{-1}$ is upper-triangular as well.

I.e., we have shown that every non-singular matrix $A$ can be represented as a product of a unitary matrix and an upper-triangular matrix.
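For concreteness, here is a minimal numpy sketch of this construction (classical Gram–Schmidt, computing the upper-triangular factor $R = S^{-1}$ directly; the function name and example matrix are my own):

```python
import numpy as np

def gram_schmidt_qr(A):
    """Classical Gram-Schmidt on the columns of an invertible A.

    Returns Q with orthonormal columns and an upper-triangular R
    such that A = Q R (here R plays the role of S^{-1} above).
    """
    n = A.shape[1]
    Q = np.zeros(A.shape, dtype=complex)
    R = np.zeros((n, n), dtype=complex)
    for j in range(n):
        v = A[:, j].astype(complex)
        for i in range(j):
            # np.vdot conjugates its first argument: <q_i, a_j>
            R[i, j] = np.vdot(Q[:, i], A[:, j])
            v = v - R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(v)   # nonzero, since A is invertible
        Q[:, j] = v / R[j, j]
    return Q, R

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
Q, R = gram_schmidt_qr(A)
assert np.allclose(Q @ R, A)                    # A = QR
assert np.allclose(Q.conj().T @ Q, np.eye(2))   # Q is unitary
assert np.allclose(R, np.triu(R))               # R is upper-triangular
```

Note that $R_{jj} = \|v\| > 0$ at every step precisely because the columns of $A$ are linearly independent; this is the step that breaks down for singular $A$.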

But what if $A$ is singular? I've tried the following:

Let's select a basis of the column space of $A$: $a_1, \dots, a_r$ (these are columns of $A$). Also, let's reorder the columns of $A$ so that these linearly independent vectors $a_1, \dots, a_r$ are the first $r$ columns of the matrix. So we have performed a permutation of the columns of $A$; this operation can be represented by right-multiplication of $A$ by some permutation matrix $P$: $A \rightarrow AP$.

Now we apply Gram–Schmidt to this basis of the column space (the vectors $a_1,\dots,a_r$), and then extend the resulting orthonormal system to an ONB of $\mathbb{C}^n$:

$$q_1,q_2, …, q_n$$

We have constructed a matrix $Q=\big[ q_1 \,|\, q_2 \,|\, \dots \,|\, q_n \big]$ that is unitary (since its columns form an ONB).
The matrix $S$ such that $AP=QS$ is upper-triangular. Moreover, it has the form:
$$
\begin{bmatrix}
s_{11} & s_{12} & s_{13} & \cdots & s_{1r} & \cdots & s_{1n} \\
0 & s_{22} & s_{23} & \cdots & s_{2r} & \cdots & s_{2n} \\
0 & 0 & s_{33} & \cdots & s_{3r} & \cdots & s_{3n} \\
\vdots & \vdots & \vdots & \ddots & \vdots & & \vdots \\
0 & 0 & 0 & \cdots & s_{rr} & \cdots & s_{rn} \\
0 & 0 & 0 & \cdots & 0 & \cdots & 0 \\
\vdots & \vdots & \vdots & & \vdots & & \vdots \\
0 & 0 & 0 & \cdots & 0 & \cdots & 0
\end{bmatrix}
$$

But how can I show now that the matrix $SP^{-1}$ is upper-triangular? To me, it seems that it isn't. Can you spot a mistake in my reasoning? Or maybe you have other suggestions for constructing a $QR$-decomposition of a singular matrix? I've been stuck on this issue for a week now and have run out of ideas. Thanks!
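To make my doubt concrete, here is a small numpy check (the $3\times 3$ example matrix and the choice of basis columns are my own) where $S$ has the shape above but $SP^{-1}$ is not upper-triangular:

```python
import numpy as np

# Columns of A: a1 = e2, a2 = e1, a3 = e1 + e2; rank(A) = 2.
A = np.array([[0., 1., 1.],
              [1., 0., 1.],
              [0., 0., 0.]])

# Pick the basis columns a2, a3 and move them to the front (order 2, 3, 1).
P = np.zeros((3, 3))
P[1, 0] = P[2, 1] = P[0, 2] = 1.0   # AP has columns a2, a3, a1
AP = A @ P

# Gram-Schmidt on the first two columns of AP gives q1 = e1, q2 = e2;
# extending with q3 = e3 makes Q the identity, so S = Q^* A P = A P.
Q = np.eye(3)
S = Q.T @ AP

assert np.allclose(S, np.triu(S))              # S has the claimed shape
SPinv = S @ np.linalg.inv(P)
assert np.allclose(SPinv, A)                   # S P^{-1} is just A itself...
assert not np.allclose(SPinv, np.triu(SPinv))  # ...and NOT upper-triangular
```

So at least for this choice of basis columns, the construction really does fail to produce an upper-triangular factor.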

Best Answer

Conceptually, the simplest approach using Gram–Schmidt is to lean on what you already know. Suppose $\text{rank}\big(A\big)=r$.

1.) Run Gram–Schmidt on $A$ as you normally would, and each time you run into a linearly dependent column, discard it. The end result is

$$A' = Q'R'=\bigg[\begin{array}{c|c|c|c} \mathbf q_1' & \mathbf q_2' &\cdots & \mathbf q_{r}'\end{array}\bigg]R'$$

where $A'$ is $n\times r$ and $R'$ is $r\times r$ and upper-triangular. This is equivalent to running QR factorization on a tall, skinny matrix with full column rank.
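A minimal numpy sketch of step (1), assuming real input (the function name, tolerance, and example matrix are my own):

```python
import numpy as np

def gs_discard(A, tol=1e-10):
    """Gram-Schmidt over the columns of A, discarding every column that is
    (numerically) linearly dependent on the columns processed so far.

    Returns Q' (n x r), upper-triangular R' (r x r), and the kept column
    indices, so that Q' @ R' == A[:, kept]  (the matrix A' above).
    """
    q_cols, r_cols, kept = [], [], []
    for j in range(A.shape[1]):
        v = A[:, j].astype(float).copy()
        coeffs = []
        for q in q_cols:
            c = q @ A[:, j]
            coeffs.append(c)
            v -= c * q
        nrm = np.linalg.norm(v)
        if nrm > tol:                  # independent column: keep it
            q_cols.append(v / nrm)
            r_cols.append(coeffs + [nrm])
            kept.append(j)
    r = len(q_cols)
    Qp = np.column_stack(q_cols)
    Rp = np.zeros((r, r))
    for k, col in enumerate(r_cols):
        Rp[: len(col), k] = col        # column k has k+1 entries: upper-triangular
    return Qp, Rp, kept

A = np.array([[1., 2., 0.],
              [0., 0., 1.],
              [0., 0., 0.]])           # column 2 = 2 * column 1, so rank(A) = 2
Qp, Rp, kept = gs_discard(A)
assert kept == [0, 2]                  # the dependent middle column was dropped
assert np.allclose(Qp @ Rp, A[:, kept])
assert np.allclose(Rp, np.triu(Rp))
```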

2.) Now extend $Q'$ to an orthonormal basis, i.e. find $n-r$ linearly independent vectors not in the span of the columns of $A'$ and run more Gram–Schmidt. (Common places to look: the left nullspace of $A$, or a random number generator; see: In $A = QR$ factorization do we need basis vectors from the left null space?)

3.) Now, one at a time, re-insert the deleted columns to recreate $A$. E.g. suppose $\mathbf a_1$ was non-zero but $\mathbf a_2$ was linearly dependent (and hence deleted when constructing $A'$). Re-insert $\mathbf a_2$ on the LHS:

$$A'=\bigg[\begin{array}{c|c|c|c} \mathbf a_1' & \mathbf a_2' &\cdots & \mathbf a_{r}'\end{array}\bigg]=\bigg[\begin{array}{c|c|c|c} \mathbf a_1 & \mathbf a_2' &\cdots & \mathbf a_{r}'\end{array}\bigg]\rightarrow \bigg[\begin{array}{c|c|c|c|c} \mathbf a_1 & \mathbf a_2& \mathbf a_2' &\cdots & \mathbf a_{r}'\end{array}\bigg]$$

Now, to make dimensions match on the RHS, first insert some $\mathbf q_j$ that you found in step (2) into the appropriate column (i.e. the 2nd column):
$$\bigg[\begin{array}{c|c|c|c} \mathbf q_1' & \mathbf q_2' &\cdots & \mathbf q_{r}'\end{array}\bigg]\rightarrow \bigg[\begin{array}{c|c|c|c} \mathbf q_1'& \mathbf q_j & \mathbf q_2' &\cdots & \mathbf q_{r}'\end{array}\bigg]$$

and you need to insert a row and a column into $R'$ to make dimensions match (here $k=1$):
$$R'=\begin{bmatrix} R_{k\times k}' & * \\ \mathbf 0 & R_{r-k\times r-k}'\end{bmatrix}\rightarrow \begin{bmatrix} R_{k+1\times k+1}' & * \\ \mathbf 0 & R_{r-k\times r-k}'\end{bmatrix}$$
The RHS is still upper-triangular because both of its diagonal blocks are: the bottom-right block is unchanged, and the top-left block had a row and a column inserted at the bottom and on the right, respectively. This preserves upper-triangularity because $\mathbf a_2$ is written as a linear combination of the $\mathbf a_i$ with lower indices (i.e. it is linearly dependent on them).

Thus the structure holds one step at a time, and after finitely many steps you recover $A=QR$.
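Putting steps (1)–(3) together, a compact numpy sketch for a real singular $A$ (here step (2)'s extension vectors come from a random generator, orthogonalized against the existing $\mathbf q_i$; the function name and example are my own):

```python
import numpy as np

rng = np.random.default_rng(0)

def qr_any(A, tol=1e-10):
    """QR of a possibly singular square matrix A (steps 1-3 combined).

    When column j is linearly dependent on the previous ones, its
    Gram-Schmidt residual vanishes; we then take q_j from a random vector
    instead (step 2's extension) and leave R[j, j] = 0 (the zero entry
    that step 3's re-insertion produces on the diagonal).
    """
    n = A.shape[0]
    Q = np.zeros((n, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].astype(float).copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]
            v -= R[i, j] * Q[:, i]
        nrm = np.linalg.norm(v)
        if nrm > tol:
            R[j, j] = nrm
            Q[:, j] = v / nrm
        else:
            # dependent column: R[j, j] stays 0; extend the orthonormal
            # set with a random vector orthogonalized against q_1..q_{j-1}
            w = rng.standard_normal(n)
            w -= Q[:, :j] @ (Q[:, :j].T @ w)
            Q[:, j] = w / np.linalg.norm(w)
    return Q, R

A = np.array([[1., 2., 3.],
              [4., 8., 6.],
              [7., 14., 9.]])   # column 2 = 2 * column 1, so rank(A) = 2
Q, R = qr_any(A)
assert np.allclose(Q @ R, A)
assert np.allclose(Q.T @ Q, np.eye(3))
assert np.allclose(R, np.triu(R))
```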

remark 1:
QR factorization is not unique for non-invertible $A$

remark 2:
a much cleaner way of getting this result uses Householder matrices. The proof is essentially the same as when using Householder matrices for the QR factorization of an invertible $A$, except that you skip the step at iteration $j$ if the block submatrix you are looking at has a zero in column $j$ (which occurs iff $\mathbf a_j = \sum_{k=1}^{j-1}x_k \mathbf a_k$).
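Following this remark, a numpy sketch of Householder QR for a real, possibly singular $A$, with the skip at zero subcolumns (the example matrix is my own):

```python
import numpy as np

def householder_qr(A, tol=1e-12):
    """QR via Householder reflections; works for singular A because step j
    is simply skipped when the j-th subcolumn is already zero."""
    n = A.shape[0]
    R = A.astype(float).copy()
    Q = np.eye(n)
    for j in range(n):
        x = R[j:, j]
        if np.linalg.norm(x) < tol:
            continue                  # column j depends on earlier columns: skip
        v = x.copy()
        # shift the first entry by +/- ||x|| (sign chosen to avoid cancellation)
        v[0] += np.copysign(np.linalg.norm(x), x[0])
        v /= np.linalg.norm(v)
        # apply H = I - 2 v v^T to the trailing rows of R, accumulate into Q
        R[j:, :] -= 2.0 * np.outer(v, v @ R[j:, :])
        Q[:, j:] -= 2.0 * np.outer(Q[:, j:] @ v, v)
    return Q, R

A = np.array([[1., 2., 3.],
              [4., 8., 6.],
              [7., 14., 9.]])        # column 2 = 2 * column 1: rank(A) = 2
Q, R = householder_qr(A)
assert np.allclose(Q @ R, A)
assert np.allclose(Q.T @ Q, np.eye(3))
assert np.allclose(R, np.triu(R))
```

On this example the second reflection is skipped: after the first reflection, column 2 is still twice column 1, so its subcolumn below the diagonal is already zero.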
