To get to $(A)$, and from there show that this equation corresponds to $c_i^2 + s_i^2 = 1$, we first need to establish that $V_1^\dagger = V_2^\dagger$.
To get there, consider the QR decomposition of the matrix $Q_2V_1$. We can write it as:
$$
Q_2V_1 = U_2R\\
Q_2 = U_2RV_1^\dagger
$$
where $U_2$ is an orthogonal matrix and $R$ is an upper triangular matrix.
We have $Q_2Q_2^\dagger = I$ (the rows of $Q_2$ are orthonormal, i.e. $Q_2$ has full row rank). Therefore:
$$
(U_2RV_1^\dagger)(V_1R^\dagger U_2^\dagger) = I \\
U_2 R R^\dagger U_2^\dagger = I \\
R R^\dagger = U_2^\dagger U_2 = I \\
$$
Hence $R$ is both upper triangular and unitary, and a unitary upper triangular matrix must be diagonal; let's call it $D_2$. Rewriting $Q_2$ we get
$$
Q_2V_1 = U_2D_2 \\
Q_2 = U_2D_2V_1^\dagger \\
$$
which has exactly the form of the SVD of $Q_2$, namely $Q_2 = U_2D_2V_2^\dagger$. Therefore $V_2^\dagger = V_1^\dagger$.
Now using the condition $Q_1^\dagger Q_1+Q_2^\dagger Q_2=I$, we get:
$$
(V_1D_1^\dagger U_1^\dagger)(U_1D_1V_1^\dagger) + (V_1D_2^\dagger U_2^\dagger)(U_2D_2V_1^\dagger) = I \\
V_1 D_1^\dagger D_1 V_1^\dagger + V_1 D_2^\dagger D_2 V_1^\dagger = I \\
V_1(D_1^\dagger D_1 + D_2^\dagger D_2)V_1^\dagger = I \\
D_1^\dagger D_1 + D_2^\dagger D_2 = V_1^\dagger V_1 = I \\
(d^{(1)}_k)^2 + (d^{(2)}_k)^2 = 1 \quad \text{for each } k \\
$$
With $d^{(1)}_i = c_i$ and $d^{(2)}_i = s_i$, this gives $c_i^2 + s_i^2 = 1$ for $i = 1, 2, \dots, p$.
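As a quick numerical sanity check of this conclusion (a sketch assuming the real case, with $Q_1$ and $Q_2$ the two $p \times p$ blocks of a $2p \times p$ matrix with orthonormal columns, so that $Q_1^\dagger Q_1 + Q_2^\dagger Q_2 = I$):

```python
import numpy as np

# Build a 2p x p matrix with orthonormal columns and split it into two
# p x p blocks Q1 (top) and Q2 (bottom); then Q1^T Q1 + Q2^T Q2 = I.
p = 4
Q, _ = np.linalg.qr(np.random.randn(2 * p, p))
Q1, Q2 = Q[:p, :], Q[p:, :]

# The singular values of the blocks play the roles of c_i and s_i.
c = np.linalg.svd(Q1, compute_uv=False)   # sorted descending
s = np.linalg.svd(Q2, compute_uv=False)   # sorted descending

# Since s_i^2 = 1 - c_i^2, the largest c pairs with the smallest s:
print(c**2 + s[::-1]**2)                  # ~ [1. 1. 1. 1.]
```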
Conceptually, the simplest approach using Gram-Schmidt is to lean on what you already know. Suppose $\text{rank}\big(A\big)=r$.
1.) Run Gram-Schmidt on $A$ as you normally would, and each time you run into a linearly dependent column, discard it. The end result is
$$A' = Q'R'=\bigg[\begin{array}{c|c|c|c} \mathbf q_1' & \mathbf q_2' &\cdots & \mathbf q_{r}'\end{array}\bigg]R'$$
where $A'$ is $n\times r$ and $R'$ is $r\times r$ and upper triangular. This is equivalent to running QR factorization on a tall, skinny matrix with full column rank.
2.) Now extend $Q'$ to an orthonormal basis (i.e. 'find' $n-r$ linearly independent vectors not in the span of $A'$ and run more Gram-Schmidt; common places to look are the left nullspace of $A$, or use a random number generator -- ref here: In $A = QR$ factorization do we need basis vectors from the left null space?).
3.) Now, one at a time, re-insert the deleted columns to recreate $A$. E.g. suppose $\mathbf a_1$ was non-zero but $\mathbf a_2$ was linearly dependent (and hence deleted when constructing $A'$). So re-insert $\mathbf a_2$ on the LHS:
$$A'=\bigg[\begin{array}{c|c|c|c} \mathbf a_1' & \mathbf a_2' &\cdots & \mathbf a_{r}'\end{array}\bigg]=\bigg[\begin{array}{c|c|c|c} \mathbf a_1 & \mathbf a_2' &\cdots & \mathbf a_{r}'\end{array}\bigg]\rightarrow \bigg[\begin{array}{c|c|c|c|c} \mathbf a_1 & \mathbf a_2& \mathbf a_2' &\cdots & \mathbf a_{r}'\end{array}\bigg]$$
Now, to make the dimensions match on the RHS, first insert one of the vectors $\mathbf q_j$ found in step (2) into the appropriate column (i.e. the 2nd column):
$$\bigg[\begin{array}{c|c|c|c} \mathbf q_1' & \mathbf q_2' &\cdots & \mathbf q_{r}'\end{array}\bigg]\rightarrow \bigg[\begin{array}{c|c|c|c} \mathbf q_1'& \mathbf q_j & \mathbf q_2' &\cdots & \mathbf q_{r}'\end{array}\bigg]$$
and insert a row and a column into $R'$ to make the dimensions match (here $k=1$):
$$R'=\begin{bmatrix} R_{k\times k}' & * \\ \mathbf 0 & R_{r-k\times r-k}'\end{bmatrix}\rightarrow \begin{bmatrix} R_{k+1\times k+1}' & * \\ \mathbf 0 & R_{r-k\times r-k}'\end{bmatrix}$$
The RHS is still upper triangular because its two diagonal blocks are upper triangular: one of them (bottom right) is unchanged, and the other (top left) had a row and a column inserted on the bottom and the right respectively, which preserves upper triangularity because $\mathbf a_2$ is written as a linear combination of the $\mathbf a_i$ with lower indices (i.e. it is linearly dependent).
Thus the structure holds one step at a time, and after finitely many steps you recover $A=QR$.
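For concreteness, here is a minimal NumPy sketch that folds steps (1)-(3) into a single pass: a dependent column gets a zero on the diagonal of $R$, and a random vector re-orthogonalized against the existing $\mathbf q_i$ plays the role of the inserted $\mathbf q_j$ (the function name and the tolerance are illustrative choices, not part of the argument above):

```python
import numpy as np

def qr_rank_deficient(A, tol=1e-10):
    """QR of a tall n x m (m <= n), possibly rank-deficient A.
    Dependent columns get a zero diagonal entry in R and a fresh
    orthonormal replacement column in Q (steps 2-3 folded in)."""
    n, m = A.shape
    Q = np.zeros((n, m))
    R = np.zeros((m, m))
    for j in range(m):
        # Step 1: orthogonalize a_j against the q's found so far.
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]
            v -= R[i, j] * Q[:, i]
        if np.linalg.norm(v) > tol:
            R[j, j] = np.linalg.norm(v)
            Q[:, j] = v / R[j, j]
        else:
            # a_j is linearly dependent: leave R[j, j] = 0 and insert
            # a random direction re-orthogonalized against existing q's.
            w = np.random.randn(n)
            for i in range(j):
                w -= (Q[:, i] @ w) * Q[:, i]
            Q[:, j] = w / np.linalg.norm(w)
    return Q, R

A = np.random.randn(6, 2) @ np.random.randn(2, 3)   # 6 x 3, rank 2
Q, R = qr_rank_deficient(A)
print(np.allclose(Q @ R, A), np.allclose(Q.T @ Q, np.eye(3)))
```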
Remark 1:
QR factorization is not unique for non-invertible $A$.
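A minimal numerical illustration of this non-uniqueness:

```python
import numpy as np

A = np.array([[1., 0.],
              [0., 0.]])            # rank 1

Q1, R1 = np.eye(2), A.copy()        # one valid QR: Q = I
Q2 = np.array([[1., 0.],
               [0., -1.]])          # another orthogonal Q...
R2 = A.copy()                       # ...with the same upper triangular R
print(np.allclose(Q1 @ R1, A), np.allclose(Q2 @ R2, A))   # True True
```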
Remark 2:
A much cleaner way of getting to this result involves using Householder matrices. The proof is essentially the same as when using Householder matrices for the QR factorization of an invertible $A$, except that you skip the step at iteration $j$ if column $j$ of the block submatrix you are looking at is zero (which occurs iff $\mathbf a_j = \sum_{k=1}^{j-1}x_k \mathbf a_k$).
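A sketch of that Householder variant (the norm-based skip test, its tolerance, and the function name are illustrative):

```python
import numpy as np

def householder_qr_skipping(A, tol=1e-12):
    """QR via Householder reflectors; iteration j is skipped when the
    trailing part of column j is already zero (the rank-deficient case)."""
    n, m = A.shape
    R = A.astype(float).copy()
    Q = np.eye(n)
    for j in range(min(n - 1, m)):
        x = R[j:, j]
        if np.linalg.norm(x) < tol:
            continue                        # a_j is dependent: skip this step
        v = x.copy()
        v[0] += np.copysign(np.linalg.norm(x), x[0])
        v /= np.linalg.norm(v)
        H = np.eye(n - j) - 2.0 * np.outer(v, v)   # reflector on rows j:
        R[j:, :] = H @ R[j:, :]
        Q[:, j:] = Q[:, j:] @ H             # accumulate Q = H_1 H_2 ...
    return Q, R

A = np.random.randn(5, 2) @ np.random.randn(2, 4)   # 5 x 4, rank 2
Q, R = householder_qr_skipping(A)
print(np.allclose(Q @ R, A), np.allclose(R, np.triu(R)))
```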
Best Answer
Yes, it is possible, and it is quite straightforward:$$R=H_nH_{n-1}\dots H_1A,$$ where $H_*$ are the Householder reflectors. So, $$Q = H_1^T\dots H_{n-1}^TH_n^T.$$
As $H_*$ are symmetric: $$Q = H_1\dots H_{n-1}H_n.$$
Now, to compute $Q_1$, we have:$$Q_1 = H_1\dots H_{n-1}H_nI^{m \times n},$$where $I^{m \times n} \in \mathbb{R}^{m \times n}$ is the rectangular identity matrix.
$Q_1$ can be calculated backwards: first calculate $H_nI^{m \times n}$; the result is an $m \times n$ matrix. Then left-multiply this by $H_{n-1}$, then $H_{n-2}$, and so on.
This way, the full $m \times m$ matrix $Q$ never has to be formed.
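A minimal NumPy sketch of this backward accumulation, assuming the reflectors are stored as Householder vectors $v_j$ with $H_j = I - 2v_jv_j^T$ acting on rows $j$ through $m$ (both function names are illustrative):

```python
import numpy as np

def householder_vectors(A):
    """Factor A (m x n, m >= n, full column rank) into R and a list of
    Householder vectors v_1..v_n, where H_j = I - 2 v_j v_j^T acts on
    rows j:m."""
    m, n = A.shape
    R = A.astype(float).copy()
    vs = []
    for j in range(n):
        x = R[j:, j]
        v = x.copy()
        v[0] += np.copysign(np.linalg.norm(x), x[0])
        v /= np.linalg.norm(v)
        vs.append(v)
        R[j:, :] -= 2.0 * np.outer(v, v @ R[j:, :])
    return vs, np.triu(R[:n, :])

def thin_Q(vs, m, n):
    """Q1 = H_1 ... H_n I^{m x n}, applied backwards so that only an
    m x n workspace is ever needed (the full m x m Q is never formed)."""
    Q1 = np.eye(m, n)                      # rectangular identity
    for j in range(len(vs) - 1, -1, -1):   # apply H_n first, then H_{n-1}, ...
        v = vs[j]
        Q1[j:, :] -= 2.0 * np.outer(v, v @ Q1[j:, :])
    return Q1

A = np.random.randn(6, 3)
vs, R = householder_vectors(A)
Q1 = thin_Q(vs, *A.shape)
print(np.allclose(Q1 @ R, A), np.allclose(Q1.T @ Q1, np.eye(3)))
```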