QR decomposition using Householder reflections: how to calculate Q

linear-algebra, linear-transformations, numerical-methods

In a recent assignment, I was asked to develop a program that could solve some specific problem using QR decomposition to find eigenvalues and eigenvectors. That assignment also specified that we should use Householder reflections to find the QR decomposition of a matrix.
A Householder transformation can be computed and applied in two ways:

  1. Given a vector $\vec v$, we can define the transformation matrix $H_v$ as $$H_v = I_n - 2\,\frac{\vec v\,\vec v^\intercal}{\vec v\cdot \vec v}$$
    where $I_n$ is the identity matrix of size $n$. Then the transformation is applied by simply multiplying $H_v$ by the desired matrix.
  2. To avoid forming $H_v$ explicitly, we can apply the Householder transformation to a vector $\vec x$ directly: $$H_v\vec x = \vec x - 2\,{\vec v \cdot \vec x \over \vec v \cdot \vec v}\vec v$$ (note the factor of $2$, matching the definition above). This way the transformed vector is obtained using only dot products and scalar multiplications, $\mathit O(n)$ work instead of the $\mathit O(n^2)$ of an explicit matrix-vector product. Applying this transformation to every column of a matrix gives the transformed matrix.
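As a minimal sketch (the helper names are my own, not from the assignment), both approaches can be written in numpy and checked against each other:

```python
import numpy as np

def householder_matrix(v):
    """Method 1: form H_v = I_n - 2 (v v^T)/(v . v) explicitly."""
    v = np.asarray(v, dtype=float)
    return np.eye(len(v)) - 2.0 * np.outer(v, v) / (v @ v)

def apply_householder(v, x):
    """Method 2: H_v x = x - 2 ((v . x)/(v . v)) v, dot products only."""
    v = np.asarray(v, dtype=float)
    x = np.asarray(x, dtype=float)
    return x - 2.0 * (v @ x) / (v @ v) * v

v = np.array([1.0, 2.0, 3.0])
x = np.array([4.0, -1.0, 0.5])
# the two methods agree, and H_v is its own inverse:
assert np.allclose(householder_matrix(v) @ x, apply_householder(v, x))
assert np.allclose(householder_matrix(v) @ householder_matrix(v), np.eye(3))
```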

For the QR decomposition of an $n \times n$ matrix $A$, you can do the following: in the $i$th iteration, set $\vec v_i = \vec a_i + \operatorname{sign}(a_{ii})\, \Vert \vec a_i \Vert\, \vec e_i$ (where $\vec a_i$ is the $i$th column of the current matrix with the entries above the diagonal zeroed out, so that previously created zeros are preserved, $\operatorname{sign}$ is the sign of a number, and $\vec e_i$ is the $i$th canonical vector) and apply a Householder transformation (using method two listed above) to the matrix. After $n-1$ iterations you reach an upper triangular matrix, which is $R$.
Now, matrix $Q$ is defined as the product of all the Householder transformations used to find $R$: $Q = H_{v_1} H_{v_2} \cdots H_{v_{n-2}} H_{v_{n-1}}$.
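The procedure above can be sketched in numpy (the function and variable names are my own assumptions): $R$ is produced by successive rank-1 updates (method two), and $Q$ is accumulated by applying the same reflections to an identity matrix from the right:

```python
import numpy as np

def householder_qr(A):
    """Householder QR sketch: returns Q and R with A = Q R."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    R = A.copy()
    Q = np.eye(n)
    for i in range(n - 1):
        # build v_i from the part of column i at and below the diagonal
        a = R[i:, i]
        v = a.copy()
        v[0] += np.copysign(np.linalg.norm(a), a[0])
        vtv = v @ v
        if vtv == 0.0:          # column already zeroed out, nothing to do
            continue
        # apply H_v to the trailing block of R (method two, rank-1 update)
        R[i:, i:] -= np.outer(v, 2.0 * (v @ R[i:, i:]) / vtv)
        # accumulate Q <- Q H_v (each H_v is symmetric)
        Q[:, i:] -= np.outer(Q[:, i:] @ v, 2.0 * v / vtv)
    return Q, R
```

Accumulating $Q$ this way costs the same kind of rank-1 updates as building $R$, so the explicit reflection matrices are never formed.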

In the context of the problem, I need to do multiple decompositions to find a matrix's eigenvalues and eigenvectors. Assuming the eigenvalues satisfy $\vert \lambda_1 \vert \gt \vert \lambda_2 \vert \gt \cdots \gt \vert \lambda_{n-1} \vert \gt \vert \lambda_n \vert$, you may approximate them using the following method:
$$
\begin{array}{c|c|c}
\text{Iteration} & \text{1st step} & \text{2nd step}\\
\hline
1 & A_1 = Q_1R_1 & A_2 = R_1Q_1 \\
2 & A_2 = Q_2R_2 & A_3 = R_2Q_2 \\
\vdots & \vdots & \vdots \\
k-1 & A_{k-1} = Q_{k-1}R_{k-1} & A_k = R_{k-1}Q_{k-1}
\end{array}$$

$A_k$ will converge to an upper triangular matrix whose diagonal elements are the eigenvalues of the original matrix. Now, to find the eigenvectors (assuming the original matrix is symmetric), multiply all the matrices $Q_i$ together, which gives a new matrix $V = Q_1 Q_2 \cdots Q_{k-1}$, where each column is the eigenvector associated with the eigenvalue in the same column of $A_k$.
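A sketch of this QR iteration, using `numpy.linalg.qr` in place of a hand-written Householder routine (the function name and iteration count are my own choices):

```python
import numpy as np

def qr_eigen(A, iterations=500):
    """QR iteration: A_{k+1} = R_k Q_k, accumulating V = Q_1 Q_2 ... Q_{k-1}."""
    A_k = np.array(A, dtype=float)
    V = np.eye(A_k.shape[0])
    for _ in range(iterations):
        Q, R = np.linalg.qr(A_k)  # 1st step: A_k = Q_k R_k
        A_k = R @ Q               # 2nd step: A_{k+1} = R_k Q_k
        V = V @ Q                 # accumulate the product of the Q_i
    return np.diag(A_k), V

# symmetric example: eigenvalues end up on the diagonal of A_k,
# eigenvectors in the columns of V
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
vals, vecs = qr_eigen(A)
```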

What I don't know is: what is the point of using the second method to find $R$ if you need all the transformation matrices to find $Q$? Won't you need to compute them all anyway? Also, to find $V$, don't you need all the matrices $Q_i$ explicitly calculated?

I also asked my professor, and he wasn't sure what I should do, but he suggested computing the first transformation matrix explicitly and then using the subsequent $\vec v$s to multiply $H_{v_1}$ in the same way I did for $R$. However, that didn't work correctly for me. I was hoping someone knows a better method to find $Q$.

Best Answer

To solve a system $Ax = b$ with the QR decomposition $A=QR$, we usually do this in two steps. First compute

$$y = Q^t b \qquad \text{ then } \qquad x = R^{-1}y$$

Since $y^t = b^t Q$, you have to compute multiple products of the form $w^t H_v$, which can be done exactly the same way you already described. This way you can avoid computing $Q$ altogether, and this is also how it is often done in numerical applications (see e.g. Numerical Recipes, chapter 2.10).
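A hedged sketch of this idea (the helper names are mine, not from Numerical Recipes): store only $R$ and the reflection vectors $\vec v_i$, apply them to $b$ to get $y = Q^t b$ without ever forming $Q$, then back-substitute $Rx = y$:

```python
import numpy as np

def householder_qr_vectors(A):
    """Householder QR that stores only R and the reflection vectors v_i."""
    R = np.array(A, dtype=float)
    n = R.shape[0]
    vs = []
    for i in range(n - 1):
        v = R[i:, i].copy()
        v[0] += np.copysign(np.linalg.norm(v), v[0])
        vs.append(v)
        vtv = v @ v
        if vtv > 0.0:
            R[i:, i:] -= np.outer(v, 2.0 * (v @ R[i:, i:]) / vtv)
    return R, vs

def solve_via_reflections(R, vs, b):
    """Compute y = H_{v_{n-1}} ... H_{v_1} b = Q^t b, then solve R x = y."""
    y = np.array(b, dtype=float)
    for i, v in enumerate(vs):
        vtv = v @ v
        if vtv > 0.0:
            y[i:] -= 2.0 * (v @ y[i:]) / vtv * v
    # back-substitution; np.linalg.solve used here for brevity
    return np.linalg.solve(np.triu(R), y)
```

Because $Q = H_{v_1} \cdots H_{v_{n-1}}$ and each $H_{v_i}$ is symmetric, applying the reflections to $b$ in the original order yields $Q^t b$.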