[Math] Finding an orthogonal basis from a column space

Tags: linear-algebra, matrices

I'm having trouble understanding one of the exercises I'm working on.

I have to find an orthogonal basis for the column space of $A$, where:

$$A = \begin{bmatrix}
0 & 2 & 3 & -4 & 1\\
0 & 0 & 2 & 3 & 4 \\
2 & 2 & -5 & 2 & 4\\
2 & 0 & -6 & 9 & 7
\end{bmatrix}.$$

The first question was to find a basis of the column space of $A$; clearly this is simply the first $3$ column vectors (found by reducing $A$ to row echelon form and locating the leading $1$'s).

However, I then had to find an orthogonal basis for the column space of $A$, and here is where I get lost. I started off by taking the first vector:

$$u_1 = \begin{bmatrix}0\\0\\2\\2\\\end{bmatrix}.$$

Then I thought I would find the second vector like this:

$$u_2 = \begin{bmatrix}2\\0\\2\\0\\\end{bmatrix}-\left(\begin{bmatrix}2\\0\\2\\0\\\end{bmatrix}\cdot\begin{bmatrix}0\\0\\2\\2\\\end{bmatrix}\right)*\begin{bmatrix}0\\0\\2\\2\\\end{bmatrix} = \begin{bmatrix}2\\0\\2\\0\\\end{bmatrix}-4*\begin{bmatrix}0\\0\\2\\2\\\end{bmatrix} = \begin{bmatrix}2\\0\\-6\\-8\\\end{bmatrix}.$$

However, according to the result sheet we were given, instead of the factor $4$ I should have $\frac{4}{8}$. I somehow cannot figure out what I am missing, since the dot product of the two vectors clearly is $4$.

Also, as a second question: if I had to find an orthonormal basis, I would only have to take the orthogonal vectors found here and divide each one by its length, correct?

Best Answer

Your basic idea is right. However, you can easily verify that the vectors $u_1$ and $u_2$ you found are not orthogonal by calculating $$<u_1,u_2> = (0,0,2,2)\cdot \left( \begin{matrix} 2 \\ 0 \\ -6 \\ -8 \end{matrix} \right) = -12-16 = -28 \neq 0$$ So something is going wrong in your process.

I suppose you want to use the Gram-Schmidt algorithm to find the orthogonal basis. I think you skipped the normalization part of the algorithm because you only want an orthogonal basis, not an orthonormal one. However, even if you don't want an orthonormal basis, you still have to take care of the normalization in your projections. If you only subtract $u_i<u_i,v_j>$ it will go wrong; instead you need to normalize and subtract $u_i\frac{<u_i,v_j>}{<u_i,u_i>}$. In the full Gram-Schmidt algorithm each $u_i$ is normalized before the next step, so $<u_i,u_i>=1$ and the denominator is usually left out. The Wikipedia article should clear it up quite well.
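As a sketch of the procedure in code (NumPy and the function name are my own choices for illustration, not part of the exercise), here is Gram-Schmidt for an orthogonal (not orthonormal) basis, with the division by $<u_i,u_i>$ kept in each projection:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthogonal (not orthonormal) basis via Gram-Schmidt.

    Because the u_i are never normalized, every projection must be
    divided by <u_i, u_i>; skipping that division is exactly the
    mistake discussed above.
    """
    basis = []
    for v in vectors:
        u = np.asarray(v, dtype=float)
        for b in basis:
            # subtract the component of u along b, normalized by <b, b>
            u = u - (np.dot(b, u) / np.dot(b, b)) * b
        basis.append(u)
    return basis

# The basis of the column space found in the exercise:
v1, v2, v3 = [0, 0, 2, 2], [2, 0, 2, 0], [3, 2, -5, -6]
u1, u2, u3 = gram_schmidt([v1, v2, v3])
# u2 comes out as [2, 0, 1, -1], and all pairwise dot products are 0
```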

Update

Ok, you say that $v_1 = \left( \begin{matrix} 0 \\ 0 \\ 2 \\ 2 \end{matrix} \right), v_2 = \left( \begin{matrix} 2 \\ 0 \\ 2 \\ 0 \end{matrix} \right), v_3 = \left( \begin{matrix} 3 \\ 2 \\ -5 \\ -6 \end{matrix} \right)$ is the basis you start from. As you did, you can take the first vector $v_1$ as it is, so your first basis vector is $u_1 = v_1$. Now you want to calculate a vector $u_2$ that is orthogonal to this $u_1$. Gram-Schmidt tells you that you obtain such a vector by

$$u_2 = v_2 - \text{proj}_{u_1}(v_2)$$

And then a third vector $u_3$ orthogonal to both of them by $$u_3 = v_3 - \text{proj}_{u_1}(v_3) - \text{proj}_{u_2}(v_3)$$

You did follow this approach. What went wrong is your projection. You calculated it as $$ \text{proj}_{u_1}(v_2) = u_1<u_1,v_2>$$ but this is incorrect. The true projection is $$ \text{proj}_{u_1}(v_2) = u_1\frac{<u_1,v_2>}{<u_1,u_1>}$$ As I tried to point out, some textbooks skip the division by $<u_1,u_1>$ in their explanation of Gram-Schmidt, but this is because in most cases you want to construct an orthonormal basis. In that case you normalize every $u_i$ before proceeding to the next step, so $<u_i,u_i> = 1$ and the division can be skipped.

So what you need to change is to divide by $<u_1,u_1> = 8$ in your projection.
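For this concrete step, a quick numerical check (NumPy used purely for illustration) of the corrected projection, plus the normalization your second question asks about:

```python
import numpy as np

u1 = np.array([0.0, 0.0, 2.0, 2.0])
v2 = np.array([2.0, 0.0, 2.0, 0.0])

# corrected projection: the dot product 4 is divided by <u1, u1> = 8
u2 = v2 - (np.dot(u1, v2) / np.dot(u1, u1)) * u1   # gives [2, 0, 1, -1]

assert np.dot(u1, u2) == 0  # the vectors are now orthogonal

# for an orthonormal basis, divide each vector by its length:
e1 = u1 / np.linalg.norm(u1)
e2 = u2 / np.linalg.norm(u2)
```

So yes: once the orthogonal vectors are correct, dividing each by its length is all that is needed for an orthonormal basis.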