[Math] Finding a Basis for $S^\perp$

Tags: linear-algebra, matrices, orthogonality

So I was working through this review question and got stumped. The vectors I found aren't orthogonal to everything in the subspace, so my answer is incorrect.

The question is:

Let $S$ be a subspace of $\mathbb{R}^4$ spanned by $\mathbf{x}_1 = (1, 0, -2, 1)^T$ and $\mathbf{x}_2 = (0, 1, 3, -2)^T$. Find a basis for $S^\perp$.

My attempt:

So I take the transposes of the two vectors and stack them as the rows of a matrix:

$$\begin{pmatrix}1&0&-2&1\\0&1&3&-2 \end{pmatrix}$$

This is already in reduced row echelon form, and solving the corresponding homogeneous system gives me:

$x_1 = 2x_3 - x_4$

$x_2 = -3x_3 + 2x_4$

where $x_3$, $x_4$ are free variables.
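As a quick sanity check (not part of the original working), SymPy can solve the same homogeneous system and produce a null space basis directly; it uses exactly the set-one-free-variable-to-$1$ construction discussed below:

```python
# Sketch: confirm the free-variable solution of A x = 0 with SymPy.
from sympy import Matrix

# Rows are x1^T and x2^T; S-perp is the null space of this matrix.
A = Matrix([[1, 0, -2, 1],
            [0, 1, 3, -2]])

# nullspace() returns a basis for {x : A x = 0}, built by setting one
# free variable (x3 or x4) to 1 and the others to 0.
for v in A.nullspace():
    print(v.T)  # (2, -3, 1, 0) and (-1, 2, 0, 1)
```

Both basis vectors satisfy $A\mathbf{v} = \mathbf{0}$, so each is orthogonal to both rows of the matrix.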

And so I plug whatever I want into the free variables and end up getting vectors orthogonal to the individual rows of the matrix.

Essentially, the dot product of $\mathbf{x}_1^T$ with the vector obtained from $x_1 = 2x_3 - x_4$ equals $0$. However, multiplying the matrix by my vectors doesn't give the zero matrix, so what I have isn't really a basis for $S^\perp$.

Can I get some clarification on what I'm doing wrong, please?

Best Answer

This is basically the same problem as the first part of your previous question. You were doing fine until you tried to extract the nullspace basis vectors from the rref matrix.

Columns three and four don’t have leading entries (aka pivots), so those components of the basis vectors will be filled in with zeros and ones, as I explained in my answer to the previous question. That is, set all but one of the components which correspond to columns in the rref matrix that don’t have leading entries to $0$, and set the remaining component to $1$. It’s usually best to start from the top and work your way down so that you don’t lose track of where you are in the process. So, in this case the two basis vectors will be of the form $(x_1,x_2,1,0)^T$ and $(y_1,y_2,0,1)^T$.

You can fill in the other components of these vectors by solving equations or by simply reading them from the third and fourth columns. Multiplying the rref matrix by the first vector gives the equation $(x_1-2,x_2+3)^T=0$, so one basis vector is $(2,-3,1,0)^T$. Note that the first two components are simply the negated elements of column three. Similarly, the missing components of the second basis vector are supplied by the fourth column: $(-1,2,0,1)^T$.
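The read-off recipe above can be sketched in plain Python (the helper name and argument layout are my own, assuming the matrix is already in rref):

```python
# Sketch of the fill-in-with-one-and-zeros scheme: for each free column,
# set that component to 1, the other free components to 0, and copy the
# negated column entries into the pivot positions.
def nullspace_from_rref(R, pivot_cols):
    n = len(R[0])  # number of columns = number of components
    free_cols = [j for j in range(n) if j not in pivot_cols]
    basis = []
    for f in free_cols:
        v = [0] * n
        v[f] = 1
        # Pivot row i corresponds to pivot column pivot_cols[i].
        for i, p in enumerate(pivot_cols):
            v[p] = -R[i][f]
        basis.append(v)
    return basis

R = [[1, 0, -2, 1],
     [0, 1, 3, -2]]
print(nullspace_from_rref(R, pivot_cols=[0, 1]))
# [[2, -3, 1, 0], [-1, 2, 0, 1]]
```

Note how the first vector's leading components $(2, -3)$ are just column three negated, and the second's $(-1, 2)$ are column four negated.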

To check this, $$ \pmatrix{1&0&-2&1\\0&1&3&-2}\pmatrix{2&-1\\-3&2\\1&0\\0&1}=\pmatrix{0&0\\0&0}. $$

The last two components of these vectors are actually free, but using the fill-in-with-one-and-zeros scheme ensures that you don’t accidentally make them linearly dependent and allows you to simply read the values from the rref matrix.
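Both checks above, orthogonality and linear independence, can also be run numerically (a quick sketch with NumPy):

```python
import numpy as np

# Rows of A span S; columns of N are the candidate basis for S-perp.
A = np.array([[1, 0, -2, 1],
              [0, 1, 3, -2]])
N = np.array([[2, -1],
              [-3, 2],
              [1, 0],
              [0, 1]])

print(A @ N)                     # zero matrix: both columns lie in S-perp
print(np.linalg.matrix_rank(N))  # 2: the columns are linearly independent
```

The ones-and-zeros in the last two components are what make the rank check come out to $2$ automatically.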