Significance of the row space

linear-algebra · vector-spaces

I have started studying linear algebra a bit on my own; it has already been more than twenty years since my graduation. I arrived at my question through a practical problem that I have since solved. Having said that, that practical problem led me to try to understand the concepts of a column space, a row space, a null space, and a left null space.

I have developed an intuition for the concept of a column space. In my thinking, the columns of a matrix are vectors in a given space, and the column space is the space spanned by those columns. All of that makes sense to me; I simply rely on the interpretation of a matrix as a collection of vectors.
Also, somehow I have come to understand the null space as the special set of vectors $x$ that satisfy the matrix equation $Ax = 0$. In my thinking, the importance of a nontrivial null space is that such a transformation cannot have an inverse: if some $x \neq 0$ satisfies $Ax = 0 = A0$, then two different inputs map to the same output.
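For example, in a toy case I tried:

$$A = \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}, \qquad A\begin{pmatrix} 1 \\ -1 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix},$$

so $A$ sends both $(1,-1)^{T}$ and the zero vector to $\mathbf{0}$, and no inverse can undo that.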

Now, my difficulties arise in understanding the row space. It is useless to just repeat the definition of the row space as $\mathrm{Col}(A^{T})$; the definition doesn't really open it up for me. I'd like to understand the significance of the row space. In my thinking, the columns of a matrix are vectors, and this understanding doesn't really fit a row or the row space. Rows don't feel like vectors to me, so the row space doesn't seem to be the same kind of space as one spanned by column vectors. Yes, technically it is, or may be, as I have understood. But I am still missing an intuition for, or an understanding of, the concept of the row space.
What is the significance of the row space, and how does it manifest itself?

And very similarly, what is the importance of the left null space, and how does it manifest itself?

You see, I am not a mathematician or a student of math; I am an engineer. Due to my personal way of learning, I always try to visualize or develop some kind of intuition before diving into the details and rigorous definitions.

Thanks in advance

– Poincaré look-alike

Best Answer

We can think of vectors either as rows or as columns, and can view matrix-vector multiplication as acting on either: $x \mapsto Ax$ when $x$ is a column, $v \mapsto vA$ when $v$ is a row. So the column space and the row space of a matrix have equal importance; they are essentially interchangeable depending on how we are working with the matrix and vectors.
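To see this concretely, here is a small example of my own (not part of the original answer): take a $2 \times 2$ matrix and let it act on a column from the right and on a row from the left,

$$A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}, \qquad A\begin{pmatrix} 1 \\ 0 \end{pmatrix} = \begin{pmatrix} 1 \\ 3 \end{pmatrix}, \qquad \begin{pmatrix} 1 & 0 \end{pmatrix} A = \begin{pmatrix} 1 & 2 \end{pmatrix}.$$

The first product picks out a column of $A$ (so images $Ax$ live in the column space), while the second picks out a row of $A$ (so images $vA$ live in the row space). Since $vA = (A^{T}v^{T})^{T}$, acting on rows from the left is the same as acting on columns with $A^{T}$, which is exactly why the row space of $A$ is $\mathrm{Col}(A^{T})$.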

In fact, you can find entire books which define vectors as rows rather than columns; all of the definitions are adjusted slightly to account for this change, and everything works as expected.

Sometimes we want a matrix to act on both rows and columns: for example, take a bilinear form $B(u,v) = u^{T}Av$. Now consider the following question: is there a vector $u_{0} \neq 0$ such that $B(u_{0},v) = 0$ for all $v \in V$?

We will have $B(u_{0},v) = u_{0}^{T}Av = 0$ for all $v \in V$ precisely when $u_{0}^{T}A = 0$ (if the row $u_{0}^{T}A$ had a nonzero entry, choosing $v$ to be the corresponding standard basis vector would produce a nonzero value); in other words, precisely when $u_{0}$ is in the left null space of $A$.
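As a concrete illustration (a toy example of my own, not from the original answer), take a rank-one matrix whose second row is twice its first:

$$A = \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}, \qquad u_{0} = \begin{pmatrix} 2 \\ -1 \end{pmatrix}, \qquad u_{0}^{T}A = \begin{pmatrix} 2 & -1 \end{pmatrix}\begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix} = \begin{pmatrix} 0 & 0 \end{pmatrix}.$$

Hence $B(u_{0},v) = u_{0}^{T}Av = 0$ for every $v$: the "degenerate" directions of the bilinear form are exactly the left null space of $A$.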

If you are interested in a more advanced perspective: the formal connection between these two points of view (row vectors versus column vectors) is provided by the concept of the dual space of a vector space. In abbreviated form, it says that if you are working with, say, column vectors, then you can think of a one-row matrix as a linear function into the base field, and the collection of such functions also forms a vector space (one that, in the finite-dimensional case, is isomorphic to the original vector space).
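A one-line illustration of that dual-space view (again a toy example of my own): the row $\begin{pmatrix} 3 & -1 \end{pmatrix}$ defines the linear function

$$f\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 3 & -1 \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix} = 3x - y,$$

and every linear map $\mathbb{R}^{2} \to \mathbb{R}$ arises this way from exactly one row. This is the precise sense in which rows "are" the dual vectors of columns.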