[Math] Column space of a matrix

linear algebra, matrices, numerical linear algebra

Recently I started learning about matrices, and I know, for example, that the pivot columns of a matrix form a basis for the column space of that matrix. I just can't seem to find out when two matrices have the same column space. I also heard something about column reducing a matrix to check whether two matrices have the same column space, although I have no idea how reducing with columns works. Is column reducing a matrix the same as row reducing the transpose of that same matrix?

Maybe it is easier to tell when two matrices don't have the same column space? Can I, for example, use the fact that if the column spaces of two matrices have different dimensions, then the column spaces cannot be equal?

Best Answer

Lemma 1: Given an $m\times n$ matrix $A,$ the null space of $A^T$ is the orthogonal complement of the column space of $A.$

Proof: Write $A=[c_1\:\cdots\:c_n]$ where the $c_j$ are the columns of $A,$ and note that for any $m$-dimensional vector $x$ we have $$A^Tx=\left[\begin{array}{c}c_1^T\\\vdots\\c_n^T\end{array}\right]x=\left[\begin{array}{c}c_1^Tx\\\vdots\\c_n^Tx\end{array}\right]=\left[\begin{array}{c}c_1\cdot x\\\vdots\\c_n\cdot x\end{array}\right].$$ Since the column space of $A$ is spanned by $c_1,\ldots,c_n$, $x$ is in the orthogonal complement of the column space of $A$ if and only if $x$ is orthogonal to each $c_j$, if and only if each $c_j\cdot x=0$, if and only if $A^Tx$ is the $n$-dimensional zero vector, if and only if $x$ is in the null space of $A^T.$ $\Box$
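To make Lemma 1 concrete, here is a small example of my own: with $$A=\left[\begin{array}{cc}1&0\\0&1\\1&1\end{array}\right],\qquad A^T=\left[\begin{array}{ccc}1&0&1\\0&1&1\end{array}\right],$$ solving $A^Tx=0$ gives $x_1=-x_3$ and $x_2=-x_3$, so the null space of $A^T$ is the line spanned by $(-1,-1,1)$. That vector is indeed orthogonal to both columns of $A$, and the dimensions match: the column space of $A$ is a plane in $\mathbb{R}^3$, and its orthogonal complement is that line.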

Lemma 2: Let $V,W$ be subspaces of some finite-dimensional space $X$. $V$ and $W$ have the same orthogonal complement if and only if $V=W$.

Proof: If $V=W$, then their orthogonal complements are trivially the same.

Suppose $V,W$ have the same orthogonal complement. Take $v\in V$. Since $X$ is the direct sum of $W$ and its orthogonal complement, and since $v\in V\subseteq X$, there exist unique $w\in W$ and $w'$ in the orthogonal complement of $W$ such that $v=w+w'.$ Since $V,W$ have the same orthogonal complement, $w'$ is orthogonal to $v,$ and so $$0=v\cdot w'=(w+w')\cdot w'=w\cdot w'+w'\cdot w'.\tag{$\star$}$$ Since $w'$ is in the orthogonal complement of $W$ and $w\in W$, we have $w\cdot w'=0$, so it follows from $(\star)$ that $$w'\cdot w'=0.$$ Now, no non-zero vector is self-orthogonal, so $w'$ must be the zero vector, whence $v=w\in W$, and so $V\subseteq W$. By a symmetric argument, $W\subseteq V$, so $V=W$. $\Box$
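As a quick illustration of Lemma 2 (a toy example of my own): if $V$ and $W$ are subspaces of $\mathbb{R}^3$ whose orthogonal complements are both the line spanned by $(0,0,1)$, then the lemma forces $V=W$; indeed, each must be the $xy$-plane. The proof above carries out this kind of reasoning in general via the decomposition $v=w+w'$.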

Proposition: Given matrices $A,B$ of the same dimensions, $A$ and $B$ have the same column space if and only if $A^T$ and $B^T$ have the same reduced row echelon form.

Proof: Let $rref(M)$ denote the reduced row echelon form of a matrix $M$. Recall that we can obtain $rref(M)$ by Gauss-Jordan elimination, which amounts to multiplication on the left by a finite collection of elementary matrices; that is, for any $M$ there exist elementary matrices $E_1,\ldots,E_k$ of appropriate dimension such that $rref(M)=E_k\cdots E_1M.$ This collection of elementary matrices is not unique, but that isn't important. Note, though, that elementary matrices are invertible, so the null spaces of $rref(M)$ and $M$ are the same.

In particular, if $A^T$ and $B^T$ have the same reduced row echelon form, then they have the same null space. Conversely, suppose $A^T$ and $B^T$ have the same null space. By Lemma 1, the null space of $A^T$ is the orthogonal complement of the column space of $A$, which is the row space of $A^T$; likewise for $B$. So the row spaces of $A^T$ and $B^T$ have the same orthogonal complement, hence are equal by Lemma 2; and since a matrix's reduced row echelon form is determined by its size together with its row space, $rref(A^T)=rref(B^T)$. Thus, $A^T$ and $B^T$ have the same reduced row echelon form if and only if they have the same null space. By Lemma 1, $A^T$ and $B^T$ have the same null space if and only if the column spaces of $A$ and $B$ have the same orthogonal complement. By Lemma 2, the column spaces of $A$ and $B$ have the same orthogonal complement if and only if the column spaces of $A$ and $B$ are the same. $\Box$
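To see the Proposition in action, here is a small example of my own. Take $$A=\left[\begin{array}{cc}1&0\\0&1\\1&1\end{array}\right],\qquad B=\left[\begin{array}{cc}1&1\\1&2\\2&3\end{array}\right].$$ The columns of $B$ are $c_1+c_2$ and $c_1+2c_2$, where $c_1,c_2$ are the columns of $A$, and both column spaces are two-dimensional, so $A$ and $B$ have the same column space. And indeed $$rref(A^T)=rref\left(\left[\begin{array}{ccc}1&0&1\\0&1&1\end{array}\right]\right)=\left[\begin{array}{ccc}1&0&1\\0&1&1\end{array}\right]=rref\left(\left[\begin{array}{ccc}1&1&2\\1&2&3\end{array}\right]\right)=rref(B^T).$$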


Upshot: The Proposition lets us decide whether two matrices have the same column space without working out what those column spaces actually are: simply convert their transposes to reduced row echelon form and compare. This also answers the question posed above: column reducing a matrix is the same as row reducing its transpose (and then transposing the result back).
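If you want to check this on a computer, here is a minimal sketch in Python using SymPy; the matrices and the helper name same_column_space are my own, chosen just to mirror the example above.

    from sympy import Matrix

    def same_column_space(A, B):
        # Following the Proposition: matrices of the same dimensions have the
        # same column space exactly when their transposes have the same
        # reduced row echelon form.  Matrix.rref() returns a pair
        # (rref matrix, pivot column indices); we compare the rref matrices.
        if A.shape != B.shape:
            return False
        return A.T.rref()[0] == B.T.rref()[0]

    A = Matrix([[1, 0], [0, 1], [1, 1]])
    B = Matrix([[1, 1], [1, 2], [2, 3]])   # columns are combinations of A's columns
    C = Matrix([[1, 0], [0, 1], [0, 0]])   # a different column space

    print(same_column_space(A, B))  # True
    print(same_column_space(A, C))  # False

SymPy uses exact rational arithmetic here, so the comparison of reduced row echelon forms is exact rather than subject to floating-point rounding.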