Linear Algebra – Why Do Elementary Row Operations Preserve Linear Dependence?

linear algebra

I know that I can find a basis for the column space of a matrix $A$ by reducing the matrix to reduced row echelon form $J$. The columns of $A$ corresponding to the pivot columns of $J$ then form a basis for $Col(A)$, because linear dependence relations among the columns are preserved under elementary row operations. I can't figure out why this is true, though, and a Google search returns nothing, so I'm sure it's simple. Can someone give me a proof?
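
For reference, here is a small sketch of the procedure in question, using SymPy (the matrix is just an arbitrary example of my own):

```python
# Sketch: reduce A to RREF, read off the pivot columns, and take the
# corresponding columns of the ORIGINAL matrix A as a basis for Col(A).
from sympy import Matrix

A = Matrix([[1, 2, 0, 3],
            [2, 4, 1, 7],
            [1, 2, 1, 4]])

J, pivot_cols = A.rref()                 # J = RREF of A, pivot_cols = indices of its pivot columns
basis = [A.col(i) for i in pivot_cols]   # the matching columns of A form a basis of Col(A)

print(pivot_cols)                        # (0, 2): the first and third columns of A are the basis
```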

Best Answer

Let's start out from the standard basis $e_1,\dots,e_n$, and let $a_1,\dots,a_k$ be the column vectors of $A$.

Check that the row operation $r_i':=r_i+\lambda\,r_j$, applied to the coordinate vectors, corresponds to the change of basis $e_j':=e_j-\lambda\,e_i$ (with $e_k':=e_k$ for $k\ne j$); that is, for a vector $v$ we have $$v=\sum_i\alpha_ie_i=\sum_i\alpha_i'e_i',$$ where the row operation takes the coordinate vector $\pmatrix{\alpha_1\\ \alpha_2\\ \vdots}$ to $\pmatrix{\alpha_1'\\ \alpha_2'\\ \vdots}$.
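
To check this identity, expand the right-hand side using $\alpha_i'=\alpha_i+\lambda\,\alpha_j$ and $\alpha_k'=\alpha_k$ for $k\ne i$, together with $e_j'=e_j-\lambda\,e_i$ and $e_k'=e_k$ for $k\ne j$:
$$\sum_k\alpha_k'e_k'=(\alpha_i+\lambda\,\alpha_j)\,e_i+\alpha_j\,(e_j-\lambda\,e_i)+\sum_{k\ne i,j}\alpha_k e_k=\sum_k\alpha_k e_k=v.$$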

So, in this interpretation the column vectors all "stay" where they are in the $n$-dimensional space; we only keep changing the basis used to write down their coordinates. Of course, the vectors themselves then stay (in-)dependent. The crucial point is that each step of the row transformation (or rather its inverse) really does produce another basis.
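
As a quick sanity check of this (again with SymPy, on an arbitrary example matrix of my own): a coefficient vector $c$ describes a dependence relation among the columns of $A$, i.e. $Ac=0$, exactly when it describes one among the columns of the row-reduced matrix $J$, i.e. $Jc=0$.

```python
# Illustrative check: A and its RREF J admit exactly the same
# dependence relations among their columns.
from sympy import Matrix

A = Matrix([[1, 2, 0, 3],
            [2, 4, 1, 7],
            [1, 2, 1, 4]])
J, _ = A.rref()

# Every null-space vector of A encodes a dependence relation among A's columns...
for c in A.nullspace():
    assert J * c == Matrix.zeros(3, 1)   # ...and the same relation holds among J's columns.

# Concretely, the second column is twice the first in both A and J.
c = Matrix([2, -1, 0, 0])
assert A * c == Matrix.zeros(3, 1) and J * c == Matrix.zeros(3, 1)
```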
