[Math] Implications of zero row when row reducing matrix

linear algebra

Often when I am performing elementary row operations to row reduce an arbitrary $m \times n$ matrix $A$, a row of 0's appears, $[0 \;\; 0 \;\, \cdots \;\, 0 \;\; 0]$.

I am uncertain: does this imply either or both of the following?

  • a row in $A$ is a linear combination of other rows
  • a column in $A$ is a linear combination of other columns

Does having linearly dependent columns imply linearly dependent rows, or vice versa?

Best Answer

It implies only that the rows are dependent. When you perform row reduction, you do not change the row space (the space spanned by the rows of the matrix). Therefore, if you get a row of 0's in the reduction process, then the row space is spanned by fewer than $m$ elements, and therefore the original rows were dependent.
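As a quick numerical check of this, here is a minimal sketch using numpy with a made-up $3 \times 4$ matrix whose third row is the sum of the first two. Since row reduction preserves the row space, the rank (the dimension of the row space) comes out smaller than the number of rows, which is exactly the condition under which a zero row appears:

```python
import numpy as np

# Hypothetical 3x4 example: row 3 = row 1 + row 2,
# so the rows are linearly dependent.
A = np.array([[1., 2., 0., 3.],
              [0., 1., 4., 1.],
              [1., 3., 4., 4.]])

# Row reduction does not change the row space, so the rank equals
# the dimension of the row space.  rank < m (= 3 rows) means the
# reduced form must contain at least one zero row.
rank = np.linalg.matrix_rank(A)
print(rank)  # 2, fewer than the 3 rows
```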

Even if the rows are dependent, the columns may or may not be dependent. Consider a tall $N \times 1$ matrix: its rows are dependent (when $N > 1$), but there is only one column, so as long as that column is not all zeros, there is no column dependence.
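The tall-matrix counterexample can be verified the same way. For an illustrative $4 \times 1$ matrix, the rank is 1: smaller than the number of rows (so the rows are dependent) yet equal to the number of columns (so the single column is independent):

```python
import numpy as np

# A tall N-by-1 matrix (N = 4) with a nonzero column.
B = np.array([[1.], [2.], [3.], [4.]])

# rank = 1: fewer than the 4 rows (rows dependent),
# but equal to the 1 column (column independent).
rank = np.linalg.matrix_rank(B)
print(rank)  # 1
```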