I wanted to check if I understand this correctly, or maybe it can be explained in a simpler way: why is matrix rank equal to the number of linearly independent rows?
The simplest proof I can come up with is: the matrix rank is the number of vectors in a basis of the vector space spanned by the matrix's rows (the row space), and all bases of a given vector space have the same size.
Elementary row operations on the matrix don't change its row space, and therefore don't change its rank.
Then we can reduce the matrix to row echelon form (reduced row echelon form isn't necessary, because the non-zero rows of a row echelon form are already linearly independent). We can then keep only the non-zero rows and still get the same row space (adding or removing any number of zero rows changes nothing), and because these rows are linearly independent, they form a basis for the row space. As mentioned above, all bases have the same size, so the number of linearly independent rows equals the matrix rank (the dimension of the row space, i.e. the size of a basis).
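The argument above can be sketched in code. This is my own illustration, not part of the question: the helper name `row_echelon_rank` is hypothetical, and it uses exact rational arithmetic to avoid floating-point issues during elimination.

```python
from fractions import Fraction

def row_echelon_rank(rows):
    """Reduce a matrix to row echelon form by elementary row
    operations and count its non-zero rows (= the rank)."""
    m = [[Fraction(x) for x in row] for row in rows]
    n_rows, n_cols = len(m), len(m[0])
    rank = 0
    for col in range(n_cols):
        # Find a row at or below position `rank` with a non-zero
        # entry in this column to serve as the pivot row.
        pivot = next((r for r in range(rank, n_rows) if m[r][col] != 0), None)
        if pivot is None:
            continue
        m[rank], m[pivot] = m[pivot], m[rank]
        # Eliminate the entries below the pivot.
        for r in range(rank + 1, n_rows):
            f = m[r][col] / m[rank][col]
            m[r] = [a - f * b for a, b in zip(m[r], m[rank])]
        rank += 1
    return rank

# The third row is the sum of the first two, so only two rows
# are linearly independent.
A = [[1, 2, 3],
     [4, 5, 6],
     [5, 7, 9]]
print(row_echelon_rank(A))  # → 2
```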
Is this correct? And did I make it more complicated than necessary?
Best Answer
Two facts about elementary row operations are useful to resolve this question:

1. They don't change the number of linearly independent rows of the matrix.
2. They don't change the number of linearly independent columns of the matrix.
After a matrix is fully reduced, it's not hard to see that the number of linearly independent columns is the number of pivot elements, and the number of linearly independent rows is also the number of pivot elements. Therefore, in the reduced matrix, the number of linearly independent columns equals the number of linearly independent rows. Since the row operations change neither quantity, those two quantities must be equal in every matrix.
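The conclusion is easy to check numerically. A small sketch of my own (not from the answer) using NumPy: the rank of a matrix always equals the rank of its transpose, which is exactly the row-rank = column-rank statement.

```python
import numpy as np

# The third row is the sum of the first two, so the rank is 2.
A = np.array([[1, 2, 3],
              [4, 5, 6],
              [5, 7, 9]])

# matrix_rank computes the rank via SVD, but the equality below
# witnesses the fact above: row rank and column rank coincide.
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(A.T))  # → 2 2
```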
Consequently, one can define the “rank” of a matrix either as the number of linearly independent rows or as the number of linearly independent columns, and it's the same thing either way.