Row rank is invariant under elementary operations

linear algebra

Suppose $A$ is a matrix of size $m\times n$.

My book gives the following definition of the row rank of $A$: we say that $A$ has row rank equal to $r$ if there is a collection of $r$ linearly independent rows that is maximal in the following sense: no collection of more than $r$ rows is linearly independent. Denote the row rank of $A$ by $r_1(A)$.

I am trying to prove that $r_1(A)$ is invariant under elementary
operations over rows and columns.

The case of elementary column operations is trivial, but I have some difficulties with elementary row operations. The case where the elementary operation is multiplication of some row by $\lambda\neq 0$ was easy. Let's consider the elementary operation where the $i$th row is replaced by the sum of the $i$th and $j$th rows. More precisely:

Let $A$ be an $m\times n$ matrix with $r_1(A)=r$ and $0<r<m$. Let $\mathbf{a}^1,\dots,\mathbf{a}^m$ be its rows, and WLOG assume that the first $r$ of them form a maximal linearly independent collection. Let $B$ be the matrix obtained from $A$ by the following elementary operation:

$$A=\begin{bmatrix}
\mathbf{a}^1 \\
\mathbf{a}^2 \\
\vdots \\
\mathbf{a}^m \\
\end{bmatrix} \ \longrightarrow \begin{bmatrix}
\mathbf{a}^1+\mathbf{a}^{r+1} \\
\mathbf{a}^2 \\
\vdots \\
\mathbf{a}^r \\
\vdots \\
\mathbf{a}^m \\
\end{bmatrix}=B$$

In other words, I have just replaced the row $\mathbf{a}^1$ with $\mathbf{a}^1+\mathbf{a}^{r+1}$, where $\mathbf{a}^{r+1}$ is not in the maximal linearly independent collection $\{\mathbf{a}^1,\dots, \mathbf{a}^r\}$.

Can anyone explain to me how to show that $r_1(B)=r$? I tried to prove that in $B$ the rows $\mathbf{a}^1+\mathbf{a}^{r+1},\mathbf{a}^2, \dots,\mathbf{a}^r$ are linearly independent, but I failed.
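As a quick numerical sanity check of the claim I am trying to prove, here is a small experiment using numpy's `matrix_rank` as a stand-in for $r_1$ (just an illustration, not a proof):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.integers(-3, 4, size=(4, 6)).astype(float)
r1_A = np.linalg.matrix_rank(A)

# Easy kind of elementary row operation: scale a row by lambda != 0.
B1 = A.copy()
B1[2] *= -5.0

# Hard kind of elementary row operation: replace row 1 by row 1 + row 4.
B2 = A.copy()
B2[0] += A[3]

print(r1_A == np.linalg.matrix_rank(B1))  # True
print(r1_A == np.linalg.matrix_rank(B2))  # True
```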

I would be very grateful for help. Please give the answer using only these definitions.

BTW this is not a duplicate of the question.

Best Answer

The vectors $\mathbf{a}^1+\mathbf{a}^{r+1},\mathbf{a}^2, \dots,\mathbf{a}^r$ in general will not be linearly independent because it might be that $\mathbf{a}^{r+1}=-\mathbf{a}^1$, so you get the zero vector in the set, making it automatically dependent.
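Here is a concrete numerical instance of that failure mode (an illustration added for clarity, with $r=2$ and $\mathbf{a}^3=-\mathbf{a}^1$):

```python
import numpy as np

a1 = np.array([1.0, 0.0])
a2 = np.array([0.0, 1.0])
a3 = -a1                          # the degenerate case: a^{r+1} = -a^1

A = np.vstack([a1, a2, a3])       # r_1(A) = 2
B = np.vstack([a1 + a3, a2, a3])  # elementary operation: row 1 -> row 1 + row 3

# {a^1 + a^3, a^2} contains the zero vector, hence is dependent...
print(np.linalg.matrix_rank(np.vstack([a1 + a3, a2])))  # 1
# ...but {a^3, a^2} is independent, so r_1(B) is still 2.
print(np.linalg.matrix_rank(np.vstack([a3, a2])))       # 2
print(np.linalg.matrix_rank(B))                         # 2
```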

However, if $\mathbf{a}^1+\mathbf{a}^{r+1},\mathbf{a}^2, \dots,\mathbf{a}^r$ are dependent, then the vectors $\mathbf{a}^{r+1},\mathbf{a}^2, \dots,\mathbf{a}^r$ will be independent. The proof is below; I tried hiding it with a spoiler tag so that you could try it yourself first, but the tag doesn't seem to work when MathJax is involved.


Suppose that both sets $\{\mathbf{a}^1+\mathbf{a}^{r+1},\mathbf{a}^2, \dots,\mathbf{a}^r\}$ and $\{\mathbf{a}^{r+1},\mathbf{a}^2, \dots,\mathbf{a}^r\}$ are dependent. Then there are two groups of coefficients, $\alpha_1,\dots,\alpha_r$ and $\beta_1,\dots,\beta_r$ such that in each group not all coefficients are zero and the following holds:

$$\alpha_1(\mathbf{a}^1+\mathbf{a}^{r+1})+\alpha_2\mathbf{a}^2+\cdots+\alpha_r\mathbf{a}^r=0,\\ \beta_1\mathbf{a}^{r+1}+\beta_2\mathbf{a}^2+\cdots+\beta_r\mathbf{a}^r=0. $$

If $\beta_1=0$, then all the other betas are zero as well, because any subset of $\{\mathbf{a}^1,\mathbf{a}^2, \dots,\mathbf{a}^r\}$ is independent; this contradicts the choice of the betas, so $\beta_1\neq0$ and we can divide the second equation by $\beta_1$. To reduce clutter, I'll write it as $$\mathbf{a}^{r+1}=\gamma_2\mathbf{a}^2+\cdots+\gamma_r\mathbf{a}^r.$$ Now plug that into the first equation and regroup to obtain: $$\alpha_1\mathbf{a}^1+(\alpha_1\gamma_2+\alpha_2)\mathbf{a}^2+\cdots+(\alpha_1\gamma_r+\alpha_r)\mathbf{a}^r=0.$$ The independence of $\{\mathbf{a}^1,\mathbf{a}^2, \dots,\mathbf{a}^r\}$ forces all these coefficients to be zero, so $\alpha_1=0$ and $$0=\alpha_1\gamma_j+\alpha_j=\alpha_j,\text{ for }j=2,\dots,r.$$ But this contradicts that at least one of the alphas must be nonzero. Therefore, at least one of $\{\mathbf{a}^1+\mathbf{a}^{r+1},\mathbf{a}^2, \dots,\mathbf{a}^r\}$ and $\{\mathbf{a}^{r+1},\mathbf{a}^2, \dots,\mathbf{a}^r\}$ is independent.
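This dichotomy can be sanity-checked numerically (an illustration with numpy, not part of the argument): take independent rows $\mathbf{a}^1,\dots,\mathbf{a}^r$, pick $\mathbf{a}^{r+1}$ in their span, and verify that at least one of the two sets has rank $r$.

```python
import numpy as np

rng = np.random.default_rng(0)
r, n = 3, 5
for _ in range(200):
    rows = rng.standard_normal((r, n))     # a^1,...,a^r (independent a.s.)
    a_r1 = rng.standard_normal(r) @ rows   # a^{r+1}: some vector in their span
    with_a1 = np.vstack([rows[0] + a_r1, rows[1:]])  # a^1+a^{r+1}, a^2,...,a^r
    without_a1 = np.vstack([a_r1, rows[1:]])         # a^{r+1},     a^2,...,a^r
    rk = max(np.linalg.matrix_rank(with_a1), np.linalg.matrix_rank(without_a1))
    assert rk == r   # at least one of the two sets is independent
print("dichotomy holds on all samples")
```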

This proves that the row rank of the matrix $B$ is at least $r$. We now prove that it is exactly $r$.


Suppose first that $\{\mathbf{a}^2, \dots,\mathbf{a}^r,\mathbf{a}^{r+1}\}$ is independent in $B$ but not maximal. Then it is contained in some maximal independent set for $B$, say $$\{\mathbf{a}^2, \dots,\mathbf{a}^r,\mathbf{a}^{r+1},\mathbf{b}^1,\dots,\mathbf{b}^s\}.$$ Any subset of this maximal independent set is itself independent, so if any of the rows $\mathbf{b}^1,\dots,\mathbf{b}^s$ were also a row of $A$, then $A$ would have row rank larger than $r$. Therefore, that set can contain at most one additional row: the only row of $B$ which is not a row of $A$, i.e. $\mathbf{b}^1=\mathbf{a}^1+\mathbf{a}^{r+1}$.

So suppose that $$\{\mathbf{a}^1+\mathbf{a}^{r+1},\mathbf{a}^2, \dots,\mathbf{a}^r,\mathbf{a}^{r+1}\}$$ is maximal independent in $B$. Then both sets $$\{\mathbf{a}^1,\mathbf{a}^2, \dots,\mathbf{a}^r\}\text{ and}\\ \{\mathbf{a}^2, \dots,\mathbf{a}^r,\mathbf{a}^{r+1}\} $$ are maximal independent in $A$, therefore from the first set we have $$\mathbf{a}^{r+1}=\alpha_1\mathbf{a}^{1}+\dots+\alpha_r\mathbf{a}^{r}$$ and from the second set we have $$\mathbf{a}^{1}=\beta_2\mathbf{a}^{2}+\dots+\beta_{r+1}\mathbf{a}^{r+1}.$$ This means that we can write $\mathbf{a}^{1}+\mathbf{a}^{r+1}$ as $$\mathbf{a}^{1}+\mathbf{a}^{r+1}=\mathbf{a}^1+\alpha_1\mathbf{a}^1+\cdots+\alpha_r\mathbf{a}^r=\\ (\text{plug in the expression for $\mathbf{a}^1$})=\\ \gamma_2\mathbf{a}^{2}+\cdots+\gamma_{r+1}\mathbf{a}^{r+1}.$$ But if we plug that into $\{\mathbf{a}^1+\mathbf{a}^{r+1},\mathbf{a}^2, \dots,\mathbf{a}^r,\mathbf{a}^{r+1}\}$, we get a dependent set!
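The dependence at the end of this step is easy to confirm numerically (again just an illustration, with $r=3$): any $r+1$ vectors lying in the span of $\mathbf{a}^1,\dots,\mathbf{a}^r$ must be dependent.

```python
import numpy as np

rng = np.random.default_rng(2)
r, n = 3, 5
rows = rng.standard_normal((r, n))     # a^1, a^2, a^3: independent a.s.
a_r1 = rng.standard_normal(r) @ rows   # a^{r+1} lies in their span

# The candidate "maximal independent set" {a^1+a^{r+1}, a^2, a^3, a^{r+1}}:
S = np.vstack([rows[0] + a_r1, rows[1], rows[2], a_r1])

# Four vectors inside an r = 3 dimensional row space cannot be independent.
print(np.linalg.matrix_rank(S))   # 3, not 4
```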

If $\{\mathbf{a}^1+\mathbf{a}^{r+1},\mathbf{a}^2, \dots,\mathbf{a}^r\}$ is independent in $B$ but not maximal, then again it is contained in some maximal independent set for $B$ which, for the same reasons as previously, can contain only one additional row $\mathbf{b}$, which is also a row of $A$ and such that $\{\mathbf{a}^2, \dots,\mathbf{a}^r,\mathbf{b}\}$ is independent (in $A$). Now the same reasoning applies: if the set $$\{\mathbf{a}^1+\mathbf{a}^{r+1},\mathbf{a}^2, \dots,\mathbf{a}^r,\mathbf{b}\}$$ is independent in $B$, then both sets $$\{\mathbf{a}^1,\mathbf{a}^2, \dots,\mathbf{a}^r\}\text{ and}\\ \{\mathbf{a}^2, \dots,\mathbf{a}^r,\mathbf{b}\} $$ are maximal independent in $A$, so we can use the first set to express $\mathbf{a}^{r+1}$ and the second set to express $\mathbf{a}^1$, and then combine the two expressions so that we get $$\mathbf{a}^1+\mathbf{a}^{r+1}=\delta_2\mathbf{a}^2+\cdots+\delta_{r}\mathbf{a}^{r}+\delta\mathbf{b},$$ implying that $\{\mathbf{a}^1+\mathbf{a}^{r+1},\mathbf{a}^2, \dots,\mathbf{a}^r,\mathbf{b}\}$ is dependent.
