[Math] Determining the maximum number of linearly independent rows and columns for a given matrix

linear algebra, matrices, matrix equations

Here's the specific question:

For the matrix $$\begin{bmatrix}1&2 & 0 & 3\\ 1 & 2& 3 & 3 \\ 1 & 0 &
1 & 1 \\1 & 1 & 1 & 2 \end{bmatrix}$$ determine the maximum number of
linearly independent rows and the maximum number of linearly
independent columns.

I know that we say that rows/columns $\mathbf{x_1},\mathbf{x_2},\ldots,\mathbf{x_p}$ are called linearly independent if the only scalars $\lambda_1, \ldots,\lambda_p$ such that $$\lambda_1\mathbf{x_1}+\lambda_2\mathbf{x_2}+\ldots+\lambda_p\mathbf{x_p}=\mathbf{0}$$ are $\lambda_1=\lambda_2=\ldots=\lambda_p=0$.

Okay, but that leaves me 11 possible sets of columns (and another 11 sets of rows) to consider. First let us consider the columns $\mathbf{x_1},\mathbf{x_2},\mathbf{x_3}, \mathbf{x_4}$. Suppose that we have $\lambda_1\mathbf{x_1}+\lambda_2\mathbf{x_2}+\lambda_3\mathbf{x_3}+\lambda_4\mathbf{x_4}=\mathbf{0}$ for some scalars $\lambda_1,\lambda_2,\lambda_3,\lambda_4$. Then

$$\lambda_1\begin{bmatrix}1\\1\\1\\1\end{bmatrix} + \lambda_2\begin{bmatrix}2\\2\\0\\1\end{bmatrix} + \lambda_3\begin{bmatrix}0\\3\\1\\1\end{bmatrix} + \lambda_4\begin{bmatrix}3\\3\\1\\2\end{bmatrix} = \begin{bmatrix}0\\0\\0\\0\end{bmatrix}$$

and consequently
$$\begin{eqnarray}
\lambda_1 + 2\lambda_2 + 3\lambda_4 &=& 0 \\
\lambda_1 + 2\lambda_2 + 3\lambda_3 + 3\lambda_4 &=& 0 \\
\lambda_1 + \lambda_3 + \lambda_4 &=& 0 \\
\lambda_1 + \lambda_2 + \lambda_3 + 2\lambda_4 &=& 0
\end{eqnarray}$$

and $\lambda_1 = 1 , \lambda_2 = 1 , \lambda_3 = 0 , \lambda_4 =-1$ is clearly a non-trivial solution. Thus, the columns $\mathbf{x_1},\mathbf{x_2},\mathbf{x_3}, \mathbf{x_4}$ are not linearly independent.
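For a quick arithmetic check of that dependency, here is a minimal sketch (assuming SymPy is available; the names `A` and `lam` are just for illustration):

```python
from sympy import Matrix

# The matrix from the question; its columns are x1, x2, x3, x4.
A = Matrix([
    [1, 2, 0, 3],
    [1, 2, 3, 3],
    [1, 0, 1, 1],
    [1, 1, 1, 2],
])

# The proposed dependency: 1*x1 + 1*x2 + 0*x3 + (-1)*x4 = 0.
lam = Matrix([1, 1, 0, -1])
print(A * lam)  # Matrix([[0], [0], [0], [0]]) -> the columns are linearly dependent
```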

Do I need to repeat this for every possible group of columns and rows? Even if we consider only $3$ columns, we still get a homogeneous system of $4$ equations in $3$ unknowns. How do I tackle this problem?

This is a question from Basic Linear Algebra by Blyth and Robertson. The authors have defined row-reduced matrices and Hermite matrices in their quest for a systematic method of tackling systems of linear equations. Rank of matrices, vector spaces, linear mappings, and determinants are yet to be discussed.

Best Answer

Yes, in order to determine the number of linearly independent columns we need to solve the homogeneous system

$$Ax=0$$

and find the dimension of the null space of $A.$ Notably, if $x=0$ is the unique solution, then $\dim(N(A))=0$ and all $n$ columns are linearly independent. Otherwise, and more generally, if $\dim(N(A))=d$, the number of linearly independent columns is $n-d$.
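For the matrix in the question, for instance, one round of elimination (subtract row $1$ from rows $2$–$4$, then swap rows $2$ and $3$ and clear the last row) settles the system:

$$\begin{bmatrix}1&2&0&3\\ 1&2&3&3\\ 1&0&1&1\\ 1&1&1&2\end{bmatrix}
\longrightarrow
\begin{bmatrix}1&2&0&3\\ 0&0&3&0\\ 0&-2&1&-2\\ 0&-1&1&-1\end{bmatrix}
\longrightarrow
\begin{bmatrix}1&2&0&3\\ 0&-2&1&-2\\ 0&0&3&0\\ 0&0&0&0\end{bmatrix}$$

Three nonzero rows remain, so $\dim(N(A)) = 4 - 3 = 1$: back-substitution gives $\lambda_3 = 0$, $\lambda_2 = -\lambda_4$, $\lambda_1 = -\lambda_4$, so the null space is spanned by $(1,1,0,-1)$ (the dependency already found in the question), and there are $4-1=3$ linearly independent columns.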

For the rows we need to solve

$$A^Tx=0$$

and, since $A$ is square here, we'll find that $\dim(N(A))=\dim(N(A^T));$ that is, the maximum number of linearly independent rows is equal to the maximum number of linearly independent columns.

That's true for any matrix, and that common number is defined as the rank of $A$.
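If you want to double-check such a computation by machine, here is a minimal sketch, assuming SymPy is available (a verification aid only, not part of the book's method):

```python
from sympy import Matrix

# The matrix from the question.
A = Matrix([
    [1, 2, 0, 3],
    [1, 2, 3, 3],
    [1, 0, 1, 1],
    [1, 1, 1, 2],
])

print(A.rank())       # 3 -> maximum number of linearly independent columns
print(A.T.rank())     # 3 -> maximum number of linearly independent rows
print(A.nullspace())  # one basis vector, a scalar multiple of (1, 1, 0, -1)
```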
