Linear Algebra – Properties of a Matrix with Dependent Row Vectors

linear-algebra, matrices, vector-spaces

When a column vector in a matrix is made up of a "combination" of its other column vectors, it is said to be linearly dependent. Say…

$$
A=\begin{bmatrix}
2 & 1 & 0\\
4 & 5 & -6\\
3 & 1 & 1
\end{bmatrix}
$$
$$
1\begin{bmatrix}
2\\
4\\
3
\end{bmatrix}-2\begin{bmatrix}
1\\
5\\
1
\end{bmatrix}=\begin{bmatrix}
0\\
-6\\
1
\end{bmatrix}

$$

Otherwise, it is linearly independent. Being linearly dependent, the matrix is singular, so its system of equations may have infinitely many solutions or no solution at all, depending on the right-hand side. Being linearly independent, the matrix is nonsingular: its columns span the entire $R^{n}$ space and the system has a unique solution.
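As a quick numerical check (a sketch using NumPy, which is my choice of tool here, not part of the question), the dependent columns of the example matrix $A$ force its determinant to zero and its rank below 3:

```python
import numpy as np

# The matrix A from the example above, whose third column
# equals 1*(column 1) - 2*(column 2).
A = np.array([[2, 1,  0],
              [4, 5, -6],
              [3, 1,  1]])

print(np.linalg.det(A))          # ~0: A is singular
print(np.linalg.matrix_rank(A))  # 2 < 3: the columns are dependent
```

A zero determinant confirms that $A$ cannot be inverted, which is exactly the "infinitely many or no solutions" situation described above.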

Then I was just thinking: what happens if a row vector in a matrix is made up of a "combination" of its other row vectors? Say…

$$P=\begin{bmatrix}
2 & 5 & 1\\
12 & 13 & 3\\
8 & 3 & 1
\end{bmatrix}
$$
$$
2\begin{bmatrix}
2 & 5 & 1
\end{bmatrix}+1\begin{bmatrix}
8 & 3 & 1
\end{bmatrix}=\begin{bmatrix}
12 & 13 & 3
\end{bmatrix}
$$

Does a matrix have any special properties too if its row vectors are linearly dependent or linearly independent?

Thanks!

Best Answer

Yes. The following are equivalent for a square matrix $A$:

  1. $A$ is non-singular

  2. the rows of $A$ are linearly independent

  3. the columns of $A$ are linearly independent
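This equivalence can be illustrated numerically on the matrix $P$ from the question (a sketch using NumPy, which is my own choice of tool). Since the row rank of a matrix always equals its column rank, dependent rows force dependent columns too:

```python
import numpy as np

# The matrix P from the question: row 2 = 2*(row 1) + 1*(row 3).
P = np.array([[ 2,  5, 1],
              [12, 13, 3],
              [ 8,  3, 1]])

# Dependent rows make P singular...
print(np.linalg.det(P))            # ~0: P is singular

# ...and since row rank = column rank, the columns
# of P must be linearly dependent as well.
print(np.linalg.matrix_rank(P))    # 2: rank of the rows
print(np.linalg.matrix_rank(P.T))  # 2: rank of the columns, the same
```

So checking either the rows or the columns for dependence answers the same question: whether the matrix is singular.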
