[Math] Invertibility of submatrix

linear algebra, linear programming, matrices

Suppose I have a matrix $A \in \mathbb{R}^{m \times n}$ with $m \leq n$. All rows of $A$ are linearly independent, so $A$ has full row rank. I can decompose $A$ as $A = [B \mid N]$ with $B \in \mathbb{R}^{m \times m}$ and $N \in \mathbb{R}^{m \times (n-m)}$.

How can the following statement in Griffin, C; p. 70 be justified: "we know that $B$ is invertible since $A$ has a full row rank"? $B$ is invertible because it is constructed from $m$ linearly independent columns (Griffin, C; p. 45). Would it be fair to say that, if $A$ has full row rank, there will always exist some matrix $B$ such that $A = [B \mid N]$ (possibly after permuting the columns of $A$) with $B$ having linearly independent columns and therefore being invertible?

Best Answer

Yes, that is always true when $m \leq n$ and $A \in \mathbb{R}^{m \times n}$ has linearly independent rows.

To be more precise...

The column rank of a matrix $A$ is the maximum number of linearly independent column vectors of $A$.

The row rank of a matrix $A$ is the maximum number of linearly independent row vectors of $A$.

Equivalently, the column rank of $A$ is the dimension of the column space of $A$, while the row rank of $A$ is the dimension of the row space of $A$.

A result of fundamental importance in linear algebra is that the column rank and the row rank are always equal.

Since your matrix has row rank $m$, it also has column rank $m$, so you can always find $m$ linearly independent columns. Those columns (gathered together, possibly after a column permutation) form the matrix $B$ you are looking for.
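The selection above can be sketched numerically: greedily keep each column that increases the rank of the columns collected so far. A minimal illustration with NumPy, using a small made-up $2 \times 4$ matrix (the matrix and the rank-checking approach are assumptions for the example, not from the question):

```python
import numpy as np

# Hypothetical example: a 2x4 matrix A with full row rank (m = 2, n = 4).
A = np.array([[1., 2., 0., 3.],
              [0., 1., 1., 1.]])
m, n = A.shape
assert np.linalg.matrix_rank(A) == m  # A has full row rank

# Greedily pick columns that keep the rank growing; since
# row rank = column rank = m, m such columns always exist.
cols = []
for j in range(n):
    candidate = cols + [j]
    if np.linalg.matrix_rank(A[:, candidate]) == len(candidate):
        cols = candidate
    if len(cols) == m:
        break

B = A[:, cols]              # m x m submatrix with independent columns
print(cols)                 # indices of the chosen columns
print(np.linalg.det(B))     # nonzero determinant, so B is invertible
```

In exact-arithmetic terms this is the same argument as the answer; in floating point a rank-revealing QR factorization with column pivoting would be the more robust way to pick the columns.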

For a more detailed proof, see the Wikipedia article on the rank of a matrix.
