[Math] Equivalent statements regarding a square matrix

Tags: linear-algebra, matrices

Today in my lecture, the professor wrote that:

The following statements are equivalent for an $n\times n$ square matrix $A$:

  1. $A$ is invertible
  2. $\mathrm{rank}(A)=n$
  3. The system $AX=0$ only has the trivial solution
  4. The system $AX=B$ has a unique solution for every $B$.

EDIT : this is introductory linear algebra. We have only covered systems of linear equations and inverse matrices thus far.
I have no idea why any of these statements is true, or how they are equivalent to one another.

Best Answer

We'll show the equivalence by showing $1\implies 4 \implies 3 \implies 2 \implies 1$.

  1. ($1 \implies 4$): If $A$ is invertible, then $A^{-1}$ is well-defined. So we can multiply both sides of $AX = B$ on the left by $A^{-1}$ to get $X = A^{-1}B$, the unique solution.
  2. ($4 \implies 3$): We can see by inspection that $X = 0$ is a solution to $AX = 0$ for any $A$, and by assumption, $AX = B$ has a unique solution for any $B$. Hence, the only solution to $AX = 0$ must be $X = 0$.
  3. ($3 \implies 2$): You might have heard of the rank-nullity theorem: $$\text{rank}(A) + \text{null}(A) = n$$ for any matrix with $n$ columns. If the only solution to $AX = 0$ is $X = 0$, then by definition the nullity of $A$ is zero, and the rank-nullity theorem then gives $\text{rank}(A) = n$. Since that explanation may feel a little unsatisfying, here is the intuition behind the theorem. Think of any $n \times m$ matrix as a linear transformation (a mapping) from $\mathbb{R}^m \to \mathbb{R}^n$, and think of every point in the domain $\mathbb{R}^m$ as a vector to be mapped. One of the most important insights in linear algebra is that the product $AX$ is a linear combination of $A$'s columns. The rank of a matrix is just the dimension of the column space, so rank-nullity is saying that if you pass in a set of $m$ linearly independent vectors $\{X_1, X_2, \cdots, X_m\}$, each $X_i$ satisfies either $AX_i = 0$ ($X_i$ lies in the nullspace) or $AX_i \neq 0$ ($X_i$ maps to a nontrivial element of the column space).

  4. ($2 \implies 1$): If $\text{rank}(A) = n$, then $A$ can be reduced by a sequence of elementary row operations $E_1, E_2, \cdots, E_k$ to the identity matrix. Every elementary row operation is invertible. But $$E_k E_{k-1} \cdots E_1 A = I \implies A = E_1^{-1} E_2^{-1} \cdots E_k^{-1},$$ and one can check that the matrix $B = E_k E_{k-1} \cdots E_1$ has the property that $BA = I$ and $AB = I$, so we have found $A^{-1} = B$.
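As a sanity check (not part of the proof), all four statements can be verified numerically on a concrete invertible matrix. Here is a quick sketch with NumPy, using an arbitrary $2 \times 2$ example:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])  # an invertible 2x2 example (det = 1)

# 1. A is invertible: np.linalg.inv succeeds because det(A) != 0
A_inv = np.linalg.inv(A)

# 2. rank(A) = n
assert np.linalg.matrix_rank(A) == A.shape[0]

# 3. AX = 0 has only the trivial solution X = 0
X0 = np.linalg.solve(A, np.zeros(2))
assert np.allclose(X0, 0)

# 4. AX = B has the unique solution X = A^{-1} B for any B
B = np.array([3.0, 2.0])
X = np.linalg.solve(A, B)
assert np.allclose(X, A_inv @ B)
assert np.allclose(A @ X, B)
```

Of course, a single example proves nothing; it only illustrates how the four statements travel together.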
