Given $A\in\mathbb{R}^{m\times n}$, $m\geq n$, compute the (economy) QR factorisation. This gives
$$
A = QR, \quad R\in\mathbb{R}^{n\times n}.
$$
Now if $\mathrm{rank}(A)<n$, the upper triangular matrix $R$ has a staircase profile, with some of the "steps" of the staircase spanning more than one column. Select column indices $j_1,\ldots,j_k$ such that removing these columns from $R$ leaves a nonsingular upper triangular matrix (you can think of this as shortening every step of the staircase to length 1). The columns $j_1,\ldots,j_k$ can then be expressed as linear combinations of the remaining columns.
Example: The red columns indicate the columns which are linear combinations of the others.
$$
\begin{bmatrix}
\times & \times & \color{red}\times & \times & \color{red}\times & \color{red}\times \\
0 & \times & \color{red}\times & \times & \color{red}\times & \color{red}\times \\
0 & 0 & \color{red}0 & \times & \color{red}\times & \color{red}\times
\end{bmatrix}
$$
Example: For the given matrix from the question, the QR factorisation is:
$$
Q = \begin{bmatrix}
0 & -0.4472 & -0.8944 \\
0 & -0.8944 & 0.4472 \\
-1.0000 & 0 & 0
\end{bmatrix},
\qquad
R = \begin{bmatrix}
-1.0000 & 2.0000 & -1.0000 \\
0 & 4.4721 & -2.2361 \\
0 & 0 & 0
\end{bmatrix}
$$
So one can remove either column 2 or column 3 to leave a nonsingular upper triangular block (hence either column 2 or column 3 is a linear combination of the other columns).
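For concreteness, here is a short NumPy sketch of the procedure. The matrix `A` below is a hypothetical rank-deficient example (not necessarily the one from the question), and the signs of the computed factors may differ from the output quoted above:

```python
import numpy as np

# Hypothetical rank-deficient example: column 3 = column 2 - 2 * column 1.
A = np.array([[1., 2., 0.],
              [0., 1., 1.],
              [1., 0., -2.],
              [2., 1., -3.]])

# Economy QR: Q is 4x3 with orthonormal columns, R is 3x3 upper triangular.
Q, R = np.linalg.qr(A, mode='reduced')

# Zero diagonal entries of R (here R[2, 2]) mark the "long steps" of the
# staircase; for this example column 3 is a combination of columns 1 and 2.
tol = 1e-10 * np.abs(np.diag(R)).max()
dependent = [j for j in range(R.shape[1]) if abs(R[j, j]) < tol]
print("diag(R):", np.diag(R))
print("dependent columns (0-based):", dependent)
```

When the dependent columns do not conveniently follow the columns they depend on, a column-pivoted QR (e.g. `scipy.linalg.qr(A, pivoting=True)`) is the more robust way to read off the rank, but the plain economy QR suffices for the staircase situation described above.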
I would say that the textbook's proof is better because it proves what needs to be proven without using facts about row operations along the way. To see that this is the case, it may help to write out all of the definitions at work here, and all the facts that get used along the way.
Definitions:
- $A$ is invertible if there exists a matrix $A^{-1}$ such that $AA^{-1} = A^{-1}A = I$
- The vectors $v_1,\dots,v_n$ are linearly independent if the only solution to $x_1v_1 + \cdots + x_n v_n = 0$ (with $x_i \in \Bbb R$) is $x_1 = \cdots = x_n = 0$.
Textbook Proof:
Fact: With $v_1,\dots,v_n$ referring to the columns of $A$, the equation $x_1v_1 + \cdots + x_n v_n = 0$ can be rewritten as $Ax = 0$. (This is true by definition of matrix multiplication)
Now, suppose that $A$ is invertible. We want to show that the only solution to $Ax = 0$ is $x = 0$ (and by the above fact, we'll have proven the statement).
Multiplying both sides by $A^{-1}$ gives us
$$
Ax = 0 \implies A^{-1}Ax = A^{-1}0 \implies x = 0
$$
So, we may indeed state that the only $x$ with $Ax = 0$ is the vector $x = 0$.
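As a quick sanity check of the statement itself (not of the proof), one can verify with SymPy that an invertible matrix has a trivial null space; the matrix below is a hypothetical example chosen only for illustration:

```python
import sympy as sp

# Hypothetical invertible matrix used only to illustrate the statement.
A = sp.Matrix([[2, 1, 0],
               [1, 3, 1],
               [0, 1, 2]])

assert A.det() != 0     # A is invertible
print(A.nullspace())    # [] -- the only solution of A x = 0 is x = 0
```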
Your Proof:
Fact: With $v_1,\dots,v_n$ referring to the columns of $A$, the equation $x_1v_1 + \cdots + x_n v_n = 0$ can be rewritten as $Ax = 0$. (This is true by definition of matrix multiplication)
Fact: If $A$ is invertible, then $A$ is row-equivalent to the identity matrix.
Fact: If $R$ is the row-reduced version of $A$, then $R$ and $A$ have the same nullspace. That is, $Rx = 0$ and $Ax = 0$ have the same solutions.
From the above facts, we conclude that if $A$ is invertible, then $A$ is row-equivalent to $I$, so $Ax = 0$ and $Ix = 0$ have the same solutions. Since the only solution of $Ix = 0$ is $x = 0$, the same holds for $Ax = 0$, and so the columns of $A$ are linearly independent.
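The two row-reduction facts used here can likewise be illustrated (not proven) in SymPy, again with a hypothetical invertible matrix:

```python
import sympy as sp

# The same kind of illustrative invertible matrix as above.
A = sp.Matrix([[2, 1, 0],
               [1, 3, 1],
               [0, 1, 2]])

R, pivots = A.rref()                    # row-reduce A
print(R == sp.eye(3))                   # True: A is row-equivalent to I
print(A.nullspace() == R.nullspace())   # True: same (trivial) null space
```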
Best Answer
Suppose $A$ is an $m \times n$ matrix with linearly independent columns. Let $L_A$ be the linear transformation defined by $L_A(x) = Ax$. Because the columns of $A$ are linearly independent, the null space of $L_A$ is trivial. Hence, $L_A$ is one-to-one. It follows that $L_A$ has a left inverse $K:R(A) \to \mathbb R^n$. Although the linear transformation $K$ is only defined on a subspace of $\mathbb R^m$, it can be extended to a linear transformation $T:\mathbb R^m \to \mathbb R^n$. This linear transformation $T$ also satisfies $$ T \circ L_A = I $$ where $I$ is the identity transformation on $\mathbb R^n$. Finally, if $M$ is the matrix representation of $T$ (with respect to the standard bases of $\mathbb R^m$ and $\mathbb R^n$) then $$ M A = I. $$
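To make the last step concrete: for a full-column-rank $A$, one explicit choice of such an $M$ is $M = (A^\top A)^{-1} A^\top$, which coincides with the Moore–Penrose pseudoinverse in this case. This is only one of many valid matrix representations of an extension $T$; the sketch below uses a hypothetical $3 \times 2$ matrix:

```python
import numpy as np

# Hypothetical 3x2 matrix with linearly independent columns.
A = np.array([[1., 0.],
              [1., 1.],
              [0., 2.]])

# For full column rank, the pseudoinverse equals (A^T A)^{-1} A^T,
# which is one concrete left inverse of A.
M = np.linalg.pinv(A)
print(np.allclose(M @ A, np.eye(2)))   # True: M A = I
```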