Why are the columns of
$\begin{bmatrix}1&0&0&0\\0&1&0&0\\0&0&1&0\end{bmatrix}$
linearly dependent?
Because there exists a non-zero $x$ such that
$\begin{bmatrix}1&0&0&0\\0&1&0&0\\0&0&1&0\end{bmatrix} x = 0$
i.e.
$\begin{bmatrix}1&0&0&0\\0&1&0&0\\0&0&1&0\end{bmatrix}\begin{bmatrix}
0\\0\\0\\1\end{bmatrix} = \begin{bmatrix} 0\\0\\0\end{bmatrix}$
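As a quick numerical sanity check, here is a minimal NumPy sketch (the matrix and the witness vector are exactly the ones written above):

```python
import numpy as np

# The 3x4 matrix from above; its fourth column is the zero vector.
A = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0],
              [0, 0, 1, 0]])

# A non-zero x in the null space witnesses the dependence of the columns.
x = np.array([0, 0, 0, 1])

print(A @ x)  # [0 0 0]: a non-trivial solution of Ax = 0
```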
How do you prove that any $3\times4$ matrix has linearly dependent columns?
Suppose the columns of your matrix are $\mathbf v_1,\mathbf v_2,\mathbf v_3,\mathbf v_4,$ and suppose that $\mathbf v_1,\mathbf v_2,\mathbf v_3$ are linearly independent. (If they are not, then the four columns are already linearly dependent and we are done.) Then we want to show that there exist $a,b,c$ such that $a\mathbf v_1 + b\mathbf v_2 + c\mathbf v_3 = \mathbf v_4.$
How to do that? It might help to show that there exist $a_1,b_1,c_1$ such that:
$a_1\mathbf v_1 + b_1\mathbf v_2 + c_1\mathbf v_3 = \begin{bmatrix} 1\\0\\0\end{bmatrix}$
and similarly there are $a_2,b_2, c_2$ and $a_3, b_3, c_3$ such that
$a_2\mathbf v_1 + b_2\mathbf v_2 + c_2\mathbf v_3 = \begin{bmatrix} 0\\1\\0\end{bmatrix}$
and
$a_3\mathbf v_1 + b_3\mathbf v_2 + c_3\mathbf v_3 = \begin{bmatrix} 0\\0\\1\end{bmatrix}$
And certainly $\mathbf v_4$ can be written as a combination of $\begin{bmatrix} 1\\0\\0\end{bmatrix}, \begin{bmatrix} 0\\1\\0\end{bmatrix},\begin{bmatrix} 0\\0\\1\end{bmatrix}$, since these form the standard basis of $\Bbb R^3$.
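Here is a minimal NumPy sketch of this argument for a generic $3\times4$ matrix. The random matrix and the use of `np.linalg.solve` are my own illustration, and it assumes the first three columns happen to be independent (almost surely true for a random matrix):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((3, 4))   # a generic 3x4 matrix
v1, v2, v3, v4 = M.T              # its four columns

# v1, v2, v3 are (almost surely) independent, hence a basis of R^3,
# so we can solve for a, b, c with a*v1 + b*v2 + c*v3 = v4.
a, b, c = np.linalg.solve(np.column_stack([v1, v2, v3]), v4)

print(np.allclose(a * v1 + b * v2 + c * v3, v4))  # True
```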
I would say that the textbook's proof is better because it proves what needs to be proven without using facts about row operations along the way. To see that this is the case, it may help to write out all of the definitions at work here, and all the facts that get used along the way.
Definitions:
- $A$ is invertible if there exists a matrix $A^{-1}$ such that $AA^{-1} = A^{-1}A = I$
- The vectors $v_1,\dots,v_n$ are linearly independent if the only solution to $x_1v_1 + \cdots + x_n v_n = 0$ (with $x_i \in \Bbb R$) is $x_1 = \cdots = x_n = 0$.
Textbook Proof:
Fact: With $v_1,\dots,v_n$ referring to the columns of $A$, the equation $x_1v_1 + \cdots + x_n v_n = 0$ can be rewritten as $Ax = 0$. (This is true by definition of matrix multiplication)
Now, suppose that $A$ is invertible. We want to show that the only solution to $Ax = 0$ is $x = 0$ (and by the above fact, we'll have proven the statement).
Multiplying both sides by $A^{-1}$ gives us
$$
Ax = 0 \implies A^{-1}Ax = A^{-1}0 \implies x = 0
$$
So, we may indeed state that the only $x$ with $Ax = 0$ is the vector $x = 0$.
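For what it's worth, here is a small NumPy check of this conclusion (the random matrix is my own example, and is only almost surely invertible):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))   # a random square matrix, almost surely invertible

# Mirror the argument: the unique solution of Ax = 0 is x = A^{-1} 0 = 0.
x = np.linalg.solve(A, np.zeros(4))
print(x)                               # [0. 0. 0. 0.]
print(np.linalg.matrix_rank(A) == 4)   # True: the columns are independent
```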
Your Proof:
Fact: With $v_1,\dots,v_n$ referring to the columns of $A$, the equation $x_1v_1 + \cdots + x_n v_n = 0$ can be rewritten as $Ax = 0$. (This is true by definition of matrix multiplication)
Fact: If $A$ is invertible, then $A$ is row-equivalent to the identity matrix.
Fact: If $R$ is the row-reduced version of $A$, then $R$ and $A$ have the same nullspace. That is, $Rx = 0$ and $Ax = 0$ have the same solutions.
From the above facts, we conclude that if $A$ is invertible, then $A$ is row-equivalent to $I$. Since $Ix = 0$ has only the solution $x = 0$, the third fact tells us that $Ax = 0$ has only the solution $x = 0$, and so by the first fact the columns of $A$ are linearly independent.
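A quick SymPy sketch of these two facts in action (the example matrix is my own, chosen to be invertible):

```python
import sympy as sp

# Fact 2: an invertible matrix is row-equivalent to the identity.
# Fact 3: row reduction preserves the solution set of Ax = 0.
A = sp.Matrix([[2, 1, 0],
               [1, 3, 1],
               [0, 1, 2]])   # det(A) = 8, so A is invertible

R, pivots = A.rref()
print(R == sp.eye(3))     # True: A row-reduces to I
print(A.nullspace())      # []: only the trivial solution, so columns independent
```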
Best Answer
$$A\mathbf{x}= 0$$
$$\begin{bmatrix}a_{1,1}&a_{1,2} &\cdots &a_{1,m}\\a_{2,1} &a_{2,2}&\cdots& a_{2,m} \\ \vdots &&&\vdots \\ a_{n,1} & a_{n,2}& \cdots& a_{n,m}\end{bmatrix} \begin{bmatrix}x_1\\x_2\\\vdots\\x_m\end{bmatrix} = 0_v$$
Let $\mathbf{a}_j$ be the $j$th column of $A$ and $0_v$ the zero vector. Since the columns are linearly independent, $\sum_{j=1}^{m}c_j\mathbf{a}_{j} \ne 0_v$ for any scalars $c_j$ that are not all zero. Computing the product above yields
$$x_1\begin{bmatrix} a_{1,1}\\\vdots\\a_{n,1} \end{bmatrix} + x_2\begin{bmatrix} a_{1,2}\\\vdots\\a_{n,2} \end{bmatrix} + \cdots + x_m\begin{bmatrix} a_{1,m}\\\vdots\\a_{n,m} \end{bmatrix} =0_v$$
Because $\sum_{j=1}^{m}c_j\mathbf{a}_{j} \ne 0_v$ whenever the $c_j$ are not all zero, the only possible value for each $x_i$ is $0$. Therefore the only solution is the trivial solution $\mathbf{x} = 0_v$.
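A short NumPy check of both steps of this answer (the $4\times3$ random matrix is my own illustration; its columns are independent almost surely):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 3))   # 4x3: three columns, almost surely independent
x = rng.standard_normal(3)

# A @ x is exactly the column combination x_1*a_1 + x_2*a_2 + x_3*a_3.
combo = sum(x[j] * A[:, j] for j in range(3))
print(np.allclose(A @ x, combo))                # True

# Independent columns <=> rank equals the number of columns,
# so Ax = 0 has only the trivial solution.
print(np.linalg.matrix_rank(A) == A.shape[1])   # True
```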