I would say that the textbook's proof is better because it proves exactly what needs to be proven, without invoking facts about row operations along the way. To see why, it may help to write out all of the definitions at work here, along with the facts that get used.
Definitions:
- $A$ is invertible if there exists a matrix $A^{-1}$ such that $AA^{-1} = A^{-1}A = I$
- The vectors $v_1,\dots,v_n$ are linearly independent if the only solution to $x_1v_1 + \cdots + x_n v_n = 0$ (with $x_i \in \Bbb R$) is $x_1 = \cdots = x_n = 0$.
Textbook Proof:
Fact: With $v_1,\dots,v_n$ referring to the columns of $A$, the equation $x_1v_1 + \cdots + x_n v_n = 0$ can be rewritten as $Ax = 0$. (This is true by the definition of matrix multiplication.)
Now, suppose that $A$ is invertible. We want to show that the only solution to $Ax = 0$ is $x = 0$ (and by the above fact, we'll have proven the statement).
Multiplying both sides by $A^{-1}$ gives us
$$
Ax = 0 \implies A^{-1}Ax = A^{-1}0 \implies x = 0
$$
So, we may indeed state that the only $x$ with $Ax = 0$ is the vector $x = 0$.
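As a quick sanity check (not part of the proof), here is a minimal NumPy sketch with a made-up invertible matrix, verifying numerically that $Ax = 0$ forces $x = 0$:

```python
import numpy as np

# A made-up invertible matrix (det = -2, so A^{-1} exists).
A = np.array([[2.0, 1.0],
              [4.0, 1.0]])

# Solving Ax = 0 directly returns only the zero vector,
# so the columns of A are linearly independent.
x = np.linalg.solve(A, np.zeros(2))
print(x)  # [0. 0.]

# Mirroring the proof: multiplying Ax = 0 by A^{-1} recovers x = 0.
A_inv = np.linalg.inv(A)
print(A_inv @ (A @ x))  # [0. 0.]
```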
Your Proof:
Fact: With $v_1,\dots,v_n$ referring to the columns of $A$, the equation $x_1v_1 + \cdots + x_n v_n = 0$ can be rewritten as $Ax = 0$. (This is true by the definition of matrix multiplication.)
Fact: If $A$ is invertible, then $A$ is row-equivalent to the identity matrix.
Fact: If $R$ is the row-reduced version of $A$, then $R$ and $A$ have the same nullspace. That is, $Rx = 0$ and $Ax = 0$ have the same solutions.
From the above facts, we conclude that if $A$ is invertible, then $A$ is row-equivalent to $I$, so $Ax = 0$ and $Ix = 0$ have the same solutions. Since the only solution to $Ix = 0$ is $x = 0$ (the columns of $I$ are linearly independent), the only solution to $Ax = 0$ is $x = 0$, and the columns of $A$ must be linearly independent.
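To illustrate those facts on a concrete made-up matrix, a short SymPy sketch: the row-reduced form of an invertible matrix is $I$, and $A$ and its row-reduced form have the same (here trivial) nullspace.

```python
import sympy as sp

# A made-up invertible matrix.
A = sp.Matrix([[2, 1],
               [4, 1]])

# rref() returns the row-reduced echelon form and the pivot columns.
R, pivots = A.rref()
print(R)  # Matrix([[1, 0], [0, 1]]) -- row-equivalent to I

# Row operations preserve the nullspace; both are trivial here.
print(A.nullspace())  # []
print(R.nullspace())  # []
```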
For the sake of contradiction, suppose there is a set of vectors $\{u_1, \ldots, u_{n+1}\}$ in an $n$-dimensional vector space $V$ such that $u_1, \ldots, u_{n+1}$ are linearly independent. Let $B = \{v_1, \ldots, v_n\}$ be a basis for $V$. Then $B$ spans $V$, and we can write
$$
u_1 = a_1v_1 + \cdots + a_nv_n
$$
Since $\{u_1, \ldots, u_{n+1}\}$ is linearly independent, it follows that no $u_i$ can be zero. This implies that there is at least one $j$ such that $a_j \neq 0$. Without loss of generality, assume that $j = 1$. Then we may write
$$
v_1 = \frac{1}{a_1}u_1 - \frac{a_2}{a_1}v_2 - \cdots - \frac{a_n}{a_1}v_n \tag{1}
$$
Now let $B_1 = \{u_1, v_2, \ldots, v_n\}$. Since $B$ spans $V$, we can write
$$
v = \lambda_1v_1 + \cdots + \lambda_nv_n
$$
for any $v\in V$. By $(1)$, we have
$$
\begin{aligned}
v & = \lambda_1\left(\frac{1}{a_1}u_1 - \frac{a_2}{a_1}v_2 - \cdots - \frac{a_n}{a_1}v_n\right) + \lambda_2v_2 + \cdots + \lambda_nv_n \\
& = \frac{\lambda_1}{a_1}u_1 - \frac{\lambda_1 a_2}{a_1}v_2 + \lambda_2v_2 + \cdots - \frac{\lambda_1 a_n}{a_1}v_n + \lambda_nv_n \\
& = \frac{\lambda_1}{a_1}u_1 + \left(\lambda_2 - \frac{\lambda_1 a_2}{a_1}\right)v_2 + \cdots + \left(\lambda_n - \frac{\lambda_1 a_n}{a_1}\right)v_n \\
& = \lambda_1'u_1 + \lambda_2'v_2 + \cdots + \lambda_n'v_n
\end{aligned}
$$
Thus, we can write any $v \in V$ in terms of the elements of $B_1$, which means $B_1$ spans $V$.
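As a concrete made-up instance of this step in $\Bbb R^3$: take the standard basis for $B$ and a $u_1$ with $a_1 \neq 0$; the matrix whose columns are the elements of $B_1$ then has full rank, so $B_1$ spans.

```python
import numpy as np

# Standard basis of R^3 playing the role of B = {v1, v2, v3}.
v1, v2, v3 = np.eye(3)

# u1 = 2*v1 + 3*v2, so a1 = 2 != 0, as the argument requires.
u1 = 2 * v1 + 3 * v2

# B1 = {u1, v2, v3}: full column rank means B1 spans R^3.
B1 = np.column_stack([u1, v2, v3])
print(np.linalg.matrix_rank(B1))  # 3
```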
Suppose we have obtained $B_{i - 1} = \{u_1,\ldots, u_{i-1},v_i,\ldots, v_n\}$ and have shown that it spans $V$. Then we may write
$$
u_i = a_1u_1 + \cdots + a_{i-1}u_{i-1} + a_iv_i + \cdots + a_nv_n
$$
for some $a_1, \ldots, a_n \in \mathbb{R}$. Since $u_i$ is a non-zero vector, there must be a $k$ such that $a_k \neq 0$. Let $j$ be the largest index for which $a_j \neq 0$. This $j$ must satisfy $j \geq i$, for if $j < i$, then $a_i = a_{i+1} = \ldots = a_n = 0$, which implies that
$$
u_i = a_1u_1 + \cdots + a_{i-1}u_{i-1}
$$
which contradicts the fact that $\{u_1, \ldots, u_{n+1}\}$ is linearly independent. Without loss of generality (reindexing $v_i, \ldots, v_n$ if necessary), assume $j = i$. Then we can exchange $v_i$ for $u_i$ in $B_{i-1}$ to obtain $B_i = \{u_1,\ldots, u_{i-1}, u_i,v_{i+1},\ldots, v_n\}$, which can be shown to span $V$ by a substitution similar to the one using $(1)$ above.
Continue to do this until the $n$th step, where $B_n = \{u_1, u_2, \ldots, u_n\}$. Previously, we showed that if $B_{i-1}$ spans $V$, then our operation of switching $u_i$ with $v_i$ to get $B_i$ also makes $B_i$ span $V$. Therefore, $B_n$ must span $V$ (by induction starting with $B_1$). Since $u_{n+1}$ is in $V$, and since $B_n$ spans $V$, we can write
$$
u_{n+1} = a_1u_1 + \cdots + a_nu_n
$$
for some $a_1, \ldots, a_n \in \mathbb{R}$. But this contradicts that $\{u_1, \ldots, u_{n+1}\}$ is linearly independent. Therefore, it must not be the case that $\{u_1, \ldots, u_{n+1}\}$ is linearly independent, which is to say that $\{u_1, \ldots, u_{n+1}\}$ must be linearly dependent. $$\tag*{$\blacksquare$}$$
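The conclusion can also be observed numerically: stacking $n+1$ vectors from $\Bbb R^n$ as the columns of a matrix, the rank is at most $n$, so a nontrivial dependency always exists. A minimal sketch with random made-up vectors:

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(0)

# Four vectors in R^3 (n = 3, so n + 1 = 4), stacked as columns.
U = rng.standard_normal((3, 4))

# rank(U) <= 3 < 4, so Ux = 0 has a nonzero solution:
print(np.linalg.matrix_rank(U))  # at most 3

# A nullspace vector exhibits the linear dependency explicitly.
x = null_space(U)[:, 0]
print(np.allclose(U @ x, 0))  # True, with x != 0
```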
Best Answer
As stated in the comments, this does not hold in general, but it does hold if $A$ is a real matrix. (For a complex counterexample, take $A = \begin{pmatrix} 1 \\ i \end{pmatrix}$: its single column is nonzero, yet $A^TA = 1 + i^2 = 0$.)
Claim 1: $\ker(A) = \ker(A^TA)$
The inclusion $\ker(A) \subseteq \ker(A^TA)$ is clear. For the reverse inclusion, let $v \in \ker(A^TA)$. Then $0 = \langle v, A^TAv \rangle = \langle Av, Av \rangle = \|Av\|^2$, so $Av = 0$, i.e. $v \in \ker(A)$.
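A quick numerical check of Claim 1, with a made-up real matrix whose columns are deliberately dependent so that both kernels are nontrivial and can be compared:

```python
import numpy as np
from scipy.linalg import null_space

# Made-up real matrix; the second column is twice the first,
# so ker(A) is one-dimensional.
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])

ker_A   = null_space(A)        # orthonormal basis of ker(A)
ker_AtA = null_space(A.T @ A)  # orthonormal basis of ker(A^T A)

# Both kernels are spanned by the same direction (up to sign).
print(ker_A.shape[1], ker_AtA.shape[1])             # 1 1
print(np.allclose(np.abs(ker_A), np.abs(ker_AtA)))  # True
```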
Claim 2: a square matrix $A$ is invertible iff it is injective (as a linear map, given bases in the domain and codomain). Note that a linear map, or a matrix, $A$ is injective iff $\ker(A) = \{0\}$, i.e. the kernel is trivial.
So since $\ker(A)$ is trivial (as $A$ has linearly independent columns), $\ker(A^TA)$ is also trivial by Claim 1, and the square matrix $A^TA$ is therefore invertible by Claim 2.
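For a concrete made-up example of the conclusion: a tall real matrix with independent columns gives an invertible $A^TA$, which is exactly what makes the least-squares normal equations solvable.

```python
import numpy as np

# Tall made-up matrix with linearly independent columns.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

AtA = A.T @ A              # square 2x2 matrix
print(np.linalg.det(AtA))  # 3.0, nonzero, so A^T A is invertible

# Example use: solving the normal equations A^T A x = A^T b.
b = np.array([1.0, 2.0, 3.0])
x = np.linalg.solve(AtA, A.T @ b)
print(x)
```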