Three vectors are linearly independent if, and only if, the only combination giving the null vector is the trivial one $0u+0v+0w=0$, i.e. you cannot make the null vector from any other combination of them. Since $2u+3v-w=(8,6,-4)+(6,-18,21)-(14,-12,17)=(0,0,0)$, they are linearly dependent. Your approach is totally correct; your book must be wrong.
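(For a quick numerical check, here is a small Python sketch; the individual vectors $u=(4,3,-2)$, $v=(2,-6,7)$, $w=(14,-12,17)$ are simply read off from the combination above.)

```python
import numpy as np

# Vectors read off from the computation above: 2u = (8,6,-4), 3v = (6,-18,21), w = (14,-12,17)
u = np.array([4, 3, -2])
v = np.array([2, -6, 7])
w = np.array([14, -12, 17])

# A non-trivial combination that gives the null vector, hence the set is dependent
print(2*u + 3*v - w)   # [0 0 0]
```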
It is true that two vectors are dependent if they "point in the same (or opposite) direction", i.e. if they are aligned.
But that criterion is not the whole story for three vectors in $3$D or more.
When the three vectors are aligned, i.e. parallel, i.e. scalar multiples of each other, they are certainly dependent.
But the definition of linear dependence of three vectors is wider than being parallel: it also includes the case in which they are co-planar, although not parallel.
To see that geometrically, take the three vectors as position vectors from the origin: if they span a full $3$D parallelepiped they are independent; if instead the parallelepiped collapses into a flat figure or a segment, the vectors are dependent.
Algebraically, this translates into whether the matrix formed by the three vectors has full rank ($3$) or not.
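As a sketch of that rank test, using the same vectors as above (my reading of the question's $u,v,w$):

```python
import numpy as np

# Rows are the three vectors; full rank 3 would mean independence
M = np.array([[4, 3, -2],
              [2, -6, 7],
              [14, -12, 17]])

print(np.linalg.matrix_rank(M))   # 2 < 3: the vectors are dependent
print(np.linalg.det(M))           # ~0: the parallelepiped is flat
```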
Similarly for $n$ vectors of $m$ dimensions.
Then from the theory of linear systems you know that, if the matrix of a homogeneous system has full rank, the only solution is $(0,0,\cdots,0)$, which corresponds to the combination coefficients all being null.
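To connect the two viewpoints, here is a sketch of the homogeneous system $Ac=0$ with the same assumed vectors as the columns of $A$; a non-trivial solution is exactly a dependence relation.

```python
from sympy import Matrix

# Columns of A are u, v, w; we solve the homogeneous system A c = 0 exactly
A = Matrix([[4, 2, 14],
            [3, -6, -12],
            [-2, 7, 17]])

print(A.rank())        # 2: not full rank, so non-trivial solutions exist
print(A.nullspace())   # [Matrix([[-2], [-3], [1]])]: -2u - 3v + w = 0, i.e. 2u + 3v - w = 0
```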
In reply to your comment: in ${\mathbb R}^2$, if you have two non-aligned (hence independent) vectors, then a third one will lie in the same plane (the $x,y$ plane).
In the geometric interpretation, the parallelepiped (the hull) will be flat, i.e. of dimension $2$, which is less than $3$, the number of vectors.
In the algebraic interpretation, a $3 \times 2$ matrix cannot have rank greater than $2$: so $3$ (or more) $2$D vectors are necessarily dependent.
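A minimal illustration with three arbitrarily chosen $2$D vectors (my own example, not taken from the question):

```python
import numpy as np

# Three vectors in R^2 stacked as rows: a 3x2 matrix has rank at most 2
P = np.array([[1, 2],
              [3, -1],
              [4, 5]])

print(np.linalg.matrix_rank(P))   # 2 < 3, so the three vectors must be dependent
```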
Final note (to clarify what might be the source of your confusion):
The (in)dependence of $n$ vectors in ${\mathbb R}^m$ is defined for the whole set of $n$ vectors: they might be dependent even though a subset of them ($q<n, \; q\le m$) is independent. Conversely, if one of them depends on another (or on two others, etc.), then the whole set is dependent.
And in fact it is a common task, given $n$ vectors, to find which of them form an independent subset: one looks for a minor of the matrix with non-null determinant, the largest such minor giving the rank.
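For instance (still with the vectors assumed above as columns), the pivot columns of the reduced row echelon form, which is an equivalent way of locating a largest non-vanishing minor, pick out such a maximal independent subset:

```python
from sympy import Matrix

# Columns are u, v, w; the pivot columns of the RREF give an independent subset of maximal size
A = Matrix([[4, 2, 14],
            [3, -6, -12],
            [-2, 7, 17]])

_, pivots = A.rref()
print(pivots)   # (0, 1): the first two columns, u and v, are independent; w depends on them
```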
Best Answer
Since the domain of the functions is not specified, I will take it to be the whole real line.
Suppose $ae^{-x}+b(-2e^{2x}+5e^{-x})=0$ for all $x$. Divide throughout by $e^{2x}$ and take the limit as $x \to \infty$. You get $b=0$, from which you can see that $a$ must also be $0$. Hence, the first set is linearly independent.
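In detail, dividing by $e^{2x}$ gives
$$a e^{-3x} + b\left(-2 + 5 e^{-3x}\right) = 0 ,$$
and letting $x \to \infty$ makes the $e^{-3x}$ terms vanish, leaving $-2b = 0$; with $b=0$ the original equation reduces to $ae^{-x}=0$ for all $x$, so $a=0$.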
Suppose $ax|x|+bx^{2}=0$ for all $x$. Put $x=1$ to get $a+b=0$ and $x=-1$ to get $-a+b=0$. Adding these gives $2b=0$, so $b=0$ and then $a=0$. So the second set is also linearly independent.