Here's an intuitive way of thinking about the problem.
(1) The $x^2$ on the outside causes the function to vanish rapidly, but the $1/x^2$ inside the sine function causes the oscillation to be similarly rapid. This balance turns out to be just enough to produce unbounded variation, as the variation behaves similarly to the harmonic series. How so?
It will suffice to consider the interval $[0,1]$. Here $\sin(1/x^2) = 0$ when
\begin{equation}x = \frac{1}{\sqrt{\pi n}}\end{equation}
and $\sin(1/x^2) = 1$ when
\begin{equation}x = \frac{\sqrt{2/\pi}}{\sqrt{4n+1}}\end{equation}
(in both cases, take $n \geq 1$ so that the denominators are nonzero).
Now "throw" these points into a partition. In other words, build a sequence of partitions containing these values for larger and larger $n$, and compute the variation of each. For the points of the form
\begin{equation}x = \frac{1}{\sqrt{\pi n}}\end{equation}
the function vanishes. The variation is therefore at least the sum of $x^2$ over the points of the form
\begin{equation}x = \frac{\sqrt{2/\pi}}{\sqrt{4n+1}}\end{equation}
which is
\begin{equation}\sum_{n=n_{0}}^{k} \frac{2}{\pi (4n+1)}\end{equation}
which, like the harmonic series, diverges as $k \to \infty$.
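This divergence is easy to check numerically. Here is a minimal sketch (the helper name `variation_lower_bound` is just for illustration):

```python
import math

# Lower bound on the total variation of f(x) = x^2 * sin(1/x^2) on (0, 1]:
# between consecutive zeros, f climbs from 0 to its peak value x_n^2 at
# x_n = sqrt(2 / (pi * (4n + 1))), contributing at least
# x_n^2 = 2 / (pi * (4n + 1)) to the variation.
def variation_lower_bound(k, n0=1):
    return sum(2.0 / (math.pi * (4 * n + 1)) for n in range(n0, k + 1))

for k in (10, 100, 1000, 10000):
    print(k, variation_lower_bound(k))  # grows without bound, like log(k) / (2*pi)
```

Each tenfold increase in $k$ adds roughly $\ln(10)/(2\pi) \approx 0.37$ to the sum, the logarithmic growth characteristic of a harmonic-type series.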
(2) In this case, the function vanishes faster than it oscillates. This gives bounded variation, in a form similar to that of the convergent sum $\sum 1/n^2$.
It will suffice to consider the interval $[0,1]$, as the mirror case is identical.
$\sin(1/x)= 0$ when
\begin{equation}x = \frac{1}{\pi n}\end{equation}
and $\sin(1/x) = 1$ when
\begin{equation}x = \frac{2}{\pi (4n+1)}\end{equation}
again making sure that the denominator is nonzero. Using the same technique as before, we construct a sequence of partitions where the $\sin(1/x)$ term either vanishes or equals one. The variation (of a particular partition in the sequence) is then the following sum
\begin{equation}\sum_{n=n_{0}}^{k} \frac{4}{\pi^2 (4n+1)^2}\end{equation}
which converges as $k \to \infty$ like $\sum 1/n^2$.
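A parallel numerical sketch for this bounded case (again, the helper name is just for illustration); the partial sums stabilize quickly:

```python
import math

# Contribution of the peaks of g(x) = x^2 * sin(1/x) on (0, 1]:
# the peak at x_n = 2 / (pi * (4n + 1)) contributes
# x_n^2 = 4 / (pi^2 * (4n + 1)^2), the general term of a convergent series.
def peak_sum(k, n0=1):
    return sum(4.0 / (math.pi ** 2 * (4 * n + 1) ** 2) for n in range(n0, k + 1))

for k in (10, 100, 1000, 10000):
    print(k, peak_sum(k))  # partial sums settle toward a finite limit
```

Unlike the previous sum, going from $k = 1000$ to $k = 10000$ changes the value by less than $10^{-3}$, reflecting the $\sum 1/n^2$-like convergence.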
This technique extends easily to the general case $x^{k}\sin(1/x^{n})$, and it draws an interesting parallel between the vanishing/oscillation speeds of such functions and series of the form $\sum 1/n^{m}$.
The cool point of the problem is that for any positive $M$ and positive $\epsilon$ we can find some $x$ with $0 < |x| < \epsilon$ such that $|f'(x)| > M$, and yet $f$ is differentiable at $x=0$.
In fact, $f'(0) = 0$, and we can see that by applying the definition of derivative:
$$
\left. \frac{df(x)}{dx} \right|_{x=0} = \lim_{h\rightarrow 0} \frac{h^2 \sin\frac{1}{h^2}-f(0)}{h} = \lim_{h\rightarrow 0} h \sin\frac{1}{h^2} = 0
$$
since $|\sin\frac{1}{h^2}|$ is bounded by $1$ and is multiplied by $h$.
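The squeeze $|h\sin\frac{1}{h^2}| \leq |h|$ can be seen numerically with a small sketch:

```python
import math

# Difference quotient of f(x) = x^2 * sin(1/x^2) at 0 (with f(0) = 0):
# (f(h) - f(0)) / h = h * sin(1/h^2), squeezed between -|h| and |h|.
def diff_quotient(h):
    return h * math.sin(1.0 / h ** 2)

for h in (1e-1, 1e-3, 1e-5):
    q = diff_quotient(h)
    print(h, q)
    assert abs(q) <= abs(h)  # the squeeze that forces f'(0) = 0
```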
On the other hand, look at the expression for $f'(x)$ when $x \neq 0$:
$$
f'(x) = 2x \sin\frac{1}{x^2} - \frac{2}{x}\cos\frac{1}{x^2}.
$$
At points where $\frac{1}{x^2} = n\pi$, the sine term vanishes and $|\cos\frac{1}{x^2}| = 1$, so
$$
|f'(x)| = \frac{2}{x},
$$
and this is unbounded as $x$ approaches zero.
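A quick numerical sketch of this blow-up: evaluating $f'$ at the points $x_n = 1/\sqrt{n\pi}$, where $\cos\frac{1}{x^2} = \pm 1$, shows the derivative growing without bound as $x_n \to 0$.

```python
import math

# f'(x) = 2x * sin(1/x^2) - (2/x) * cos(1/x^2) for x != 0.
def fprime(x):
    return 2 * x * math.sin(1 / x ** 2) - (2 / x) * math.cos(1 / x ** 2)

# At x_n = 1/sqrt(n*pi) the cosine factor has magnitude 1, so
# |f'(x_n)| is approximately 2/x_n = 2*sqrt(n*pi), which blows up
# as n grows, even though f'(0) = 0.
for n in (10, 1000, 100000):
    x = 1 / math.sqrt(n * math.pi)
    print(n, x, fprime(x))
```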
The term "finite" here stresses that the derivative is "defined", "well defined", or "exists as a real number".
To better understand why this is important, consider the following naive attempt to give an example of a function with unbounded derivative on a closed interval:
$$f(x) = \sqrt{x}$$
The derivative, $$\dfrac{1}{2\sqrt{x}},$$ is clearly unbounded on $(0,1]$, but it is not even defined at zero.
The goal of the book is to rule out such examples. It could technically say "defined and unbounded", or even simply "unbounded", but some might argue that the derivative is "defined as $\infty$" at zero (as opposed to utterly undefined, as when, for example, the function is not even continuous at the point). To avoid any ambiguity, the usual wording is "finite", i.e. defined as a real number.
This is actually why the example in your book is interesting. Once naive examples such as $\sqrt{x}$ are ruled out, one might wonder whether any function with an unbounded derivative on a closed interval can exist.
I remember perfectly that when my teacher asked this in class, I thought that if such an example existed, it would surely involve a function that gets steeper and steeper and "ends up being infinite" somewhere in the interval, so I answered that it was impossible.
But guess what, it's possible, as the example in your book shows.