The idea for approaching a question like yours is that $f$ must oscillate heavily in the tail, while the amplitude steadily decreases to zero. Essentially, try to make $f$ a product of two factors: one damping the amplitude, ensuring the function goes to zero, and the other increasing the frequency of oscillation (keeping the amplitude constant), ensuring the derivative does not converge to zero.
For example, taking $\frac 1x$, which decays to zero, and $\sin (x^2)$, whose frequency of oscillation grows without bound, satisfies these conditions. (Note that $\sin x$ also oscillates, but $\frac 1x$ doesn't just decay: it also damps the oscillations, since its derivative is $x$ to a lower power. To counter this damping, we need increasingly rapid oscillations.) You can check that $\frac 1x \sin(x^2)$ is a counterexample. So is $\frac 1x \sin (x^3)$.
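If you want to see this numerically, here is a minimal sketch (assuming Python with only the standard library; the sample points $x_k = \sqrt{2\pi k}$ are chosen so that $\sin(x_k^2) \approx 0$ and $\cos(x_k^2) \approx 1$):

```python
import math

# Numerical sanity check of the counterexample f(x) = sin(x^2)/x:
# f decays to 0, but f' keeps returning to values near 2.
def f(x):
    return math.sin(x**2) / x

def f_prime(x):
    # product rule: f'(x) = -sin(x^2)/x^2 + 2*cos(x^2)
    return -math.sin(x**2) / x**2 + 2 * math.cos(x**2)

# At x_k = sqrt(2*pi*k), sin(x_k^2) is essentially 0 and cos(x_k^2) essentially 1.
for k in (100, 10_000, 1_000_000):
    x = math.sqrt(2 * math.pi * k)
    print(f"x = {x:10.2f}   f(x) = {f(x):+.2e}   f'(x) = {f_prime(x):+.5f}")
```

However large $x$ becomes, $f'$ is close to $2$ at these points, so it cannot converge to $0$.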
For fun, try to come up with conditions on $f$ and $g$ so that $h(x) = f(x) \sin g(x)$ is a counterexample to your assertion.
As you have noted, $f''$ being bounded implies that if $f(x) \to 0$ then $f'(x) \to 0$ as $x \to \infty$. However, there is a more general condition, weaker than $f''$ being bounded: the uniform continuity of $f'$. If $f'$ is assumed uniformly continuous, rather than differentiable with bounded derivative, then it still converges to $0$. You can try this as an exercise.
Answer to the exercise
Known as Barbalat's lemma is the following statement: if $f$ is continuous on $[a,\infty)$, differentiable on $(a,\infty)$, and $f'$ is uniformly continuous on $(a,\infty)$, then $\lim_{x \to \infty} f(x) = \alpha$ for some finite $\alpha$ implies that $\lim_{x \to \infty} f'(x) = 0$. In your question we actually have $\alpha = 0$, but the value of $\alpha$ does not matter: adding a constant to, or subtracting one from, $f$ changes the value of $\lim_{x \to \infty} f(x)$, while the derivative removes constants, so it does not change.
We prove this by contradiction. Suppose that $\lim_{x \to \infty} f'(x) \neq 0$ (or that the limit does not exist). Negating the definition of the limit equalling zero gives: there exists $\epsilon > 0$ such that for all $r$ there exists $x > r$ with $|f'(x)| > \epsilon$.
Now, take $r = 1, 2, \ldots$ in this statement to get points $x_i$ at which $|f'| > \epsilon$. There are infinitely many such points, so at least one of the sets $\{x_i : f'(x_i) < -\epsilon\}$ and $\{x_i : f'(x_i) > \epsilon\}$ is infinite. Without loss of generality, let us assume that $f'(x_i) > \epsilon$ for all $i$.
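To make this concrete, here is a small sketch (Python, using the counterexample $f(x) = \sin(x^2)/x$ with $\epsilon = 1$ as an assumed running example): for each $r = 1, \ldots, 5$ it scans to the right of $r$ until it finds a point where $f'$ exceeds $\epsilon$.

```python
import math

# For f(x) = sin(x^2)/x and epsilon = 1: for every threshold r there is a
# point x > r with f'(x) > epsilon, giving the sequence x_1, x_2, ... used
# in the argument above.
def f_prime(x):
    return -math.sin(x**2) / x**2 + 2 * math.cos(x**2)

eps = 1.0
points = []
for r in range(1, 6):
    x = r + 1e-3
    while f_prime(x) <= eps:   # scan to the right in small steps
        x += 1e-3
    points.append(x)
print(points)   # one point per threshold r, each with f'(x) > 1
```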
Now, we will see what happens if $f'$ is just continuous.
Since $f'$ is continuous at each $x_i$ and $f'(x_i) > \epsilon$, there exists $\delta_i$, depending on $x_i$, such that $|y - x_i| < \delta_i$ implies $f'(y) > \frac{\epsilon}{2}$.

Now, what happens under uniform continuity?

Since $f'$ is uniformly continuous, there exists a single $\delta > 0$, not depending on $x_i$, such that $|y - z| < \delta$ implies $|f'(y) - f'(z)| < \frac{\epsilon}{2}$. In particular, $|y - x_i| < \delta$ implies $f'(y) > f'(x_i) - \frac{\epsilon}{2} > \frac{\epsilon}{2}$.

Ok, so what extra is uniform continuity giving us? Not clear so far.

Consider the quantity $D_i = \int_{a}^{x_i + \delta} f' - \int_{a}^{x_i} f' = \int_{x_i}^{x_i + \delta} f'$. Since $f' > \frac{\epsilon}{2}$ on these intervals, we see that $D_i > \frac{\epsilon \delta}{2}$ for all $i$.

However, by the fundamental theorem of calculus, $D_i = f(x_i + \delta) - f(x_i)$. Since $\lim_{x \to \infty} f(x) = \alpha$, both terms on the right converge to $\alpha$, so $\lim_{i \to \infty} D_i = 0$. But this cannot happen: $D_i > \frac{\epsilon \delta}{2}$ for every $i$, so $D_i$ stays bounded away from zero. Contradiction.

What happens if we go back to mere continuity? The problem is that $D_i$ is now $\int_{x_i}^{x_i + \delta_i} f'$, and the bound becomes $D_i > \frac{\epsilon \delta_i}{2}$. Since the $\delta_i$ need not be bounded below, this no longer prevents $D_i$ from converging to zero; and that uniform lower bound was exactly the trick that produced the contradiction.
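One can actually watch these windows shrink in the merely-continuous setting. A sketch (Python, again assuming $f(x) = \sin(x^2)/x$, so that $f'(x) \approx 2\cos(x^2)$ for large $x$): $f' > 1$ roughly where $\cos(x^2) > \tfrac12$, i.e. for $x^2 \in (2\pi k - \pi/3,\, 2\pi k + \pi/3)$, and the length of the corresponding $x$-interval tends to zero.

```python
import math

# For f(x) = sin(x^2)/x we have f'(x) ~ 2*cos(x^2) for large x, so f' > 1
# roughly where cos(x^2) > 1/2, i.e. x^2 in (2*pi*k - pi/3, 2*pi*k + pi/3).
# The corresponding interval of x-values around x_k = sqrt(2*pi*k) shrinks:
lengths = []
for k in (10, 1_000, 100_000):
    left = math.sqrt(2 * math.pi * k - math.pi / 3)
    right = math.sqrt(2 * math.pi * k + math.pi / 3)
    lengths.append(right - left)
    print(f"k = {k:7d}   interval length delta_k ~ {right - left:.2e}")
```

So with plain continuity the windows on which $f'$ is large can shrink to nothing, and the integrals $D_i$ over them can still go to $0$.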
Counterexample. Let $f(x) = \frac 1x \sin(x^2)$. Then $f(x) \to 0$ as $x \to \infty$, but
$$
f'(x) = - \frac{1}{x^2} \sin(x^2) + \frac 1x \cos(x^2) \cdot 2x
= - \frac{1}{x^2} \sin(x^2) + 2 \cos(x^2),
$$
which does not converge to $0$ as $x \to \infty$. The obstruction here lies, of course, in the second derivative, which is unbounded.
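If you want to double-check the derivative formula, a central finite difference agrees with it (a minimal sketch, assuming Python with only the standard library):

```python
import math

# Sanity check: compare the analytic derivative of f(x) = sin(x^2)/x
# against a central finite difference at a few sample points.
def f(x):
    return math.sin(x**2) / x

def f_prime(x):
    return -math.sin(x**2) / x**2 + 2 * math.cos(x**2)

h = 1e-6
for x0 in (2.0, 5.0, 9.3):
    numeric = (f(x0 + h) - f(x0 - h)) / (2 * h)
    # the two values should agree to several decimal places
    print(f"x = {x0}   analytic = {f_prime(x0):+.8f}   numeric = {numeric:+.8f}")
```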
Problem with your reasoning. For each $x > 0$, you find just one point $c_x \in [x,x+a]$ for which $f'(c_x)$ is small. And indeed, if you fix any sequence $x_n \to \infty$, your construction yields a corresponding sequence $c_n \in [x_n,x_n+a]$ that satisfies $c_n \to \infty$ and $f'(c_n) \to 0$. But there is no reason why every sequence $y_n \to \infty$ should be obtained as $c_n$ in this way.
In a way, this may be surprising. Your family of points $c_x$ is indexed by all $x > 0$, and intuition may suggest that these $c_x$ should cover at least the whole half-line $[a,\infty)$. But this intuition is wrong! It would be true if $x \mapsto c_x$ were a continuous function, but it need not be. The best way to see this is to study a counterexample.
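A numeric sketch of this phenomenon (Python, assuming the counterexample $f(x) = \sin(x^2)/x$ and $a = 1$): the mean value theorem gives $c_x \in [x, x+1]$ with $f'(c_x) = f(x+1) - f(x)$, and this value tends to $0$, yet $\sup_{[x,x+1]} |f'|$ stays near $2$, so the points $c_x$ certainly miss most of each interval.

```python
import math

# Mean value theorem on [x, x+1] for f(x) = sin(x^2)/x: there is some
# c_x in [x, x+1] with f'(c_x) = f(x+1) - f(x), and that value tends to 0,
# while the supremum of |f'| over the same interval stays near 2.
def f(x):
    return math.sin(x**2) / x

def f_prime(x):
    return -math.sin(x**2) / x**2 + 2 * math.cos(x**2)

for x in (10.0, 100.0, 1000.0):
    mvt_value = f(x + 1) - f(x)                      # equals f'(c_x) for some c_x
    grid = [x + i / 10_000 for i in range(10_001)]   # sample [x, x+1] densely
    sup_fp = max(abs(f_prime(t)) for t in grid)
    print(f"x = {x:7.1f}   f'(c_x) = {mvt_value:+.2e}   sup|f'| ~ {sup_fp:.3f}")
```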
How to solve this. As you noted yourself, this is the easy part. This question was already asked and answered, e.g. here.
Best Answer
Isn't $-e^{-x}$ such an example?