[Math] If a function converges at infinity, does that imply that it increases/decreases at a diminishing rate

calculus, inequality

If I have a function, say $f(x)$, and I can prove that the function decreases with $x$ and that it converges to a constant as $x \to \infty$, does that prove that it decreases at a decreasing rate? In other words, does $f(i+1) < f(i) \text{ and } \lim_{i\to\infty} f(i) = c$ imply $f(i+1) - f(i+2) < f(i) - f(i+1)$?
I don't see how it could be any other way, but I often overlook things.
Also, would this be the same as saying that if I can prove the first derivative is negative and the function converges, then the second derivative must be positive?

Best Answer

The rate of decrease will tend to zero in the limit, but that doesn't mean the rate of decrease is itself always decreasing. In other words, $f\to c$ implies $\Delta f\to0$ (the forward difference) and monotonicity ensures $\Delta f<0$, but it's still possible for $\Delta^2 f $ (the second difference) to alternate sign.

Here's a graphic to visualize how it's possible (on a local scale): [hand-drawn sketch of the black and green dots described below]

In the above, the black dots represent a monotone decreasing sequence (which we'll say converges to something). Between each pair of consecutive black dots we insert a green dot: each green dot is slightly lower than the previous black dot and slightly higher than the next black dot (I put red and blue lines in to make this more apparent), so the green and black dots together form a new monotone decreasing sequence. However, look at the slopes of the orange and purple lines: they alternate between shallow and steep, over and over. But the slopes represent the forward difference $\Delta a_n = a_{n+1}-a_n$, so this means the second forward difference $\Delta^2 a_n$ keeps changing sign!
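To make this concrete, here is a small numerical sketch of the same construction (the halving black sequence and the 0.1 offset for the green dots are illustrative choices of mine, not part of the original picture): it builds the interleaved sequence, checks that every first difference is negative, and shows that the second differences alternate in sign.

```python
# Black dots: b_k = 1/2^k, a monotone decreasing sequence converging to 0.
black = [1 / 2**k for k in range(8)]

seq = []
for k in range(len(black) - 1):
    gap = black[k] - black[k + 1]
    seq.append(black[k])               # black dot
    seq.append(black[k] - 0.1 * gap)   # green dot: just below this black dot,
                                       # still above the next black dot
seq.append(black[-1])

# First forward differences: all negative, so the sequence is monotone decreasing.
d1 = [seq[i + 1] - seq[i] for i in range(len(seq) - 1)]

# Second forward differences: their signs alternate, so the "rate of decrease"
# is not itself monotone.
d2 = [d1[i + 1] - d1[i] for i in range(len(d1) - 1)]

print("all first differences negative:", all(d < 0 for d in d1))
print("signs of second differences:", ["+" if d > 0 else "-" for d in d2])
```

Running this prints `True` for the first check and an alternating pattern of `-` and `+` for the second differences, even though the sequence is strictly decreasing and converges to $0$.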
