[Math] How do derivatives describe asymptotes

calculus

While delivering pizzas at work today I started thinking about how I could develop a good method of fitting data points to a function/curve. I came up with a method that would work for creating simple polynomials such as $$f(x) = x^3+x^2+x+1$$ but quickly realized it would fail entirely if the data fit a function like $\sqrt x$, since that has infinitely many nonzero derivatives.

This really got me thinking about how a function could even have infinitely many nonzero derivatives, i.e. how the slope of a function could be changing at a rate that’s changing at a rate that’s changing at a rate… and so on to infinity. Then it dawned on me that this is why asymptotes occur (and after thinking about this all day, I literally just realized that $\sqrt x$ doesn’t even have an asymptote… I think I need some sleep… but still, I’m confused)!

At first, since I was working this all out in my head while also focusing on the road, I mistakenly concluded that every derivative of $\sqrt x$ is negative except for the first one, which is positive. I thought this made perfect sense in terms of creating asymptotes, but then my friend pointed out that the derivatives actually alternate between negative and positive, and claimed that THAT’S why there’s an asymptote. This also made perfect sense to me, and so I concluded that the curve of $\sqrt x$ has an asymptote because each derivative causes the derivative right before it to approach 0, so the first derivative (the slope of the curve) must also be approaching 0.
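(As a quick sanity check of my friend's claim, here is a minimal SymPy sketch, not part of my original reasoning: the successive derivatives of $\sqrt x$ really do alternate in sign for $x>0$.)

```python
# Minimal SymPy sketch: verify that the derivatives of sqrt(x)
# alternate in sign for x > 0.
import sympy as sp

x = sp.symbols('x', positive=True)
f = sp.sqrt(x)

for n in range(1, 6):
    d = sp.diff(f, x, n)
    # The sign is the same for every x > 0, so sampling x = 4 suffices.
    print(n, d, sp.sign(d.subs(x, 4)))
```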

Despite that, the idea of a function having an asymptote when the first derivative is positive and the rest are negative also made perfect sense to me, because I figured each negative derivative would sort of “pull” the original function down, but wouldn’t pull it past a horizontal orientation, since the first derivative is positive and can only approach 0.

Clearly, I’m just totally lost right now. So finally, here are my main questions:

  1. Now that I’ve realized $\sqrt x$ doesn’t even have an asymptote, where is the flaw in the logic that each derivative makes the one before it approach 0 and thus an asymptote occurs?

  2. If that logic isn’t what causes an asymptote, what is (in terms of a function’s derivatives)?

  3. In the case of my original flawed logic (where a theoretical function has a positive first derivative and infinitely many negative derivatives after that), would this function just be a straight horizontal line? I mean, if every derivative after the first is negative, wouldn’t the second derivative approach negative infinity after any infinitesimal increase of $x$? And thus wouldn’t the first derivative approach zero, giving a line that’s essentially completely horizontal, yet somehow slightly tilted?

Sorry in advance for the long and confusing post, but I’d really appreciate any information that could help me build a better overall understanding of functions, their graphs, their derivatives, and even just the concept of infinity.

Thanks 🙂

Best Answer

An asymptote (horizontal or oblique) occurs when a line fits the curve at infinity:

$$\lim_{x\to\infty}(f(x)-(ax+b))=0.$$

The slope $a$ (the angular coefficient) can be found by writing

$$\lim_{x\to\infty}\frac{f(x)-(ax+b)}x=0,$$

or

$$\lim_{x\to\infty}\frac{f(x)}x=a,$$ if that limit exists, and the intercept from

$$\lim_{x\to\infty}(f(x)-ax)=b,$$ if that limit exists.

The first limit can also be evaluated by L'Hôpital's rule (provided its conditions of application are fulfilled):

$$a=\lim_{x\to\infty}\frac{f(x)}x=\lim_{x\to\infty}\frac{f'(x)}1.$$

This is how the first derivative comes into play. For this reason, one also says that an asymptote is a tangent at infinity. In other words, there is an asymptote if the tangent to the curve tends to a particular straight line as $x$ goes to infinity.
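As an illustration (a sketch of my own, not part of the original answer), the two limits and the derivative check can be computed symbolically; here for $\sqrt{x^2+1}$, which also appears in the examples below:

```python
# Sketch: compute the asymptote slope a and intercept b from the limit
# definitions, and check the slope again via the derivative (L'Hopital).
import sympy as sp

x = sp.symbols('x')
f = sp.sqrt(x**2 + 1)  # example function, also used below

a = sp.limit(f / x, x, sp.oo)            # lim f(x)/x
b = sp.limit(f - a * x, x, sp.oo)        # lim (f(x) - a*x)
a_from_derivative = sp.limit(sp.diff(f, x), x, sp.oo)

print(a, b, a_from_derivative)  # 1 0 1  ->  asymptote y = x
```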

$a$ exists if the function doesn't grow faster than linearly, i.e. if the ratio $f(x)/x$ is bounded and convergent (to zero in the horizontal case). This is a necessary but not sufficient condition for an asymptote. Then $b$ exists if the difference $f(x)-ax$ between the function and the linear approximation is itself bounded and convergent.


A few examples:

$f(x)=x^2$ has no asymptote because $\dfrac{x^2}x$ is unbounded (so is $f'(x)=2x$).

$f(x)=x\sin(x)$ has no asymptote because $\dfrac{x\sin x}x$ does not converge (nor does $\sin x+x\cos x$).

$f(x)=\sqrt x$ has no asymptote: $a=0$ holds, but $b$ does not exist. In fact, $\lim_{x\to\infty}(\sqrt x-0\cdot x)=\infty$, so one could at best say the "asymptote" is a line at infinity.

$f(x)=\sqrt{x^2+1}$ has an asymptote because $a=1$ holds (both from $\dfrac{f(x)}x$ and from $f'(x)$), and $\lim_{x\to\infty}\left(\sqrt{x^2+1}-x\right)=0$, so the asymptote is the line $y=x$.
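These cases can be checked mechanically. The following is a small verification sketch of my own (the `asymptote` helper is an illustration, not something from the answer):

```python
# Verification sketch for the examples above. Returns (a, b) when the
# asymptote y = a*x + b exists, and None when either limit fails.
import sympy as sp

x = sp.symbols('x')

def asymptote(f):
    a = sp.limit(f / x, x, sp.oo)
    if not a.is_finite:
        return None              # slope limit diverges (e.g. f = x**2)
    b = sp.limit(f - a * x, x, sp.oo)
    if not b.is_finite:
        return None              # intercept limit diverges (e.g. sqrt(x))
    return a, b

print(asymptote(x**2))               # None
print(asymptote(sp.sqrt(x)))         # None: a = 0 but b is infinite
print(asymptote(sp.sqrt(x**2 + 1)))  # (1, 0): asymptote y = x
```

The oscillating example $x\sin x$ is omitted from the sketch, since SymPy reports its non-convergent limit as a range of accumulation points rather than a number, which would need special handling.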


Notice that in the above discussion the higher-order derivatives are never used. In particular, their signs play no role. For example, $\dfrac{\sin x}x$ has a horizontal asymptote, while all of its derivatives keep changing sign.
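A quick check of this last point (again a sketch of mine, not from the original answer):

```python
# sin(x)/x approaches 0 (horizontal asymptote y = 0), even though its
# derivative keeps changing sign as x grows.
import sympy as sp

x = sp.symbols('x')
f = sp.sin(x) / x

print(sp.limit(f, x, sp.oo))  # 0
print(sp.diff(f, x))          # cos(x)/x - sin(x)/x**2, oscillating
```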

So the answer to your title question is "they don't".
