[Math] Understanding limits and how to interpret the meaning of “arbitrarily close”

derivatives, intuition, limits

I have read several introductory notes on limits of functions, and in all of them the notion of the limit of a function $f(x)$ is introduced by discussing what happens to the value of $f$ as $x$ approaches a given value, say $x=a$. In doing so they use phrases of the form "if $\lim_{x\rightarrow a}f(x)=L$ exists, this means that by taking a value of $x$ sufficiently close to $a$ (but not equal to $a$), we can make $f(x)$ arbitrarily close to $L$". What confuses me is this: if one has the result $\lim_{x\rightarrow a}f(x)=L$, does this mean that one should read "arbitrarily close" as "equal to"? Is it that, since values of $x$ arbitrarily close to $x=a$ lead to values of $f$ arbitrarily close to $L$, we can conclude that the limiting value of $f$ is exactly equal to $L$?

The primary reason I ask is that the derivative is defined as the limit of a difference quotient that is itself undefined at the point we are approaching. How is one to interpret the limiting value of this difference quotient, $$\lim_{\Delta x\rightarrow 0}\frac{f(x+\Delta x)-f(x)}{\Delta x}=f'(x),$$ and how can one state that this limiting value is exactly equal to the slope of the tangent line at the point $x$, equivalently the instantaneous rate of change of $f$ with respect to $x$ at that point? How can we be certain that this is true?
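As a concrete instance of what puzzles me: with $f(x)=x^2$ the quotient is $$\frac{(x+\Delta x)^2-x^2}{\Delta x}=\frac{2x\,\Delta x+(\Delta x)^2}{\Delta x}=2x+\Delta x \qquad (\Delta x\neq 0),$$ which is undefined at $\Delta x=0$, and yet we declare its limit to be exactly $2x$.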

I feel like I might be missing something important here. If anyone can enlighten me it would be much appreciated!

Best Answer

Let me be very frank here. For a beginner studying limits for the first time (say, around age 16), the terms "sufficiently close" and "arbitrarily close" are very difficult to handle.

This is primarily because a student at this stage of learning is acquainted mainly with algebraic simplifications and manipulations, and the main focus in algebra is on the operations $+,-,\times,/,=$. Thus most of their mathematical study is based on establishing equality between two expressions (via the rules of the common arithmetic operations).

In Calculus the symbols $+,-,\times,/,=$ take a back seat and the focus shifts to inequalities (a point rarely emphasized in calculus textbooks). Calculus, or analysis, is fundamentally based on order relations like $<$ and $>$ rather than on arithmetical operations. Here we are not so much concerned with whether $a = b$ or not, but rather with how near/close $a$ is to $b$ when we already know that $a \neq b$. A measure of this nearness/closeness is given by the expression $|a - b|$ (which is something easily handled by students trained in algebra).
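For instance, $$|2.9 - 3| = 0.1 \quad \text{while} \quad |2.99 - 3| = 0.01,$$ so $2.99$ is ten times closer to $3$ than $2.9$ is; the size of $|a - b|$ is exactly what "closeness" means here.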

The next issue is that students often fail to grasp the significance of the fact that there is no smallest positive rational/real number (although they know this fact and can supply the proof very easily). Because of this fact, if $a \neq b$ then the expression $|a - b|$ can be made as small a positive value as we like by a suitable choice of $a, b$. Thus we can choose two distinct numbers $a$ and $b$ which are as close to each other as we please.
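To make this concrete: fixing $b$ and taking $a = b + 10^{-n}$ gives $$|a - b| = 10^{-n},$$ which can be made smaller than any prescribed positive number by choosing $n$ large enough, even though $a \neq b$ for every $n$.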

Calculus/analysis builds on such phrases as "as close ... as we please" and introduces the terms sufficiently close and arbitrarily close, and for this purpose the very powerful notion of functional dependence is used. So let the numbers $a, b$ of the previous paragraph depend on some other variable. To simplify things, let $a$ depend on another number $x$ via the relation $a = f(x)$, and let us keep $b$ fixed. Thus we have a way to choose different values of $a$ by changing the value of $x$.

And then we pose the question: how close is the value $a = f(x)$ to $b$ when the value of $x$ is close to some specific fixed number $c$? That is, we are interested in figuring out how small the difference $|f(x) - b|$ is, based on the difference $|x - c|$. If $|f(x) - b|$ can be made small by making $|x - c|$ small, then we say that the limit of $f(x)$ is $b$ as $x \to c$.
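As a simple running example, take $f(x) = 2x + 1$, $b = 7$, and $c = 3$. Then $$|f(x) - 7| = |2x - 6| = 2|x - 3|,$$ so the difference $|f(x) - 7|$ is small precisely when $|x - 3|$ is small.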

However, to make things precise, the smallness of $|f(x) - b|$ and $|x - c|$ needs to be quantified properly: when defining the concept of limit, it is essential that we be able to make the quantity $|f(x) - b|$ as small as we please by choosing $|x - c|$ as small as needed. Thus the goal is to make $|f(x) - b|$ as small as we please, and making $|x - c|$ small enough is the means to achieve that goal. Since the goal is based purely on our wish (as small as we please), we say that $|f(x) - b|$ should be arbitrarily small (our wishes are arbitrary, and there is no end to the supply of numbers as small as we please; remember, there is no smallest positive number). And once we have fixed our goal (say, with some arbitrarily small number $\epsilon$), we need to choose $|x - c|$ small enough (we say sufficiently small, and quantify it with another small number $\delta$) to fulfill that goal.
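In the running example, if our wish is $|f(x) - 7| < \epsilon$ for some given $\epsilon > 0$, then since $|f(x) - 7| = 2|x - 3|$, choosing $$0 < |x - 3| < \delta = \frac{\epsilon}{2}$$ is sufficient: the goal $\epsilon$ is arbitrary, and the required closeness $\delta$ depends on it.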

And the next step is the formalism of Greek symbols: a function $f$ defined in a certain neighborhood of $c$ (but not necessarily at $c$) is said to have limit $b$ as $x$ tends to $c$, written symbolically as $\lim\limits_{x \to c}f(x) = b$, if for any arbitrarily chosen number $\epsilon > 0$ we can find a number $\delta > 0$ such that $$|f(x) - b| < \epsilon$$ whenever $0 < |x - c| < \delta$.
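To connect this back to the question about the derivative: for $f(x) = x^2$ the difference quotient is $g(\Delta x) = 2x + \Delta x$ for $\Delta x \neq 0$, and given any $\epsilon > 0$ we have $$|g(\Delta x) - 2x| = |\Delta x| < \epsilon \quad \text{whenever} \quad 0 < |\Delta x| < \delta = \epsilon.$$ So $\lim_{\Delta x \to 0} g(\Delta x) = 2x$ exactly, even though $g$ is undefined at $\Delta x = 0$. The statement "the limit equals $L$" is a precise claim about this $\epsilon$-$\delta$ game; it never requires evaluating the function at the forbidden point, and that is why the slope of the tangent line is exactly $2x$, not merely "close to" it.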