[Math] What kind of “mathematical object” are limits

analysis, calculus, functions, limits, real-analysis

When learning mathematics I tend to try to reduce all the concepts I come across to some kind of interaction between sets and functions on them (or, if necessary, the more general notion of a relation), possibly with some extra axioms thrown in here and there. The fundamental idea is that of adding additional structure to sets and to relations between them.

I've recently tried applying this view to calculus and have run into some confusion. Most importantly, I'm not sure how to interpret limits. I've considered viewing a limit as a function that takes three arguments -- a function, that function's domain, and some value (the "approaches" value) -- and then outputs a single value.

However, this "limit function" view requires defining the limit function over something other than the reals or the complexes, due to the notion of certain inputs and outputs being "infinity". This makes me uncomfortable and makes me question whether my current approach to mathematics is really as elegant as I'd thought. Is this a reasonable approach to answering the question of what limits actually "are" in a general mathematical sense? How do mathematicians tend to categorize limits within the rest of mathematics?
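Concretely (and this is only a sketch of my own idea, writing $\overline{\mathbb{R}}=\mathbb{R}\cup\{-\infty,+\infty\}$ for the extended reals and using a partial arrow since the limit need not exist), the signature I have in mind is something like

$$\operatorname{lim} : \{\,f : D \to \mathbb{R}\,\} \times \mathcal{P}(\mathbb{R}) \times \overline{\mathbb{R}} \rightharpoonup \overline{\mathbb{R}}, \qquad (f, D, a) \mapsto \lim_{x \to a} f(x),$$

and it is exactly the appearance of $\overline{\mathbb{R}}$ in place of $\mathbb{R}$ that bothers me.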

Best Answer

Do you by any chance have a computer science background? Your ideal of reducing everything (even operations like limits) to functions and sets has the flavor of wanting mathematics to work more or less like a programming language -- a flavor that I (being a computer scientist) quite approve of, but you should be aware that the ideal is not quite aligned with how working mathematicians actually write mathematics.

First, even though everything can be reduced to sets and functions -- indeed, everything can be reduced to sets alone, with functions just being sets of a particular shape -- doing so is not necessarily a good way to think about everything all of the time. Reducing everything to set theory is the "assembly language" of mathematics, and while it will certainly make you a better mathematician to know how this reduction works, it is not the level of abstraction you'll want to do most of your daily work at.

In contrast to the "untyped" assembly-level set theory, the day-to-day symbolic language of mathematics is a highly typed language. The "types" are mostly left implicit in writing (which can be frustrating for students whose temperament leans more towards the explicit typing of most typed programming languages), but they are supremely important in practice -- almost every notation in mathematics has dozens or hundreds of different meanings, between which the reader must choose based on the types of its various sub-expressions. (Think "rampant use of overloading" from a programming-language perspective.) Mostly, we're all trained to do this disambiguation unconsciously.
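As a minimal illustration of such overloading in a programming language (Python here, purely for the analogy), the single symbol `+` resolves to entirely different operations depending on the types of its operands, and readers disambiguate it without thinking -- just as with $\lim$:

```python
# The same symbol "+" denotes three different operations,
# disambiguated purely by the (implicit) types of its operands.
print(2 + 3)         # integer addition:     5
print("ab" + "cd")   # string concatenation: abcd
print([1] + [2, 3])  # list concatenation:   [1, 2, 3]
```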

In most cases, of course, the various meanings of a symbol are generalizations of each other to various degrees. This makes it a particularly bad idea to train oneself to think of the symbol as denoting this or that particular function with such-and-such particular arguments and result. A fuzzier understanding of the intention behind the symbol will often make it easier to guess which definition it's being used with in a new setting, which makes learning new material easier (even though actual proof work, of course, needs to be based on exact, explicit definitions).

In particular, even restricting our attention to real analysis, the various kinds of limits (for $x\to a$, $x\to \infty$, one-sided limits and so forth) are all notated with the same $\lim$ symbol, but they are technically different things. Viewing $\lim_{x\to 5}f(x)$ and $\lim_{x\to\infty} f(x)$ as instances of the same joint "limit" function is technically possible, but also clumsy and (more importantly) not even particularly enlightening. It is better to think of the various limits as a loose grouping of intuitively similar but technically separate concepts.
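To see the technical difference, compare the standard $\epsilon$-$\delta$ formulations: the two statements below quantify over different data (a $\delta > 0$ versus a threshold $M$), even though both are written with the same $\lim$ notation.

$$\lim_{x\to a} f(x) = L \quad\iff\quad \forall \epsilon > 0 \;\exists \delta > 0 \;\forall x \in D:\; 0 < |x-a| < \delta \implies |f(x)-L| < \epsilon$$

$$\lim_{x\to\infty} f(x) = L \quad\iff\quad \forall \epsilon > 0 \;\exists M \in \mathbb{R} \;\forall x \in D:\; x > M \implies |f(x)-L| < \epsilon$$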

This is not to say that there isn't interesting mathematics to be made from studying ways in which the intuitive similarity between the different kinds of limits can be formalized, producing some general notion of limit that has the ordinary limits as special cases. (One solution here is to say that the "$x\to \cdots$" subscript names a variable to bind while also denoting a net to take the limit over.) All I'm saying is that such a general super-limit concept is not something one ought to think of when doing ordinary real analysis.
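For the curious, here is a sketch of the net formulation (standard material, but again not something to lean on in everyday analysis): a net in $X$ is a function $x : A \to X$ from a directed set $A$, and

$$\lim_{\alpha \in A} x_\alpha = L \quad\iff\quad \text{for every neighbourhood } U \text{ of } L \text{ there is } \alpha_0 \in A \text{ such that } x_\alpha \in U \text{ for all } \alpha \geq \alpha_0.$$

Directing $D\setminus\{5\}$ by closeness to $5$ recovers $\lim_{x\to 5} f(x)$, and directing $D$ by the usual order $\leq$ recovers $\lim_{x\to\infty} f(x)$, so both ordinary limits arise as special cases.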

Finally (not related to your question about limits), note that the usual mathematical language makes extensive use of abstract types. The reals themselves are a good example: it is possible to give an explicit construction of the real numbers in terms of sets and functions (and every student of mathematics deserves to know how), but in actual mathematical reasoning numbers such as $\pi$ or $2.6$ are not sets or functions, but a separate sort of thing that can only be used in the ways explicitly allowed for real numbers. "Under the hood" one might consider $\pi$ to "really be" a certain set of functions between various other sets, but that is an implementation detail that is relevant only at the untyped set-theory level.
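For example, in one standard construction (via Cauchy sequences; Dedekind cuts are another route), a real number "is" an equivalence class

$$\mathbb{R} \;=\; \{\text{Cauchy sequences of rationals}\} \,/\, \sim, \qquad (a_n) \sim (b_n) \;\iff\; a_n - b_n \to 0,$$

so that $\pi$ "is" the class of all rational sequences converging to it; yet nothing in ordinary analysis ever inspects this encoding.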

(Of course, the various similarities between math and programming languages I go on about here are not coincidences. They arose from deliberate attempts in programming-language design to create formal, machine-readable notations that would "look and feel" as much like ordinary mathematical symbolism as they could be made to. Mathematics had all of these things first; computer science was just the first to need to name them.)