Is the theory of dual numbers strong enough to develop real analysis, and does it resemble Newton’s historical method for doing calculus?

abstract-algebra, calculus, math-history, nonstandard-analysis, real-analysis

I've been interested in non-standard analysis recently. While reading up on it, I noticed the following interesting comment on the Wikipedia page about hyperreal numbers, right after an example of a nonstandard computation of a derivative:

The use of the standard part in the definition of the derivative is a rigorous alternative to the traditional practice of neglecting the square of an infinitesimal quantity… the typical method from Newton through the 19th century would have been simply to discard the $dx^2$ term.

I've never heard anything like this before, and I really find it fascinating that Newton's method amounted to imposing the relation $dx^2 = 0$. If we actually formalize this by taking $\mathbb{R}$ and adjoining an element $dx$ with $dx^2 = 0$, we get the "dual numbers," isomorphic to the quotient ring $\mathbb{R}[x]/(x^2)$. I'd seen some things about how this algebra underlies automatic differentiation in some computer software systems (a small sketch is included at the end of this question), but I've never heard anything about Newton directly working in this algebra. So I have a few questions:

  1. Does anyone have more historical information on the way that Newton performed differentiation, and its relation to the dual numbers?
  2. Does anyone know how effectively real analysis can be formalized with the dual numbers? Does the resulting system play nice enough to develop all of the important modern results?
  3. If we start with $\mathbb{C}[x]/(x^2)$ instead, can we likewise develop complex analysis?

Since this idea is so simple, I'm very curious how powerful it is. I'm also curious if it has any major drawbacks too, since I'm not sure why anyone would mess with the foundational baggage involved in defining the hyperreals if this simple 2-dimensional real algebra could really do the trick.
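
For concreteness, here is a minimal sketch of the dual-number arithmetic I have in mind, written in plain Python (the `Dual` class is just my own illustration, not any particular library's API). Evaluating $f(x + \varepsilon)$ with $\varepsilon^2 = 0$ carries the derivative along in the $\varepsilon$-coefficient, which is essentially what forward-mode automatic differentiation does:

```python
class Dual:
    """An element a + b*eps of R[x]/(x^2): a dual number, with eps^2 = 0."""

    def __init__(self, a, b=0.0):
        self.a = a  # real part
        self.b = b  # coefficient of the infinitesimal eps

    def _coerce(self, other):
        return other if isinstance(other, Dual) else Dual(other)

    def __add__(self, other):
        other = self._coerce(other)
        return Dual(self.a + other.a, self.b + other.b)

    __radd__ = __add__

    def __mul__(self, other):
        # (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps, because eps^2 = 0
        other = self._coerce(other)
        return Dual(self.a * other.a, self.a * other.b + self.b * other.a)

    __rmul__ = __mul__


def f(x):
    return 3 * x * x + 2 * x  # f'(x) = 6x + 2


y = f(Dual(5.0, 1.0))  # evaluate f at 5 + eps
print(y.a, y.b)        # 85.0 32.0, i.e. f(5) and f'(5)
```

This only handles sums and products, but it already shows how the bookkeeping of discarding $dx^2$ can be mechanized.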

Best Answer

The biggest drawback (and it's a big one) is that the ring of dual numbers is not a field: it has plenty of zero divisors. So Newton, and the other mathematicians of the early days of calculus, certainly did not work directly in the ring of dual numbers. They of course did not consider such a ring to exist (the concept of a ring did not yet exist at all), but from their writing it is clear they envisaged a field of real numbers equipped, somehow, with notions of infinitesimals. Their work is of course very vague, but correct. Much more on that can be found in math history books. Many interesting discussions can also be found in the recent book "Adventures in Formalism", related to the early days of calculus and how things developed.

Some (rather unsatisfactory) portions of analysis can be developed in the ring of dual numbers, but it does not go very far. The idea, as you say, is very simple, perhaps too simple. One immediately runs into trouble when trying to define the derivative as the quotient of the infinitesimal increment $f(x+h)-f(x)$ by the infinitesimal $h$: the non-zero infinitesimals in the ring of dual numbers are simply not invertible. So, it's the end of the party. (As you say, though, some aspects of the party survive in automatic differentiation.) In some sense, the dual numbers are a first-order approximation to actual infinitesimals: the square of an infinitesimal should be an order of magnitude smaller than the infinitesimal you started with, but in the ring of dual numbers the square of an 'infinitesimal' is precisely $0$. So in a nonstandard model of the reals you have whole layers of infinitesimals, while in the dual numbers there is only one layer, nothing in it is invertible, and everything in it squares to $0$.
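
To make the invertibility obstruction explicit (writing $\varepsilon$ for the class of $x$ in $\mathbb{R}[x]/(x^2)$): an inverse of $a + b\varepsilon$ would have to satisfy
$$(a + b\varepsilon)(c + d\varepsilon) = ac + (ad + bc)\varepsilon = 1,$$
which forces $ac = 1$ and $ad + bc = 0$, i.e. $c = 1/a$ and $d = -b/a^2$. So $a + b\varepsilon$ is invertible precisely when $a \neq 0$, and a pure infinitesimal $b\varepsilon$ has no inverse at all; the difference quotient $\bigl(f(x+\varepsilon) - f(x)\bigr)/\varepsilon$ simply does not exist in the ring.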

The book "Models for Smooth Infinitesimal Analysis" explores many different models for analysis with infinitesimals. None of them is particularly simple.
