[Math] Failure of differential notation

calculus, derivatives, real-analysis

Through the informal use of differentials, the product rule can be "proved" by writing
$$d(fg) = (f + df)(g + dg) - fg = df\,g + f\,dg + df\,dg.$$
Neglecting the product of two differentials, we conclude that
$$d(fg) = df\,g + f\,dg.$$
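A quick numerical check (our own illustration, with $f(x)=x^2$ and $g(x)=\sin x$ chosen arbitrarily) shows why the neglected term is harmless: the leftover $df\,dg$ shrinks like $h^2$ while the kept terms shrink like $h$.

```python
import math

def f(x): return x * x
def g(x): return math.sin(x)

x = 1.0
for h in (1e-1, 1e-2, 1e-3):
    df = f(x + h) - f(x)
    dg = g(x + h) - g(x)
    d_fg = f(x + h) * g(x + h) - f(x) * g(x)
    # What the rule d(fg) = df*g + f*dg leaves unexplained:
    leftover = d_fg - (df * g(x) + f(x) * dg)
    # leftover equals df*dg exactly, and the ratio leftover/h^2 stays
    # bounded as h shrinks, i.e. leftover = O(h^2).
    print(h, leftover, leftover / h**2)
```

Dividing the whole increment by $h$ to form a difference quotient, the single-$dx$ terms survive the limit and the $df\,dg$ term dies.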
However, the accepted answer to this question mentions that manipulations like this are not always justified. In particular, its author points out that it is unclear why we should not neglect a single differential (itself an "infinitesimal" quantity), yet we should neglect the product of two differentials (presumably since it's "infinitesimal-er").

Can someone produce an example in which a line of reasoning similar to the above argument for the product rule leads to a false conclusion (preferably from single-variable calculus)? Another way to phrase the question is this: What failures of the informal use of differentials led to the development of non-standard analysis?

Best Answer

The answer to your question really depends on the formalism with which you develop a rigorous treatment of infinitesimal numbers. In Robinson's nonstandard analysis and related formalisms, the notion of the standard part fixes everything. For example, you have the following proof of the product rule in this setting:

\begin{eqnarray*} (fg)'(x) & \equiv & st\bigl((f(x+dx)g(x+dx)-f(x)g(x))/dx\bigr) \\ & = & st\bigl(((f(x)+f'(x)\,dx)(g(x)+g'(x)\,dx)-f(x)g(x))/dx\bigr) \\ & = & st\bigl((f'(x)g(x)\,dx+g'(x)f(x)\,dx+g'(x)f'(x)\,dx^2)/dx\bigr) \\ & = & st\bigl(f'(x)g(x)+g'(x)f(x)+g'(x)f'(x)\,dx\bigr) \\ & = & f'(x)g(x)+g'(x)f(x) \end{eqnarray*}

The intuition here is that the terms with only a single $dx$ are of the same order as the change in $x$, so they contribute to the first-order behavior. The terms with two factors of $dx$ are of higher order than the change in $x$, so they do not contribute to the first-order behavior (they contribute to the second-order behavior).
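We can mimic taking the standard part numerically by shrinking $dx$: the difference quotient equals $f'(x)g(x)+f(x)g'(x)$ plus an error proportional to $dx$, which is exactly the infinitesimal part that $st$ discards. (A sketch with the arbitrary choices $f(x)=x^2$, $g(x)=x^3$, so $(fg)(x)=x^5$ and $(fg)'(x)=5x^4$.)

```python
def f(x): return x**2
def g(x): return x**3

x = 2.0
exact = 5 * x**4  # derivative of (fg)(x) = x^5 at x
for dx in (1e-2, 1e-4, 1e-6):
    quotient = (f(x + dx) * g(x + dx) - f(x) * g(x)) / dx
    # The error is the "infinitesimal part" left after dividing by dx;
    # it shrinks roughly linearly in dx, so st(quotient) = exact.
    print(dx, quotient - exact)
```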

On the other hand, in smooth infinitesimal analysis and related formalisms, there is a collection of numbers which are by definition nilsquare (along with a collection which is nilcube, and so on). In that setting there is no $st$ on the outside of everything; instead, the term $g'(x)f'(x)\,dx^2$ is exactly zero by definition, and everything else goes through the same.
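Nilsquare arithmetic can be sketched with dual numbers $a + b\,\varepsilon$ where $\varepsilon^2 = 0$ by construction. (The `Dual` class below is our own illustration, not part of the answer; the seeded values again use $f(x)=x^2$, $g(x)=x^3$.) The cross term vanishes in the multiplication rule itself, so the product rule falls out with no term being "dropped":

```python
class Dual:
    """A dual number a + b*eps with eps^2 = 0."""
    def __init__(self, real, eps=0.0):
        self.real, self.eps = real, eps
    def __add__(self, other):
        return Dual(self.real + other.real, self.eps + other.eps)
    def __mul__(self, other):
        # (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps, since eps^2 = 0
        return Dual(self.real * other.real,
                    self.real * other.eps + self.eps * other.real)

x = 2.0
f = Dual(x * x, 2 * x)        # f(x) = x^2 carried with f'(x) = 2x
g = Dual(x ** 3, 3 * x ** 2)  # g(x) = x^3 carried with g'(x) = 3x^2
fg = f * g
print(fg.eps)  # the eps-coefficient is f'g + fg' = 5x^4 = 80.0
```

This is the same mechanism behind forward-mode automatic differentiation, where the eps-coefficient of every intermediate value is exactly its derivative.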
