What are examples of theorems which were once “valid”, then became “invalid” as standard definitions shifted?

big-list, definitions, ho.history-overview, soft-question

That is, I am looking for results established by correct proofs within some framework, where the way the author or the general mathematical community of the time would state the result would, in later times, be read as a false claim, because fashions changed as to how to standardly formalize some of the relevant concepts.

I imagine this sort of thing has happened often (e.g., with shifting accounts of "polyhedra" à la Lakatos's "Proofs and Refutations", or the motley of different definitions of "continuity" before standardization on the one we use now), but I do not know enough of the history to provide solid examples. For instance, it seems plausible to me that Darboux may have considered himself to have proven that every derivative is continuous, taking the intermediate value property as defining continuity, but I do not know whether this is an accurate account of what he claimed.

Best Answer

(This is basically a copy of my answer at https://mathoverflow.net/questions/35468#35644 )

A prime example of a theorem that was considered "valid" but later became "invalid" is the following:

Theorem (Cauchy) Let $S_m(x) = \sum_{n=0}^m f_n(x)$ be the partial sums of a series on the interval $a \leq x \leq b$. If

  1. $S_m(x)$ is continuous for all finite $m$
  2. $S_m(\xi)$ converges to $S(\xi)$ for all numbers $\xi$ in the interval,

then the sum $S(x)$ is also continuous. $\square$

From the modern (Weierstraß) point of view, this theorem is wrong. A well-known counterexample is the trigonometric series ("sawtooth")

$$\sum_{k=1}^{\infty} \frac{\sin(kx)}k$$

which is not continuous at $x=0$.
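
Concretely, on the open interval $0 < x < 2\pi$ this series has the well-known closed form

$$\sum_{k=1}^{\infty} \frac{\sin(kx)}k = \frac{\pi - x}{2},$$

so $S(x) \to \pi/2$ as $x \to 0^+$, while $S(0) = 0$ (every term vanishes there).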


However, this is not a counterexample to Cauchy's theorem as Cauchy understood it. His definitions of continuity and convergence were based on infinitesimals, and the series violates condition 2: the point is that $\xi$ may be an infinitesimal.

In particular, let $\mu$ be infinitely large and $\xi = \omega := \frac1\mu$ infinitesimally small. Then the residual sum is

$$S(\omega) - S_{\mu-1}(\omega) = \sum_{k=\mu}^{\infty} \frac{\sin(k\omega)}k = \sum_{k=\mu}^{\infty} \frac{\sin(k\omega)}{k\omega}\omega \approx \int_{\omega\mu}^{\infty} \frac{\sin t}{t} \ dt = \int_1^{\infty} \frac{\sin t}{t} \ dt$$

Here the rewritten sum is a Riemann sum with mesh $\omega$ for the integral, and $\omega\mu = 1$ gives the lower limit. Clearly, the integral is finite and not negligible; hence the residual is not infinitesimal, and condition 2 fails at $\xi=\omega\approx 0$.
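
For reference, the value is

$$\int_1^{\infty} \frac{\sin t}{t} \, dt = \frac{\pi}{2} - \operatorname{Si}(1) \approx 0.6247,$$

where $\operatorname{Si}(x) = \int_0^x \frac{\sin t}{t} \, dt$ is the sine integral; this is an appreciable (non-infinitesimal) quantity.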

Put differently, condition 2 in Cauchy's sense is (I think) actually equivalent to uniform convergence.
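
If one formalizes Cauchy's infinitesimals in Robinson-style nonstandard analysis (a modern reconstruction, not Cauchy's own framework), this is a standard characterization of uniform convergence:

$$S_m \to S \ \text{uniformly on } [a,b] \iff S_\mu(\xi) \approx S(\xi) \ \text{for every infinite } \mu \text{ and every } \xi \in {}^*[a,b].$$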


I have taken this discussion and example from Detlef Laugwitz's paper "Definite values of infinite sums: Aspects of the foundations of infinitesimal analysis around 1820", Archive for History of Exact Sciences 39 (1989), 195–245 (in particular pages 211–212).