[Math] The high-concept explanation for why real numbers are useful in number theory

big-picture, lo.logic, nt.number-theory

The utopian situation in mathematics would be that the statement and the proof of every result would live "in the same world", at the same level of mathematical complexity (in a broad sense), unless there were a good conceptual reason for the contrary. The typical situation would be for a result in finite combinatorics to be proven purely within the realm of finite combinatorics, for a statement about integers to be proven using only the rationals (perhaps together with some formal symbols such as $\sqrt{2}$ and $\sqrt{-1}$), and so on. When the typical situation breaks down, the reason would be well-known and celebrated.

The prototypical field where things don't seem to work this way is Number Theory. Kronecker famously stated that "God made the integers, all else is the work of man", and yet the real numbers (often in the guise of complex analysis) are ubiquitous all over Number Theory.

I am sure that this question is hopelessly naïve and standard but:

  1. What is the high-concept explanation for why real numbers are useful in number theory?
  2. What is the "minimal example" of a statement in number theory, for whose "best possible" proof the introduction of real numbers is obviously useful?

An alternative way of framing the question would be to ask how you would refute the following hypothetical argument:

"We know that calculus works well, so we are tempted to apply it to anything and everything. But perhaps it is in fact the wrong tool for Number Theory. Perhaps there exists a rational-number-based approach to Number Theory waiting to be discovered, whose discoverer will win a Fields Medal, which will replace all the analytic tools in Number Theory with dicrete tools."

(This question is a byproduct of a discussion we had today at Dror Bar-Natan's LazyKnots seminar.)

Update: (REWRITTEN) There has been some discussion in the comments concerning whether proofs and statements living in the same realm is "utopian". The philosophical idea underlying this question is that, in my opinion, part of mathematics is to understand proofs, including understanding which tools are optimal for a proof and why. If the proof is a formal manipulation of the definitions used in the statement of the claim (e.g. the proof of the snake lemma), then there is nothing to explain. If, on the other hand, the proof makes essential use of concepts from beyond the realm of the statement of the theorem (e.g. a proof of a statement about integers which uses real numbers, or a proof of Poincaré duality for simplicial complexes which uses CW complexes), then we ought to understand why. Is there no other way to prove it? Why? Would another way to prove it necessarily be more clumsy? Why? Or is it just an accident of history, the first thing the prover thought of, with no claim of being an "optimally tooled proof" in any sense? For one thing, if a proof of a result involving integers essentially uses properties of the real numbers (or complex numbers), such a proof would not carry over to a formally analogous setting where there are no real numbers, such as knots as analogues for primes. For another, by understanding why the tool used in the proof is optimal, we're learning something really fundamental about the integers.
I'm interested not in "what would be the fastest way to find a first proof", but rather in "what would be the most intuitive way to understand a mathematical phenomenon in hindsight". So one thing that would make me happy would be a result for integers which is "obviously" a projection or restriction of some easy fact for real numbers, and is readily understood that way, but remains mysterious if real numbers/complex analysis aren't introduced.

Best Answer

The Gödel Speedup Theorem provides some explanation of why real numbers (and their variants) are useful in proving statements in number theory.

Real numbers, complex numbers, and $p$-adic numbers are second-order objects over the natural numbers. Thus a proof of a number theoretic fact using such analytical devices is formally a proof of that fact in second-order arithmetic. The Gödel Speedup Theorem shows that there is a definite advantage to using second-order arithmetic to prove elementary number theoretic facts.
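
To make the "second-order object" point concrete, here is a sketch of the standard coding used in second-order arithmetic (the exact conventions vary slightly between presentations): rational numbers are coded by natural numbers via a pairing function, and a real number is then a quickly converging Cauchy sequence of rationals, i.e. (a code for) a set of natural numbers:

$$x \in \mathbb{R} \text{ is coded by a sequence } (q_k)_{k \in \mathbb{N}} \text{ of rationals with } |q_k - q_{k+1}| \le 2^{-k} \text{ for all } k.$$

Quantifying over real (or complex, or $p$-adic) numbers is thus quantifying over sets of naturals, which is exactly what second-order arithmetic permits and first-order arithmetic does not.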

Gödel Speedup Theorem. Let $h$ be any computable function. There is an infinite family $\mathcal{H}$ of first-order (indeed $\Pi^0_2$) statements such that if $\phi \in \mathcal{H}$, then $\phi$ is provable in first-order arithmetic and if $k$ is the length of the shortest proof of $\phi$ in second-order arithmetic, then the shortest proof of $\phi$ in first-order arithmetic has length at least $h(k)$.
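
To convey the flavour of the statements that realize such a speedup (a standard sketch in the spirit of Gödel's argument, not his exact construction), one can take self-referential sentences of the form

$$\phi_n \;:\equiv\; \text{"}\phi_n \text{ has no proof in first-order arithmetic of length at most } n\text{"}.$$

Each $\phi_n$ is true (a proof of $\phi_n$ of length at most $n$ would let first-order arithmetic prove both $\phi_n$ and, by exhibiting that very proof, $\neg\phi_n$) and is provable in first-order arithmetic, but the obvious first-order proof checks all proofs of length at most $n$ one by one, which is astronomically long. A theory that proves the consistency of first-order arithmetic, as second-order arithmetic does, can instead formalize the short diagonal argument uniformly in $n$ and thereby prove $\phi_n$ quickly.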

Since computable functions can grow very fast, this shows that there are true number theoretic facts that one can prove using second-order methods (e.g. complex analysis, $p$-adic numbers, etc.) but any first-order (a.k.a. elementary) proof is unfathomably long. Admittedly, the statements produced by Gödel to verify the theorem are very unnatural from a number theoretic point of view. However, it is a general fact that second-order proofs can be much much shorter and easier to understand than first-order proofs.


Addendum. This excellent post by Emil Jeřábek demonstrates another speedup theorem, which is in many ways more striking. The passage from a first-order theory $T$ to its second-order extension $T^+$ is conservative, meaning that $T^+$ cannot prove more first-order theorems than $T$. However, the mere act of allowing sets to replace formulas and introducing the possibility of quantifying over such sets introduces speedups faster than any exponential tower. Introducing $\mathbb{R}$, $\mathbb{C}$, $\mathbb{Q}_p$ and so forth has a similar effect: packaging complicated ideas into conceptually simpler ones (e.g. replacing $\forall\exists$ statements by the higher-level notion of continuity) can lead to monumentally shorter proofs!
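
To spell out the conservativity notion referred to above (a standard definition; canonical examples of such conservative second-order extensions include $\mathrm{ACA}_0$ over first-order Peano arithmetic, and Gödel-Bernays set theory over $\mathrm{ZF}$):

$$T^+ \text{ is conservative over } T \quad\Longleftrightarrow\quad \text{for every first-order sentence } \phi,\ \ T^+ \vdash \phi \;\Rightarrow\; T \vdash \phi.$$

So the second-order theory proves no new first-order theorems; the point of the speedup results is that it can nevertheless prove some of the old ones with proofs that are shorter by more than any fixed exponential tower.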