[Math] Gauss's proof of the fundamental theorem of algebra

ag.algebraic-geometry, ho.history-overview

My question concerns the argument given by Gauss in his "geometric proof" of the fundamental theorem of algebra. At one point he says (I am reformulating):

A branch (a component) of any algebraic curve either comes back on itself (I suppose that means: it is a closed curve) or goes to infinity on both sides.

I have a geometric intuition of what this means, but I am not sure where or how to find a "modern" formulation of such a result, and a proof.
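To fix ideas, here are two simple examples of my own (not from Gauss) of the dichotomy I have in mind: the circle
$$x^2 + y^2 - 1 = 0$$
is a branch that "comes back on itself", while each of the two branches of the hyperbola
$$xy - 1 = 0$$
goes to infinity in both directions. What I am looking for is the general statement covering all algebraic curves, and a proof.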

Best Answer

Gauss actually appends a footnote to this statement ("if a branch of an algebraic curve enters a limited space, it necessarily has to leave it again"; the Latin original follows below), in which he argues that:

It seems to be well demonstrated that an algebraic curve neither ends abruptly anywhere (as happens with the transcendental curve $y = 1/\log x$), nor loses itself in a point after an infinite number of windings (like the logarithmic spiral). As far as I know nobody has ever doubted this, but should anybody require it, I take it upon myself to present, on another occasion, an indubitable proof.
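To make the first example concrete (this little computation is mine, not Gauss's): the branch of $y = 1/\log x$ over $0 < x < 1$ runs off to $-\infty$ as $x \to 1^-$, but at its other end it simply stops, since
$$\lim_{x \to 0^+} \frac{1}{\log x} = 0 \quad (\text{from below, because } \log x \to -\infty),$$
so the branch approaches the point $(0,0)$ without reaching it and cannot be continued beyond it. It is exactly this kind of abrupt ending that Gauss claims cannot happen for a branch of an algebraic curve.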

As explained by Harel Cain (see also Steve Smale), this outline of the proof shows that Gauss's geometric proof of the FTA rests on assumptions about the branches of algebraic curves which may appear plausible to geometric intuition but are left without any rigorous proof by Gauss. It took until 1920 for Alexander Ostrowski to show that all the assumptions made by Gauss can be fully justified.

Alexander Ostrowski. Über den ersten und vierten Gauss’schen Beweis des Fundamentalsatzes der Algebra. (Nachrichten der Gesellschaft der Wissenschaften Göttingen, 1920).


Here is the Latin original and an English translation of

Carl Friedrich Gauss. Demonstratio nova theorematis omnem functionem algebraicam rationalem integram unius variabilis in factores reales primi vel secundi gradus resolvi posse (PhD thesis, Universität Helmstedt, 1799); paragraph 21 and footnote 10.

Iam ex geometria sublimiori constat, quamuis curvam algebraicam, (sive singulas cuiusuis curvae algebraicae partes, si forte e pluribus composita sit) aut in se redeuntem aut utrimque in infinitum excurrentem esse, adeoque si ramus aliquis curvae algebraicae in spatium definitum intret, eundem necessario ex hoc spatio rursus alicubi exire debere. [*]

[*] Satis bene certe demonstratum esse videtur, curvam algebraicam neque alicubi subito abrumpi posse (uti e.g. evenit in curva transscendente, cuius aequatio $y=1/\log x$), neque post spiras infinitas in aliquo puncto se quasi perdere (ut spiralis logarithmica), quantumque scio nemo dubium contra rem movit. Attamen si quis postulat, demonstrationem nullis dubiis obnoxiam alia occasione tradere suscipiam.

[English translation] (Wayback Machine):

But according to higher mathematics, any algebraic curve (or the individual parts of such an algebraic curve if it perhaps consists of several parts) either turns back into itself or extends to infinity. Consequently, a branch of any algebraic curve which enters a limited space, must necessarily exit from this space somewhere. [*]

[*] It seems to have been proved with sufficient certainty that an algebraic curve can neither be broken off suddenly anywhere (as happens e.g. with the transcendental curve whose equation is $y = 1/\log x$ ) nor lose itself, so to say, in some point after infinitely many coils (like the logarithmic spiral). As far as I know, nobody has raised any doubts about this. However, should someone demand it then I will undertake to give a proof that is not subject to any doubt, on some other occasion.