My question concerns the argument given by Gauss in his "geometric proof" of the Fundamental Theorem of Algebra. At one point he says (I am reformulating):
A branch (a connected component) of any algebraic curve either comes back on itself (I suppose that means: it is a closed curve) or goes off to infinity on both sides.
I have a geometric intuition of what this means, but I am not sure where or how to find a "modern" formulation of such a result, and a proof.
Best Answer
Gauss actually appends a footnote to this statement, "if a branch of an algebraic curve enters a limited space, it necessarily has to leave it again" (Latin original follows below), in which he argues that:
As explained by Harel Cain (see also Steve Smale), this outline shows that Gauss’s geometric proof of the FTA rests on assumptions about the branches of algebraic curves which may appear plausible to geometric intuition, but which Gauss left without any rigorous proof. It took until 1920 for Alexander Ostrowski to show that all of the assumptions made by Gauss can be fully justified.
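For context, Gauss locates the roots of a polynomial $p$ as intersections of the two real algebraic curves $\operatorname{Re}p(x+iy)=0$ and $\operatorname{Im}p(x+iy)=0$, and the branch claim is what guarantees these curves meet. A minimal worked example (my own illustration, not from Gauss's text): for $p(z)=z^2+1$,

$$\operatorname{Re}\,p(x+iy) = x^2 - y^2 + 1 = 0, \qquad \operatorname{Im}\,p(x+iy) = 2xy = 0.$$

The first curve is a hyperbola, each of whose branches runs to infinity on both sides (so neither "stops" inside any bounded region); the second is the union of the two coordinate axes. Their intersection points $(0,\pm 1)$ correspond exactly to the roots $\pm i$ of $p$.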
Alexander Ostrowski. Über den ersten und vierten Gauss’schen Beweis des Fundamentalsatzes der Algebra. (Nachrichten der Gesellschaft der Wissenschaften Göttingen, 1920).
Here is the Latin original and an English translation of
Carl Friedrich Gauss. Demonstratio nova theorematis omnem functionem algebraicam rationalem integram unius variabilis in factores reales primi vel secundi gradus resolvi posse (PhD thesis, Universität Helmstedt, 1799); paragraph 21 and footnote 10.
[English translation] (Wayback Machine):