[Math] What are the primitive notions of real analysis?

axioms, definition, real-analysis

My dad introduced me to primitive notions in geometry in high school. It's come back to haunt me as I study real analysis; I find myself wondering, "Have we given this a formal definition?"

Specifically, my professor gave a proof of $1>0$, intended not as an example but as a necessary theorem. That theorem would perhaps have been of some use if it had been stated as "the multiplicative identity element is greater than the additive identity element", before we had determined which elements of the reals those identities correspond to. However, we already had axioms stating that one is the multiplicative identity element and zero the additive identity element (which seems to be necessary).

To my mind, if it is necessary to prove $1>0$, then we can no longer take for granted the properties of the reals that I would otherwise have assumed. But which properties of the reals are we assuming? Then I found myself wondering: have we defined addition? Have we defined multiplication? What are we defining formally here, and what are we leaving informal (at least for the purposes of real analysis)?

What basic concepts are necessary as primitive notions in real analysis?

Best Answer

Your second paragraph looks like you've been fooled into thinking that just because the axioms mention the symbols $0$ and $1$, these symbols are necessarily supposed to stand for the zero and one you know from primary school. It's good you noticed that this doesn't quite make sense here.

What's going on (I'm assuming, based on what you write, that you're in a fairly typical first-year analysis course) is that most of the things about arithmetic on real numbers that we learn in school are not really proved there. They are simply asserted forcefully by the teacher, and students are badgered into accepting them under threat of failing. There are arguably good reasons for this: high school math is supposed to be useful for students who are not going to be mathematicians, and who don't have the inclination to worry about proof details when they can just be told the facts.

However, this is not a satisfactory level of development for real mathematics, so sooner or later you'll need to go back and prove properly that arithmetic on the reals makes sense and behaves the way you've been indoctrinated to believe it does. For various reasons, this usually happens "later" rather than "sooner". Until then, what you need to do is keep careful track of what we have assumed about it, so that we can verify that it will all get proved eventually. These assumptions are where the axioms for a complete ordered field come in.

Therefore, assume that in good time we will find that there is some meaningful set, which we will call $\mathbb R$, with some operations $+$ and $\times$ and a relation $<$ defined on it, such that such-and-such axioms about them hold. These are then the primitive notions: addition, multiplication, and ordering. They haven't been defined, but you've decided once and for all on some properties that their eventual definition will have to satisfy. Another "primitive" notion that is assumed as a background theory is some elementary set theory which will allow you to speak about functions and sets of reals, et cetera.
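For concreteness, here is roughly what "such-and-such axioms" amounts to in most first courses; the grouping and exact wording vary between textbooks, so treat this as a sketch rather than the definitive list:

- Field axioms: $+$ and $\times$ are associative and commutative, $\times$ distributes over $+$, there exist elements $0 \ne 1$ with $x + 0 = x$ and $x \times 1 = x$ for all $x$, every $x$ has an additive inverse $-x$, and every $x \ne 0$ has a multiplicative inverse $x^{-1}$. (Whether $0$ and $1$ are named in the axioms, or proved afterwards to exist and be unique, depends on the presentation.)
- Order axioms: $<$ is a total order compatible with the operations, i.e. if $x < y$ then $x + z < y + z$, and if $0 < x$ and $0 < y$ then $0 < xy$.
- Completeness: every nonempty subset of $\mathbb R$ that is bounded above has a least upper bound in $\mathbb R$.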

$0$ and $1$ are usually not primitive notions (though this varies somewhat between developments); rather, they are defined as the unique elements that react in a certain way to $+$ and $\times$.
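In particular, the $1>0$ theorem from your lecture is exactly the kind of statement that has to be earned from these axioms rather than assumed. A sketch of the usual argument (your professor's version may differ in detail): the field axioms give $1 \ne 0$, so by trichotomy either $0 < 1$ or $1 < 0$. If $1 < 0$, then adding $-1$ to both sides gives $0 < -1$, and compatibility of the order with multiplication then gives
$$0 < (-1)(-1) = 1,$$
contradicting $1 < 0$. Hence $0 < 1$. (The identity $(-1)(-1) = 1$ is itself a small lemma proved from the field axioms alone.)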

At some point in the careful development based on the axioms, you will have proved that there's a certain subset of $\mathbb R$ that behaves just like the natural numbers under $+$ and $\times$ -- and from that point on you're supposed to suddenly remember basic arithmetic on natural numbers from primary school, including decimal notation and algorithms up to long division, and use all of that freely in proofs and exercises. (Some lesser authors won't state this point directly, but just seem to tire of the formalism after a while and begin tacitly to assume that arithmetic on $\mathbb N$ works the way it does.)
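In case you're curious how that identification typically goes (this is one common route, not the only one): call a subset $S \subseteq \mathbb R$ inductive if $1 \in S$ and $x \in S$ implies $x + 1 \in S$, and define
$$\mathbb N = \bigcap \{\, S \subseteq \mathbb R : S \text{ is inductive} \,\}.$$
One then proves from the axioms that this set satisfies the induction principle and that $+$ and $\times$ restricted to it behave the way grade-school arithmetic says they should. (Some developments start the induction at $0$ instead of $1$; that's a matter of convention.)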

(A different reason for basing real analysis on explicit axioms is that there are various ways to construct the real numbers from scratch -- e.g., by Dedekind cuts or by Cauchy sequences. We want to make sure that the results someone gets in analysis don't depend on whether their favorite construction of the reals proceeds this way or that. The clearest way to do that is to develop all of analysis from a small set of "neutral" axioms for the reals, which one can then prove are satisfied by whichever construction of the reals one likes.)
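(To give a flavor of those constructions: in the Dedekind approach a real number is, under one common convention, a nonempty, downward-closed set $A \subsetneq \mathbb Q$ with no greatest element, and $A < B$ means $A \subsetneq B$; in the Cauchy approach a real number is an equivalence class of Cauchy sequences of rationals, two sequences being equivalent when their difference tends to $0$. Either way, the complete-ordered-field axioms become theorems to be verified about the construction.)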

Curiously, it seems to be relatively uncommon to state an explicit set of axioms for natural number arithmetic at this point. Perhaps educators assume that everybody who reaches a university mathematics class will be able to convince themselves that base-10 arithmetic makes sense. Or perhaps it is just that the things you need to remember about elementary arithmetic are hard to wrap up in a nice tidy package like the complete ordered field axioms. One could of course present the Peano axioms, but getting from there to the fact that it's OK to rely on grade-school long division when doing exercises is not trivial.
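For reference, one common formulation of the Peano axioms posits a constant $0$, a successor operation $s$, and the requirements that $s$ is injective, that $0$ is not a successor, and that any set containing $0$ and closed under $s$ contains every natural number (induction). Addition and multiplication are then defined recursively, e.g.
$$a + 0 = a, \qquad a + s(b) = s(a + b), \qquad a \cdot 0 = 0, \qquad a \cdot s(b) = a \cdot b + a,$$
and getting from these recursions to the correctness of decimal notation and the long-division algorithm is precisely the nontrivial part alluded to above.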