Model Theory – Incompleteness and Nonstandard Models of Arithmetic

lo.logic, model-theory, peano-arithmetic, set-theory

The following is a collection of doubts, some of which may have concrete answers while others may not. Any kind of help will be welcome.

Reading Peter Smith's "Gödel Without (Too Many) Tears", particularly the part where he gives a nonstandard model of Q, I began to wonder whether the existence of nonstandard models of arithmetic has anything to do with the incompleteness theorems.
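For concreteness, the sort of model he exhibits can be written down explicitly (this is my reconstruction of the usual "one point at infinity" example, so the details may differ from his). Take the domain $\mathbb{N} \cup \{a\}$ for a new element $a$, keep the standard interpretation on $\mathbb{N}$, and set
$$Sa = a, \qquad x + a = a + x = a, \qquad x \cdot a = a, \qquad a \cdot 0 = 0, \qquad a \cdot y = a \ (y \neq 0).$$
One can check each axiom of Q by hand; for instance, injectivity of $S$ survives because $Sa = a$ is never equal to $Sn$ for a standard $n$. Since Q has no induction schema, nothing forces $a$ to be reachable from $0$ by successors.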

I do not know whether categoricity implies completeness (in the sense of every sentence being decidable by proof), but in any case it seems reasonable, when formalizing a given (informal) theory, to try to "force" the formal theory to talk "almost exclusively" about the intended interpretation. So I started wondering whether some axiom (or axiom schema) could be added to PA in order to rule out its most obvious nonstandard models.
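(On reflection: full categoricity, i.e. all models isomorphic, would give completeness, since isomorphic models satisfy the same sentences, so for every sentence $\varphi$ either $T \models \varphi$ or $T \models \neg\varphi$, and Gödel's completeness theorem converts this into $T \vdash \varphi$ or $T \vdash \neg\varphi$. The standard refinement here is the Łoś–Vaught test: a theory with no finite models that is categorical in some infinite cardinality is complete.)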

The first idea along these lines was: we have our class of terms 0, S0, SS0, etc. So if we could find a way to say that every x is equal to one of these terms, we would be done.
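In other words, I would like to add something like
$$\forall x\, (x = 0 \lor x = S0 \lor x = SS0 \lor \cdots),$$
except that, as written, this is an infinitely long disjunction rather than a single first-order formula.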

But then I realized that our terms are defined inductively, and that we are implicitly making the assumption "and nothing else is a term", which is very similar to the desired "and nothing else is a number" that we would like to add to PA. This thought rather worried me: every metatheoretic concept (terms, formulas, and even proofs!) rests on assumptions like these! (I still have not found a way out of these worries.)

Leaving that aside: what if we move on to a stronger theory (with different axioms, but with an extension by definitions that proves every axiom of PA), for example ZFC? The natural numbers then become 0 (the empty set) together with the successor von Neumann ordinals (obtained via Pairing and Union) that contain no limit ordinal. The set of natural numbers is obtained from Infinity by selecting them via Comprehension. Kunen says on page 23 of his "The Foundations of Mathematics" that the circularity in the informal definition of natural number is broken "by formalizing the properties of the order relation on omega". Could nonstandard models survive this formalization?
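If I am reading him correctly, the definition being formalized is roughly this: call $x$ a natural number iff $x$ is an ordinal and every $y \leq x$ is either $0$ or a successor ordinal, and then set
$$\omega = \{\, x \in I : x \text{ is a natural number} \,\},$$
where $I$ is an inductive set supplied by Infinity (one then checks that $\omega$ does not depend on the choice of $I$).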

Well, I think I have read somewhere that being $\omega$ is absolute, so forcing would not be a way to obtain such nonstandard models. Also, I am not sure whether (the extension by definitions from) ZFC is a conservative extension of PA, but if it were, it would not be able to prove anything about the natural numbers (expressible in the language of arithmetic) that PA alone cannot prove. So somehow it looks like nonstandard models must manage to survive! Maybe this is due to the notion of being a subset of a given set not being particularly clear (although it looks like it should not be problematic for hereditarily finite sets).
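(If I remember the reason for absoluteness correctly: the property "$x = \omega$" can be expressed with bounded quantifiers only, e.g.
$$x \text{ is a nonzero limit ordinal} \;\wedge\; \forall y \in x\, (y = 0 \lor y \text{ is a successor}),$$
and such $\Delta_0$ formulas are absolute between transitive models, so passing to a forcing extension cannot change $\omega$.)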

Thank you in advance.

Best Answer

Unfortunately, nonstandard models will survive any such attempt. This is guaranteed by the Löwenheim-Skolem theorem, which says that if a countable first-order theory $T$ has an infinite model, then it has models of every infinite cardinality. Since an uncountable model necessarily contains nonstandard elements, this guarantees that $T$ has a nonstandard model (and, by the downward direction of the theorem, even a countable one).
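Concretely, the compactness theorem already produces such models. For any theory $T$ true in the standard model, add a new constant $c$ to the language and let
$$T^* = T \cup \{\, c \neq \underline{n} : n \in \mathbb{N} \,\},$$
where $\underline{n}$ is the numeral $S \cdots S0$ ($n$ occurrences of $S$). Every finite subset of $T^*$ holds in the standard model (interpret $c$ as a sufficiently large number), so by compactness $T^*$ has a model, in which $c$ denotes an element distinct from every numeral, i.e. a nonstandard element. Taking $T = \mathrm{Th}(\mathbb{N})$ shows that even the full first-order theory of the standard model has nonstandard models.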

Actually, in your case you need a "two-cardinal" version of Löwenheim-Skolem. In your ZFC example, you move to a theory which interprets arithmetic inside a definable substructure (the set $\omega$). That definable substructure might still be countable even if the model itself is uncountable. Nevertheless, one can still blow up the size of the natural-number substructure, via the ultrapower construction for example.
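Sketch of the ultrapower: fix a nonprincipal ultrafilter $\mathcal{U}$ on $\omega$ and let
$$\mathbb{N}^\omega / \mathcal{U}$$
be the set of functions $f : \omega \to \mathbb{N}$ modulo agreement on a set in $\mathcal{U}$. By Łoś's theorem this structure is elementarily equivalent to $\mathbb{N}$, but it is uncountable, and the class of the identity function lies above every standard $n$, since $\{k : k > n\}$ is cofinite and a nonprincipal ultrafilter contains every cofinite set.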

To evade the Löwenheim-Skolem theorem, one has to move beyond first-order logic. For example, infinitary logic allows infinite disjunctions such as $$\forall x(x = 0 \lor x = S0 \lor x = SS0 \lor \cdots)$$ which ensures that the model is standard. Second-order logic, under the standard interpretation, allows quantification over arbitrary subsets of the domain, which again rules out nonstandard models. This second-order characterization of $\mathbb{N}$ is the one most commonly used by working mathematicians.
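For the record, the second-order axiom in question is induction with a genuine set quantifier:
$$\forall X\, \big( (0 \in X \wedge \forall x\, (x \in X \to Sx \in X)) \to \forall x\, (x \in X) \big).$$
Under the full semantics, where $X$ ranges over all subsets of the domain, this pins down $\mathbb{N}$ up to isomorphism (Dedekind's categoricity theorem): the set of elements reachable from $0$ by finitely many applications of $S$ is a legitimate value of $X$, so every element is standard.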