Set Theory – Major Consequences of the Inconsistency of ZFC

Tags: computational-complexity, lo.logic, set-theory

Update (21st April, 2019). I have removed the reference that was the initial trigger behind my question (please see the comment thread below for the reasons). I am retaining, of course, the actual question, stated both in the title and in the post below.

The key questions of this post are the following:

1. How "disastrous" would inconsistency of ZFC really be?

2. A slightly more refined question is: what would be the major consequences of different types of alleged inconsistencies in ZFC?


Old material (the "no-longer relevant" part of the question).

I was happily surfing the arXiv, when I was jolted by the following paper:

Inconsistency of the Zermelo-Fraenkel set theory with the axiom of choice and its effects on the computational complexity by M. Kim, Mar. 2012.

Abstract. This paper exposes a contradiction in the Zermelo-Fraenkel set theory with the axiom of choice (ZFC). While Godel's incompleteness theorems state that a consistent system cannot prove its consistency, they do not eliminate proofs using a stronger system or methods that are outside the scope of the system. The paper shows that the cardinalities of infinite sets are uncontrollable and contradictory. The paper then states that Peano arithmetic, or first-order arithmetic, is inconsistent if all of the axioms and axiom schema assumed in the ZFC system are taken as being true, showing that ZFC is inconsistent. The paper then exposes some consequences that are in the scope of the computational complexity theory.

Now this seems to be a very major claim, and I lack the background to judge whether the claim is true or whether there is some subtle or even obvious defect in the paper's arguments. But picking on this paper itself is not the purpose of my question.
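
For background on the incompleteness claim in the abstract (a standard formulation, nothing specific to the paper): Gödel's second incompleteness theorem says that if $T$ is consistent, recursively axiomatizable, and interprets enough arithmetic (both PA and ZFC qualify), then

$$T \nvdash \operatorname{Con}(T).$$

So a proof of $\operatorname{Con}(\mathrm{ZFC})$ must use assumptions going beyond ZFC itself; a proof of an inconsistency, by contrast, would just be a finite derivation of a contradiction inside ZFC and needs no stronger system.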

If you feel that my questions might not admit "clearly right" answers, I will be happy to make this post CW.

Best Answer

I'm confident that ZFC is consistent, but one can imagine an inconsistency. Like François said, it would probably be handled pretty well. I'd divide the possibilities into four cases:

  1. A technicality, like separation vs. comprehension in ZFC (see the schema contrast sketched after this list). This would be an important thing to get right, but it would have little impact on the theorems mathematicians prove. (For example, Frege's system was inconsistent, but his mistake didn't propagate.)

  2. A topic requiring serious clarification, like infinitesimals in the 1600s. The intuition was right, but it took some genuine work to turn this intuition into actual theorems with rigorous proofs.

  3. A topic that fundamentally cannot be clarified, where some part of mathematics just turns out to be defective. For example, imagine if cardinals beyond $\aleph_0$ were inherently self-contradictory, and no clarification could save them. This would require huge modifications to set theory.

  4. It could turn out that we have no idea what any of mathematics really means. For example, if Peano Arithmetic were inconsistent, then it would call into question the whole axiomatic approach to mathematics. It would be tantamount to saying that the natural numbers as we understand them do not exist. (Some parts of the axiomatic approach could still survive, but I don't think it would be wise to trust anything if we couldn't even get the consistency of PA right.)
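
To make the comprehension/separation contrast in item 1 concrete, here is the standard schema-level picture (a textbook sketch, not anything specific to the paper above). Unrestricted comprehension asserts, for every formula $\varphi(x)$,

$$\exists y\,\forall x\,\bigl(x \in y \leftrightarrow \varphi(x)\bigr),$$

and taking $\varphi(x)$ to be $x \notin x$ gives Russell's paradox: for that $y$ we get $y \in y \leftrightarrow y \notin y$. ZFC's separation schema only carves subsets out of a set that is already given,

$$\forall z\,\exists y\,\forall x\,\bigl(x \in y \leftrightarrow (x \in z \wedge \varphi(x))\bigr),$$

and the same choice of $\varphi$ now yields only the harmless conclusion that $y \notin z$.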

My feeling is that 1 is very unlikely, 2 would be among the biggest shocks in the history of mathematics, 3 is difficult to imagine, and 4 is so extreme that if I read a proof of the inconsistency of PA, I'd be more likely to decide that I had gone crazy than that PA was actually inconsistent.
