[Math] What is some current research going on in the foundations of mathematics about?

Tags: foundations, lo.logic, soft-question

What is some current research going on in the foundations of mathematics about?

Are the foundations of mathematics still a research area, or is everything solved? When I think about foundations, I think of reducing everything to ZFC set theory plus first-order logic. Is this still the contemporary picture?

Best Answer

It is quite difficult to answer this question comprehensively. It's a bit like asking "so what's been going on in analysis lately?" It is probably best if logicians who work in various areas each answer what is going on in their area. I will speak about logic in computer science. I am very curious to see what logicians from other areas have to say about their branches of logic.

A hundred years ago logic was driven by philosophical and foundational questions about mathematics. The development of first-order logic and ZFC was one important branch of logic; others were model theory and computability theory. In fact, computability theory led directly to the invention of modern computers. Ever since, logic has had a very fruitful relationship with computer science.

One could argue that applications of logic in computer science are not about foundations of mathematics, but such an opinion is hard to defend. Of course, many applications of logic in computer science are just that, applications, but not all of them. Some are directly related to foundations of mathematics (see below), while others aim at understanding what computation is (Turing got us on the right track, but there is a lot more to say about computation than "it's all equivalent to Turing machines"). I hope you can agree that computation is a foundational concept in mathematics, on a par with concepts such as (mathematical) structure and deductive method.

We live in an era of logic engineering: we are learning how to apply the methods of logic in situations that were not envisioned a hundred years ago. Consequently, we need to devise new kinds of logic suited to the new problems at hand. To give just one example: real-world computer systems often run concurrently with one another in an environment that behaves stochastically. Computer scientists have learned that Turing machines, first-order logic, and set theory are not a very good way of describing or analyzing such systems (the computers are of course no more powerful than Turing machines and can be simulated by them, but that is hardly a helpful way to look at things). Instead they devise new kinds of logic, such as temporal and probabilistic logics, that are well suited to their problems. Often these logics are designed to have good computational properties, so that machines can use them.
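To make "machines can use them" concrete, here is a minimal sketch of explicit-state model checking in Python: it exhaustively explores the reachable states of a toy two-process system and checks a mutual-exclusion invariant (an "AG safe" property, in CTL terms). The function names and the toy system are my own invention for illustration; industrial tools such as SPIN or PRISM are far more sophisticated.

```python
# Minimal explicit-state model checking sketch: breadth-first search
# over reachable states, checking that an invariant holds everywhere.
from collections import deque

def check_invariant(initial, successors, invariant):
    """Return (True, None) if `invariant` holds in every reachable
    state, else (False, counterexample_path) from the initial state."""
    parent = {initial: None}
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        if not invariant(state):
            # Reconstruct a path from the initial state as a counterexample.
            path = []
            while state is not None:
                path.append(state)
                state = parent[state]
            return False, list(reversed(path))
        for nxt in successors(state):
            if nxt not in parent:
                parent[nxt] = state
                queue.append(nxt)
    return True, None

# Toy example: two processes, each idle (0), waiting (1), or in the
# critical section (2). The invariant: never both in the critical
# section at once. This model has no locking, so it should fail.
def successors(state):
    for i in (0, 1):
        if state[i] < 2:          # advance toward the critical section
            nxt = list(state)
            nxt[i] += 1
            yield tuple(nxt)
        if state[i] == 2:         # leave the critical section
            nxt = list(state)
            nxt[i] = 0
            yield tuple(nxt)

ok, trace = check_invariant((0, 0), successors, lambda s: s != (2, 2))
print(ok, trace)  # False, with a path on which both processes enter
```

Because the toy model has no locking, the checker returns a concrete counterexample trace leading to the bad state; producing such answers automatically is exactly what these logics are engineered to make computationally feasible.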

Another aspect of logic in computer science, one closer to a mathematician's heart, is computer-supported formalization of mathematics. Humans are not able to produce complete formal proofs of mathematical theorems; instead we adhere to what is best described as "informal rigour". Computers, however, are very good at checking all the details of a mathematical proof. Modern tools, such as the proof assistants Coq, HOL, Agda, and Lean, are improving at a fast rate, and many believe they will soon become useful tools for the working mathematician. Be that as it may, the effort to actually do mathematics formally is driving research into how this can feasibly be done. While some proof assistants use first-order logic and set theory (e.g. Mizar), most use type theory. This is probably not just a fashion: it is a response to the fact that first-order logic and set theory suffice for the formalization of mathematics in principle, but not so much in practice. Logicians have long told us that in principle all mathematics can be formalized. Now that we are actually doing it, we are learning new things about logic. This, to my mind, is one of the most interesting contemporary developments in the foundations of mathematics.
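To give a flavor of what such formalization looks like, here is a small machine-checked proof written in Lean 4 syntax; the statement and names are chosen by me purely for illustration: the sum of two even natural numbers is even, with "n is even" spelled out as ∃ k, n = 2 * k.

```lean
-- A small machine-checked proof (Lean 4): the sum of two even
-- natural numbers is even.
theorem even_add_even {m n : Nat}
    (hm : ∃ k, m = 2 * k) (hn : ∃ k, n = 2 * k) :
    ∃ k, m + n = 2 * k :=
  match hm, hn with
  | ⟨a, ha⟩, ⟨b, hb⟩ =>
    -- The witness is a + b; rewriting with the hypotheses and
    -- distributivity (Nat.mul_add) closes the goal by reflexivity.
    ⟨a + b, by rw [ha, hb, Nat.mul_add]⟩
```

The point is that the assistant checks every step down to the primitive rules; nothing is taken on trust, which is precisely where formal proof parts ways with informal rigour.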