What are the implications of the new quasi-polynomial time solution for the Graph Isomorphism problem?

Tags: computational-complexity, computer-science, graph-theory

This week, news came out that Laszlo Babai has found a quasi-polynomial time algorithm to solve the Graph Isomorphism problem, that is, one with running time $\exp(\operatorname{polylog}(n)) = \exp(O((\log n)^{c}))$ for some constant $c$. He is giving a series of talks this week, and the abstracts are here. I'm not an expert in this field (for instance, I had to look up what polylog means), but clearly this is a very significant result. Babai is a giant in this field, so let's assume for now that the claimed result is correct.
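For concreteness, here is what the problem asks, as a minimal brute-force sketch (illustrative only, and emphatically not Babai's algorithm): it tries all $n!$ vertex bijections, which is the trivial baseline that a quasi-polynomial algorithm improves on.

    from itertools import permutations

    def are_isomorphic(adj_a, adj_b):
        # adj_a, adj_b map each vertex to the set of its neighbors.
        # Tries all n! bijections between the vertex sets, so this runs
        # in O(n! * n^2) time -- the trivial baseline that Babai's
        # result improves to exp(polylog n).
        if len(adj_a) != len(adj_b):
            return False
        verts_a, verts_b = list(adj_a), list(adj_b)
        for perm in permutations(verts_b):
            mapping = dict(zip(verts_a, perm))
            # A bijection is an isomorphism iff it maps each
            # neighborhood onto the corresponding neighborhood.
            if all({mapping[w] for w in adj_a[v]} == adj_b[mapping[v]]
                   for v in verts_a):
                return True
        return False

    # Example: a 4-cycle and a relabeled 4-cycle are isomorphic.
    c4 = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
    relabeled = {0: {2, 3}, 1: {2, 3}, 2: {0, 1}, 3: {0, 1}}
    print(are_isomorphic(c4, relabeled))  # True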

(1) What are some implications of this result in complexity theory?

For example: this problem is in NP, but is not known to be NP-complete. If it were NP-complete, and if the algorithm were polynomial time, then we'd know P = NP, and that would be huge. Does this new result give us any insight into the class of problems that are in NP but not known to be NP-complete? Does it prove that Graph Isomorphism is not in some other class of problems (e.g., does anyone study the class of problems solvable in exponential but not quasi-polynomial time)?

(2) Are there other important problems that reduce to the Graph Isomorphism problem and are now known to be solvable in quasi-polynomial time?

Babai's abstract mentions the Coset Intersection problem and the String Isomorphism problem. I can guess what those are from their names. Are there other well-studied problems that reduce to Graph Isomorphism?
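For reference (these are the standard statements of the two problems, not quoted from the abstract): String Isomorphism asks, given two strings $x, y \colon \Omega \to \Sigma$ and a permutation group $G \le \operatorname{Sym}(\Omega)$ specified by a list of generators, whether $x^{\sigma} = y$ for some $\sigma \in G$; Coset Intersection asks, given two cosets $G\sigma, H\tau \subseteq \operatorname{Sym}(\Omega)$, whether $G\sigma \cap H\tau \neq \emptyset$. GI is the special case of String Isomorphism in which $x$ and $y$ encode adjacency matrices and $G$ is $\operatorname{Sym}(n)$ acting on vertex pairs, which is why a quasi-polynomial algorithm for String Isomorphism covers GI as well.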

(3) How far is quasi-polynomial from polynomial?

Based on the linked thread above, polylog generally means $O((\log n)^k)$ in complexity theory. If so, then $\exp((\log n)^k) = n^{(\log n)^{k-1}}$, which doesn't look all that much like polynomial time unless $k = 1$. But perhaps I'm misunderstanding the whole story, in which case I'd love to be set straight. What would be the next step toward getting the bound lower, from the perspective of complexity theory?
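To make the comparison concrete, the three regimes line up as follows (for constants $c > 0$, $k > 1$, $\varepsilon > 0$):

$$n^{c} = e^{c \log n} \;\ll\; e^{(\log n)^{k}} = n^{(\log n)^{k-1}} \;\ll\; e^{n^{\varepsilon}},$$

so quasi-polynomial time sits strictly between polynomial (the case $k = 1$) and anything exponential in a power of $n$. For scale, the previously best known bound for GI was $\exp(O(\sqrt{n \log n}))$, so the improvement is from a power of $n$ in the exponent down to a power of $\log n$.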

Obviously, the proof hasn't appeared in print yet, so I'm not asking if the proof can somehow be made polynomial instead of quasi-polynomial, and I'm definitely not asking if it's correct. I'm asking what it implies if it's correct, and how much it changes the current playing field for complexity theory.

EDIT (based on the comments):

(4) Is there any known implication of this new result in Quantum Computing?

Best Answer

The short answer is that there are no implications.

As Scott Aaronson said on his blog, the fact that even a polynomial-time algorithm for GI would have no implications for complexity classes is one reason some people conjectured that GI might be in P. He continued:

In fact, the main implication for complexity theory, is that we’d have to stop using GI as our flagship example of a problem that has statistical zero-knowledge proof protocols and dozens of other amazing properties, despite not being known to be in P, which would make all those properties trivial. :-) It’s similar to how the main implication of PRIMES in P, was that we had to stop using primality as our flagship example of a problem that was known to be in randomized polynomial time but not deterministic polynomial time. Still, I managed to adapt in my undergrad class, by talking about the randomized test for primality, and then treating the fact that it’s actually in P as just a further amazing wrinkle in the story. And I’d be happy to adapt similarly in the case of GI…
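(The randomized primality test the quote alludes to is typically Miller-Rabin; a minimal sketch, for illustration only:)

    import random

    def is_probable_prime(n, rounds=20):
        # Miller-Rabin: randomized polynomial time, known long before
        # the deterministic AKS "PRIMES in P" result. For composite n,
        # the error probability is at most 4**(-rounds).
        if n < 2:
            return False
        for p in (2, 3, 5, 7, 11, 13):
            if n % p == 0:
                return n == p
        d, s = n - 1, 0
        while d % 2 == 0:          # write n - 1 = d * 2**s with d odd
            d //= 2
            s += 1
        for _ in range(rounds):
            a = random.randrange(2, n - 1)
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(s - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False       # a is a witness that n is composite
        return True

    print(is_probable_prime(2**61 - 1))  # True: a Mersenne prime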

Likewise, there are no implications for quantum computing.

As for the prospects for further improvement, the best reference as of this writing may be Jeremy Kun's notes on Babai's talk. One key point is that Babai's algorithm relies on a strategy he calls "split-or-Johnson," and he can show that this strategy alone is not enough to put GI in P.

UPDATE: Babai has posted a preprint to the arXiv.

UPDATE: Babai's talk at the Institute for Advanced Study is available online. IMO this talk is much better than the very first talk he gave at Chicago; he's had time to prepare slides and greatly clarify the presentation.