[Physics] Is the universe a quantum computer – is the light-speed barrier a computational constraint?

quantum-mechanics, quantum-computer, quantum-information, special-relativity, speed-of-light

There is a debate currently ongoing on the leading maths blog Gödel’s Lost Letter, between Gil Kalai and Aram Harrow: the former argues that building a quantum computer may not be possible due to noise propagation, and the latter argues the contrary.

I am wondering if there is any argument to show that building a quantum computer is possible, by virtue of showing that quantum computation is evident in the physical world.

So the question is:

(A) Are there any known examples of physical interactions where macro-level state transitions can be shown to correspond only to an underlying quantum computation? That is, just as Shor's algorithm is exponentially faster than any known classical factoring algorithm, are there examples of known physical processes, for example perturbation stabilization in a very large particle cluster, that could be shown, assuming P ≠ NP, to be efficiently solvable only by a quantum computation? (A classical sketch of the Shor reduction is at the end of this question.)

Some admittedly highly speculative additional questions would then be:

(B) Is the speed-of-light barrier possibly a natural computational limit of our particular universe? That is, for the computational complexity class of quantum mechanics, working on an underlying relational, network-like spacetime structure, is this the maximum speed at which the computational rules can move a particle/wave representation through a network region of lowest energy/complexity (i.e. a vacuum)?

(C) Is quantum mechanics actually necessary for the universe to follow classical physical laws at the macro level? The informal argument is that in many-to-many quantum-level particle interactions, only each particle's capability to compute an infinite (or quasi-infinite) number of paths in parallel allows the universe to resolve a real-time solution at the macro level.

Requesting references to research along these lines, or any arguments to support or contradict these speculations.
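For concreteness on the Shor comparison in (A), here is the classical skeleton of Shor's algorithm in Python, with a brute-force order finder standing in for the quantum period-finding subroutine. This is only an illustrative sketch: the brute-force step is exactly the exponential part that a quantum computer would replace.

```python
from math import gcd
from random import randrange

def order_brute_force(a, n):
    # Multiplicative order of a mod n by trial multiplication. This
    # exponential-time step is the one Shor's quantum period finder
    # performs in polynomial time.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor_via_order_finding(n):
    # Classical wrapper of Shor's algorithm: reduce factoring n to
    # finding the order r of a random base a, then take a gcd.
    while True:
        a = randrange(2, n)
        d = gcd(a, n)
        if d > 1:
            return d  # lucky draw: a already shares a factor with n
        r = order_brute_force(a, n)
        if r % 2 == 0:
            f = gcd(pow(a, r // 2, n) - 1, n)
            if 1 < f < n:
                return f

print(factor_via_order_finding(15))  # prints 3 or 5
```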

Best Answer

The answer to this question is a surprising no, and this is not because we don't have enough quantum systems; we have plenty. The problem is that if a natural system with a large number of particles is implementing a computation that requires exponential resources, we can't do the computation ourselves to check whether quantum mechanics is accurate. Quantum mechanics might be failing all the time in highly excited, highly entangled nuclear states, but we wouldn't know it, because we can't compute the exact energy levels; we can only rely on experiment.

First, for A: every quantum system with a large number of particles and strong interactions is implementing a nontrivial quantum computation, but we can't check whether it is doing so correctly. For example, if you excite a uranium nucleus to a very highly excited state, so that it can emit x-rays, neutrons, and protons, and look at the radiated spectrum of stuff, the amplitudes for emission are the highly complicated product of a 250-particle system with impossible-to-calculate entanglements. These calculations simply can't be done by any classical computer, so we just wouldn't know whether quantum mechanics is failing. But yes, a uranium nucleus in a 700 MeV excited state is performing an impossibly complex quantum computation, one we couldn't reproduce even with a computer the size of the universe.
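To make the scale concrete, here is a rough sketch (mine, not part of the original argument) of the memory needed merely to store a general entangled pure state, treating each particle as a single two-level system, which is already a severe underestimate for nucleons; the 16 bytes per complex amplitude is an assumption:

```python
# Bytes needed just to store a general entangled pure state of n
# two-level systems: 2**n complex amplitudes.
def state_vector_bytes(n_particles, bytes_per_amplitude=16):
    return (2 ** n_particles) * bytes_per_amplitude

for n in (10, 50, 250):
    print(f"{n:4d} two-level particles -> {state_vector_bytes(n):.3e} bytes")
```

At 250 particles this is on the order of 10^76 bytes, far more storage than could be built inside the observable universe.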

For B--- your question is nonsensical, but the speed of light does limit the information transfer speed in a computer. This is not much of a limitation in principle, because it just says that a computation step which moves data from point A to point B will take a time proportional to the distance between A and B. This has no bearing on the computational complexity, because you can do the motion in time polynomial in the size of your memory, even if it is inefficiently laid out in a straight line. This is a red herring. The words "this is the maximum speed a massless particle can compute a resolved path for when traveling through a vacuous quantum field" are meaningless.
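A back-of-envelope sketch of why this is only a polynomial overhead; the one-micron cell pitch and the straight-line layout are illustrative assumptions:

```python
# Worst-case time for a light-speed signal to cross a memory of n cells
# laid out on a straight line. The delay grows linearly with memory
# size, a polynomial factor, so no complexity class is affected.
C = 299_792_458.0  # speed of light in vacuum, m/s

def cross_memory_time(n_cells, pitch_m=1e-6):
    return (n_cells * pitch_m) / C

for n in (10**3, 10**6, 10**9):
    print(f"n = {n:>10,d} cells -> {cross_memory_time(n):.3e} s per worst-case move")
```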

For C: the answer here is no--- you can just have classical mechanics, which does not require infinite sums to resolve the answer. The idea that quantum mechanics is required for reproducing classical definite answers is strange, because it is actually mysterious how this happens: in order to produce definite results from the quantum superposition muddle, you need to assume that we are splitting entities in a many-worlds picture, or else put in definite-making laws by hand which do the same thing effectively. If nature is fundamentally classical, this problem does not arise.

Comments on the Linked Discussion

The argument Gil Kalai makes is interesting, but it is phrased poorly. Christopher Moore made the point cogently in the first of the comments here: http://rjlipton.wordpress.com/2012/01/30/perpetual-motion-of-the-21st-century/, and I do not want to repeat too much. When you propose that quantum computation will fail, you are proposing that quantum mechanics is incorrect, and that the failure occurs for highly entangled large physical systems.

The argument against quantum mechanics from the implausibility of a physical system doing an exponential computation is completely different from other arguments against quantum mechanics. The philosophical principle is that nature can't be that much more computationally rich than we are, because this would introduce a mystical element of in-principle uncomputability into large quantum systems in nature. This principle is new as far as the literature is concerned--- but it is not due to Gil Kalai. I first heard it from CS student Abram Connely a decade ago; this was his personal beef with quantum mechanics. I found it a persuasive and interesting point, and I tried to give it an exposition in my answer here: Consequences of the new theorem in QM?. The precise formulation Kalai gives is interesting, but sub-optimal.

In order to believe that quantum computation is impossible, you absolutely require a new law of physics which replaces quantum mechanics, or at least a principle that determines how quantum mechanics fails. The statement that the failure is fundamental, because the universe can't be that complicated, requires you to at least try to specify how the universe can be simplified.

It is incorrect to argue that simple implementation noise makes quantum computation infeasible without proposing a law of nature that forbids quantum-computing entanglements. The reason is that you can just remove the noise by cooling down the system and making the parts precise. There is no in-principle limit on quantum computing size, even without error correction. Quantum error correction is central to implementation in practice, but in principle you can just imagine a perfect computer, and come closer and closer to it in a colder and colder system, with no limit except how much you are willing to spend.
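A back-of-envelope version of this point, under the simplifying assumption of independent per-gate errors and no error correction: there is no threshold below which computation becomes impossible, only a per-gate error target that shrinks as the computation grows.

```python
# If each of n gates fails independently with probability eps, the run
# succeeds with probability (1 - eps)**n. Solving (1 - eps)**n = p_target
# gives the per-gate error one must engineer; it shrinks with n but
# never hits a hard floor.
def required_gate_error(n_gates, p_target=2/3):
    return 1.0 - p_target ** (1.0 / n_gates)

for n in (10**3, 10**6, 10**9):
    print(f"{n:>10,d} gates -> per-gate error below {required_gate_error(n):.2e}")
```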

A failure of quantum mechanics that only affects mutual entanglements of a large number of quantum particles could easily have escaped detection, but when proposing modifications to quantum mechanics, one must check that they do not lead to things that would not have escaped detection: failure of energy conservation, failure of few-particle coherence, irreversible information loss in few-body systems, friction in atomic motion, and all sorts of other things.

In order to check these things, it is insufficient to formulate the computational failure principle in terms of an abstract computing device. One must show how this principle modifies real atomic-scale wavefunction dynamics. The idea that this is a nonlinearity in the Schrödinger equation is just bad, so if you are proposing such a modification, it should be because the Schrödinger equation is an emergent description of a fundamentally classical system.
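For concreteness, the contrast being drawn: standard quantum evolution is linear, and a "computational failure" term would make it nonlinear. The |ψ|² form and the coupling ε below are illustrative placeholders, not any specific published proposal:

```latex
% Linear Schrodinger evolution vs. a generic nonlinear modification.
% The coupling \epsilon and the |\psi|^2 term are illustrative only.
i\hbar\,\partial_t \psi = H\psi
\qquad\longrightarrow\qquad
i\hbar\,\partial_t \psi = H\psi + \epsilon\,|\psi|^2\,\psi
```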

These ideas are due to 't Hooft, who is also skeptical of quantum computation, mostly for the same reason Einstein was skeptical of quantum mechanics. 't Hooft has made several attempts at a model of how to replace quantum mechanics with a system that is not capable of exponential computation, and if one is proposing fundamental decoherence (which is what Gil Kalai is doing), one should do so in the context of at least a speculation about the underlying substrate.
