[Physics] Is quantum mechanics intrinsically dualistic?

duality, quantum-mechanics, quantum-interpretations

In just about every interpretation of quantum mechanics, there appears to be some form of dualism. Is this inevitable or not?

In the orthodox Copenhagen interpretation of Bohr and Heisenberg, the world is split into a quantum part and a classical part. Yes, that is actually what they wrote, not a straw man. The Heisenberg cut is somewhat adjustable, though, and somewhat mysteriously so. This adjustability can also be seen in other interpretations, and it suggests the cut is unphysical, yet the cut has to appear somewhere. There is a duality between the observer and the observables.

Von Neumann postulated a two-step evolution: one the unitary Schroedinger evolution, the other a collapse upon measurement, whatever a measurement is. This is another form of the duality.
What a measurement is, and when exactly it happens, is also adjustable.

In decoherence, there is a split into the system and the environment. A split has to be made for decoherence to come out, but again, the position of the split is adjustable with almost no physical consequence.
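To make that adjustability concrete, here is a minimal numerical sketch (an illustration of my own, using a deliberately simple toy coupling, not a model from any particular paper): a system qubit in superposition couples to k environment qubits, and the off-diagonal element of the system's reduced density matrix shrinks as a power of k. How "classical" the system looks depends on where you draw the split, and the coherence never reaches exactly zero.

```python
import numpy as np

def offdiagonal_after_decoherence(k, theta=0.3):
    # Toy model: each environment qubit is rotated by +theta if the
    # system is |0> and by -theta if the system is |1>.
    e0 = np.array([np.cos(theta),  np.sin(theta)])
    e1 = np.array([np.cos(theta), -np.sin(theta)])
    overlap = e0 @ e1        # <e0|e1> = cos(2*theta) per environment qubit
    # System starts in (|0>+|1>)/sqrt(2): off-diagonal element is 1/2,
    # and tracing out k environment qubits multiplies it by overlap**k.
    return 0.5 * overlap ** k

for k in [0, 1, 5, 20, 100]:
    print(k, offdiagonal_after_decoherence(k))
# Moving the system/environment split (changing k) changes how diagonal
# the reduced density matrix is, and it is never exactly diagonal.
```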

In the many-worlds interpretation, there is the wavefunction on the one hand, and on the other a splitting into a preferred basis followed by a selection of one branch over the others. This picking out of one branch is also dualistic, an addendum over and above the wavefunction itself.

In the decoherent histories approach, there is the wavefunction on the one hand, and on the other an arbitrary choice of history operators followed by a collapse to one particular history. The choice of history operators depends upon the questions asked, and these questions stand in dual opposition to the bare wavefunction, which is itself oblivious to the questions.

In Bohmian mechanics, there is the wavefunction, and dual to it is a particle trajectory.

Why is there a duality? Can there be a nondual interpretation of quantum mechanics?

Best Answer

Some people ascribe the duality to the duality between the classical apparatus and the quantum microscopic system, but I think this is a little old-fashioned. The quantum description also works for a bad apparatus and a big apparatus--- like my eye looking at a mesoscopic metal ball with light shining on it. This situation does not measure the position of the ball, nor the momentum, nor anything precise at all. In fact, it is hard to determine exactly what operator my eye is measuring by looking at some photons.

A modern approach to quantum mechanics treats the whole system as quantum mechanical, including my eye and myself. But then the source of the dualism is made apparent. If I simulate my own wavefunction on a computer, along with that of the ball and the light (the simulation would be enormously large, but ignore that for now), where is my perception of the ball contained in the simulation?

It is not clear, because the evolution would produce an enormously large set of wavefunction values in extremely high dimension, most of which are vanishingly small, but a few of which are smeared over configurations describing one of many plausible outcomes. The linear time evolution would produce a multiplying collection of weighted configurations, but it will never contain a data bit corresponding to my experience. Yet I can introspect and find out my own experience, so this data bit is definitely accessible to me. So I can see a data bit using my mind which is not clearly extractable from this computer simulation of my mind.

The basic problem is that the knowledge in our heads is classical information; it might as well be data on a computer. But the quantum system is not made up of classical information but of wavefunction data, and wavefunction data is not classical information, nor is it a probability distribution on classical information, so it does not have an obvious interpretation as ignorance of classical information.

The reason probability is unique is that only the probability calculus has the Monte-Carlo property: if you sample the distribution and average over the time-evolution of the samples, it's the same as averaging over the time-evolution of the distribution. In quantum mechanics, samples can interfere with other samples, making the restriction to a collection of independent classical samples inconsistent. So I can't say the simulation is simulating one of many samples; at best I can say it is approximately simulating one of many clumps-of-samples corresponding to nearly completely decohered histories.
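A small numerical sketch of both halves of this claim (my example, with arbitrary two-state systems): for a classical Markov chain, sampling then evolving agrees with evolving the distribution, while for quantum amplitudes under a Hadamard gate it does not, because the two branches interfere.

```python
import numpy as np
rng = np.random.default_rng(0)

# --- classical: sample-then-evolve matches evolve-the-distribution ---
T = np.array([[0.9, 0.2],
              [0.1, 0.8]])               # column-stochastic transition matrix
p = np.array([0.5, 0.5])
samples = rng.choice(2, size=100000, p=p)
evolved = np.array([rng.choice(2, p=T[:, s]) for s in samples])
print(np.bincount(evolved) / evolved.size)   # ~= T @ p, up to sampling noise
print(T @ p)

# --- quantum: interference breaks the independent-samples picture ---
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate
psi = np.array([1, 1]) / np.sqrt(2)
print(np.abs(H @ psi) ** 2)       # [1, 0]: the two branches interfere
# Sampling a basis state first, then evolving, gives [0.5, 0.5] instead:
print(0.5 * np.abs(H @ [1, 0]) ** 2 + 0.5 * np.abs(H @ [0, 1]) ** 2)
```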

But when I entangle myself with a quantum system, using a device which entangles itself with that quantum system, I find _by doing it_ that the result is probabilistic on the classical information in my mind. The classical information is determined after the entanglement event; the result is random, with probabilities given by the Born rule, so the result is definitely a probability. But in quantum mechanics the result is at best only asymptotic to a probability.

Why Duality?

The duality in quantum descriptions is always between the linear evolution of the quantum mechanical wavefunction and the production of classical data according to a probability distribution. Wavefunctions are not probabilities, but when they produce classical data, they can only be probabilities, so they turn into probabilities. How exactly do they turn into probabilities?

This is the mismatch between the probabilistic calculus for knowledge and information and the quantum mechanical formalism for states. In order to produce probabilities from pure quantum mechanics, you have to find the proper reason why wavefunctions are linked to probabilities.
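As a concrete illustration of the mismatch (my example, not part of the original argument): the Born-rule map from amplitudes to probabilities discards relative phase, so two distinct wavefunctions can yield identical classical data, and there is no inverse map from the probabilities back to the state.

```python
import numpy as np

psi_plus  = np.array([1,  1]) / np.sqrt(2)
psi_minus = np.array([1, -1]) / np.sqrt(2)   # a different state entirely
print(np.abs(psi_plus) ** 2)                 # [0.5, 0.5]
print(np.abs(psi_minus) ** 2)                # [0.5, 0.5]: same probabilities
# Yet the two states evolve differently under further unitary evolution,
# so the classical probability data cannot stand in for the wavefunction.
```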

Each interpretation has a slightly different flavor for explaining the link, but of these, Copenhagen, many-worlds, consciousness-causes-collapse (CCC), many-minds, and decoherence/consistent-histories all place the reason in the transition to a macroscopic observer-realm. The details are slightly different--- Copenhagen has a ritualized system/apparatus/observer divide, a classical-quantum divide which looks artificial. Many-worlds has an observer's path of memories, which selects which world is observed. Many-minds too; I can't distinguish between many-minds and many-worlds, not even philosophically. I think many-minds was invented by someone who misunderstood many-worlds as being something other than many-minds. Consciousness-causes-collapse is the same as well, except it rejects the alternate counterfactual mental histories as "nonexisting" (whatever that means exactly; I can't differentiate this one from many-worlds either). Decoherence/consistent-histories insists that the path is a decoherence-consistent selection, simply a good direction in which the wavefunction has become incoherent and the density matrix diagonal, but that selection is specified outside the theory.

It's always the same dualism--- the classical data is not in the simulation, yet we can see it in our heads, and the reduction to a diagonal density matrix is only asymptotically true, when it needs to be exactly true to work.

The variables that describe our experience of the macroscopic world are discrete packets of information with a definite value, or probability distributions on such packets, which model our ignorance before we learn the value. There is nothing else out there which can describe our experience. The quantum simulation just doesn't contain these classical bits, nor does it contain anything which is exactly and precisely a classical probability distribution.

If you quantum mechanically simulate a particle in a superposition interacting with a miniature model brain, where light from the particle triggers a molecule in the brain to store the information about the position of the particle, the quantum formalism will produce a superposition of at least two different configurations of the molecule and of the brain. But at no point will it contain the actual value of the observed bit, nor a probability distribution for this value.
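Here is a toy version of that simulation (a sketch under the simplest possible assumptions: the particle and the recording molecule are each one qubit, and the recording interaction is a CNOT gate):

```python
import numpy as np

ket0 = np.array([1, 0])
ket1 = np.array([0, 1])
particle = (ket0 + ket1) / np.sqrt(2)      # superposed position bit
memory = ket0                              # blank recording molecule

state = np.kron(particle, memory)          # joint state before recording
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])            # memory copies the particle bit
state = CNOT @ state
print(state)   # [0.707, 0, 0, 0.707]: (|0>|0> + |1>|1>)/sqrt(2)
# The simulation holds two correlated record-configurations with equal
# amplitude; nowhere does it hold "the bit was 0" or "the bit was 1"
# as a definite datum, nor a probability distribution over the two.
```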

If this quantum wavefunction simulation is a proper simulation of the brain, then this internal brain has access to more information than the complete simulation contains when viewed from the outside. As far as I can see, there are exactly two possible explanations for this.

Many Worlds

The idea starts with the observation that you can't know in advance what it's supposed to feel like to be in a superposition, because what a physical phenomenon "feels like" is not part of physics. There is always a dictionary between physics and "feels like" which tells you how to match physical descriptions to experience. For example, matching light of a certain wavelength to the experience of seeing red.

If you simulate a classical brain and you copy the data in the classical brain simulation, then by querying the copies you will see that they cannot differentiate between their pasts, and they will both think they are the same person. The quantum simulation contains all sorts of things inside, and it is not clear how it feels to be one of those internal things, because that all depends on how you query them. If you query extremely unlikely components of the superposition, you can get any answer at all to any question you ask. You have to ask questions, because without a positive way to investigate the brain's feelings, there is no meaning you can assign to the assertion that it has feelings at all. And when you ask the question, you must choose which branch of the simulated quantum system to query.

So there is no obvious way to embed classical experiences into the simulation, and the many-worlds interpretation takes the point of view that it is just a perceptual axiom, like seeing red, that the way our classical minds are embedded into a quantum universe is that they feel a unique path through a decohering net of spreading quantum events. A classical mind just doesn't "feel" superposed, it can't feel superposed because feelings are classical things.

The embedding into the model is just a little off because of this, and our minds have to select a path through the diverging possible histories. The path-selection by the mind produces new classical information through time, and the duality in quantum mechanics is identified with the philosophers' mind-body duality.

Quantum mechanics is measurably wrong

I think this is the only other plausible possibility. The existence of classical data in our experience makes it philosophically preferable to have a theory which can say something about this classical data, which can interpret it as a sharp value of a quantity in the theory, rather than as a history-specification which is outside the physics of the theory. This can be philosophically preferred for two reasons:

  • It allows a physical identification of mental data with actual bits which can be extracted from the simulation, so that the definite bit values encoding our experiences are contained in a fundamental simulation directly, as they are in the classical model of the world.
  • It means that simulations of the physical world could be fully comprehended--- they are classical computations on classical data, or probability distributions which represent ensembles of classical data.

I think the only real reason to prefer such a theory is if it could describe the world with a smaller model than quantum mechanics, one which would require fewer numbers to simulate. It seems like an awful waste to require exponentially growing resources to simulate N particles, especially when the result in real life is almost always classical behavior with a state variable linear in N.
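To put rough numbers on the waste (a back-of-the-envelope sketch; the counts are the standard ones for N two-state particles, not from any specific model):

```python
# Exact quantum simulation of N two-state particles needs 2**N complex
# amplitudes; the classical description needs roughly one value per particle.
for N in [10, 30, 50, 300]:
    amplitudes = 2 ** N       # size of the wavefunction
    classical = N             # one state variable per particle
    print(f"N={N}: {amplitudes:.3e} amplitudes vs {classical} classical values")
```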

But the only way a theory can do this is if it fails to coincide with quantum mechanics at least when running Shor's algorithm. So this position is that quantum mechanics is wrong for heavily entangled many-particle systems. In this case, the dualism of quantum mechanics would exist because it is an approximation to something else deeper down which is not dual, and the approximation makes wavefunctions out of probability distributions in some unknown limit, a limit which is imperfect. So the wavefunctions are approximations to probabilities, not the other way around, and we see the real deal--- the probabilities--- because on our scale the wavefunction description is no good.

Nobody has such a theory. The closest thing is the Bohm version of quantum mechanics, which is computationally even bigger than quantum mechanics, and so even less philosophically satisfying.

It might be good even to find a half-way house, just a method of simulating quantum systems which does not require exponential resources except in those cases where you set up a quantum computer to do exponential things. Nobody has such a method either.
