I'm working on a new, much improved version of my paper. Please note that I am not a fundamentalist, as some of my critics seem to be. I don't have an open telephone line with God, like Einstein on one side and Motl on the other. What I do in my paper (which will eventually come out as a book) is simply explore the idea that the usual counterarguments against a simple, deterministic interpretation of qm can be ignored, and ask what you then get. The answer is quite interesting, but yes, one does encounter some serious problems. The most severe, technical problems one gets are totally unrelated to the usual emotional arguments against deterministic qm, so I ask: are these totally prohibitive, or is there a way out? Would that also answer the usual Motl-like objections?
The most obnoxious problem I get: how does one arrive at an effective Hamiltonian that is both bounded from below and locally defined (or: extensive)? There may be very interesting answers; one of them says that yes, the entire theory obeys complete locality - all physics is local - but the ultimate Hamiltonian of qm is non-local. This means that the phases of the wave functions of far-away particles enter the physics equations in a non-trivial way, while none of this has any effect on the predictions of qm, which, in the usual qm way, are local.
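To make the tension concrete, the two requirements can be written roughly as follows (this shorthand is mine, not a formula from the paper):

$$ H \;=\; \sum_{\vec{x}} \mathcal{H}(\vec{x}), \qquad \langle\psi|H|\psi\rangle \;\ge\; E_0 \;>\; -\infty \quad \text{for all } |\psi\rangle, $$

where each $\mathcal{H}(\vec{x})$ involves only degrees of freedom near the point $\vec{x}$. The difficulty is obtaining both properties at once from a deterministic underlying evolution.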
But that non-locality does not have to be the answer. Another possible answer I find much more interesting and natural. You know that there are lots of crackpots who claim that you can "disprove the Bell theorem". Most of those totally miss the point, but there is a way. Bell assumes that, in the initial state, the entangled particles just separating from one another, and Bob and Alice, who have not yet made their decisions, are fundamentally uncorrelated. That is because Alice and Bob must have "free will". There are two points to consider to see why this may well be wrong. One is that correlation functions do not have to vanish outside the light cone (look at QFT, but also look at simple classical systems such as liquids showing critical opalescence near the critical point); the other is directed at those who believe that only "conspiracy" could force Bob and Alice to then make the "right" decisions. No, there can be something else. If you have a deterministic underlying theory, then there are two kinds of states: the truly `ontological' ones, and the templates, which are quantum superpositions. In ordinary qm, we do not distinguish between the two, but when it comes to the question of realism, you must. Then we note that there is a simple conservation law of nature: once a state is ontological, it will stay that way forever, and a template will forever be a template. This means that, no matter what Alice and Bob decide, they will not be able to rotate their polarisers in such a way that the photons come out as superpositions of the other choices they wanted to make. They will have to rotate objects in their environment as well, so that, after changing their minds, they will again work with an ontological state.
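In formulas, the conservation law I have in mind can be sketched like this (my shorthand): the ontological states form a preferred orthonormal basis $\{|\mathrm{ont}_i\rangle\}$, and the deterministic evolution acts on that basis as a permutation,

$$ U(t)\,|\mathrm{ont}_i\rangle \;=\; |\mathrm{ont}_{P_t(i)}\rangle, $$

so an ontological state never evolves into a genuine superposition of ontological states, while a template $|\psi\rangle = \sum_i \lambda_i\,|\mathrm{ont}_i\rangle$ with more than one non-vanishing $\lambda_i$ remains a template at all times.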
Of course, Alice and Bob cannot change their settings without essential changes in their past, and, in probability terms, they might change their state into a much less (or more) likely one.
By the way, the notion of probability enters into my theory in a very simple way: it exactly corresponds to the uncertainties in the initial state, which are reflected in the use of the templates. This leads to the (EXACT) Born rule. Please wait until the improved version of my paper comes out.
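In the same shorthand, and only as a sketch of what is claimed (the details are in the paper): if all we know about the initial state is the template $|\psi\rangle = \sum_i \lambda_i\,|\mathrm{ont}_i\rangle$, then the probability that the actual ontological state is $|\mathrm{ont}_i\rangle$ is taken to be

$$ P_i \;=\; |\lambda_i|^2 , $$

and because the evolution merely permutes ontological states, this distribution is simply carried along in time, which is what reproduces the Born rule for later measurements.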
Although your statement that
it can be shown that unitary evolution of quantum systems conserves von Neumann entropy,
is indeed true, in quantum theory, and in particular in any "collapse" interpretation, the evolution of a quantum system is governed by such unitary evolution except when the system is being measured; at those times the system undergoes a non-unitary projective collapse that does not conserve von Neumann entropy and is therefore free to tamper with the information content of the world. (I'm not sure one can argue that a projective measurement either creates or destroys information. What I do think, though, is that it certainly makes the world more (or at least not less) entropic: if we collapse without knowing the answer, there's one more thing to find out about the world.)
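A minimal numerical illustration of that contrast, for a single qubit (the particular state, toy Hamiltonian, and measurement basis below are arbitrary choices made just for the demo):

```python
import numpy as np
from scipy.linalg import expm

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop numerical zeros
    return float(-np.sum(evals * np.log(evals)))

# A mixed single-qubit state (diagonal in the computational basis).
rho = np.diag([0.7, 0.3]).astype(complex)

# Unitary evolution rho -> U rho U^dagger leaves S unchanged.
H = np.array([[0.0, 1.0], [1.0, 0.0]], dtype=complex)   # toy Hamiltonian
U = expm(-1j * H * 0.37)
rho_unitary = U @ rho @ U.conj().T

# Projective measurement in the basis {|+>, |->} without reading the result:
# rho -> sum_i P_i rho P_i. This generally changes S.
plus = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2)
minus = np.array([1.0, -1.0], dtype=complex) / np.sqrt(2)
projectors = [np.outer(v, v.conj()) for v in (plus, minus)]
rho_measured = sum(P @ rho @ P for P in projectors)

print(von_neumann_entropy(rho))           # ~0.6109
print(von_neumann_entropy(rho_unitary))   # same value: unitarity conserves S
print(von_neumann_entropy(rho_measured))  # ~0.6931 = ln 2: entropy increased
```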
Best Answer
Probability theory. Evidence: when physicists do quantum measurements they find that the results of individual runs are unpredictable. Only the frequencies over many runs are predictable, and those match the theoretical predictions of quantum mechanics.
During a quantum measurement (a system S being measured by an apparatus A) the complete system S+A, viewed at the microscopic level, undergoes unitary evolution. During that evolution the system S becomes entangled with the apparatus A. However, by experimental design, this entanglement, when viewed as a macroscopic approximation, is seen to have some simplifying features (a small numerical sketch follows the list):
a. The apparatus is in a mixed state of pointer states
b. The possible eigenvectors of some observable of S have coupled to the pointer states
c. Off-diagonal "interference" terms have become suppressed by decoherence due to the many internal degrees of freedom of A.
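A toy numerical sketch of point (c) (the model and the numbers are illustrative assumptions, not taken from a reference): let S be a qubit and let each of n apparatus/environment qubits be rotated by a small angle depending on the state of S; the off-diagonal element of the reduced density matrix of S then carries a factor $\langle A_0 | A_1 \rangle$ that dies off exponentially in n.

```python
import numpy as np

# System S: a qubit measured in the {|0>, |1>} basis, with initial amplitudes c0, c1.
c0, c1 = np.sqrt(0.7), np.sqrt(0.3)

# Apparatus/environment A: n qubits. Pointer state A0 = all qubits left in |0>;
# pointer state A1 = every qubit rotated by a small angle eps, so each qubit
# contributes a factor cos(eps) to the overlap <A0|A1>.
eps = 0.1

def reduced_rho_S(n):
    """Reduced density matrix of S for the post-measurement state
    c0|0>|A0> + c1|1>|A1>, with the n apparatus qubits traced out."""
    overlap = np.cos(eps) ** n          # <A0|A1>, shrinks exponentially with n
    return np.array([[abs(c0)**2,                 c0 * np.conj(c1) * overlap],
                     [np.conj(c0) * c1 * overlap, abs(c1)**2]])

for n in (1, 100, 10_000):
    print(n, abs(reduced_rho_S(n)[0, 1]))
# n = 1     -> ~0.46   (interference terms still visible)
# n = 100   -> ~0.28   (partially suppressed)
# n = 10000 -> ~8e-23  (effectively diagonal: a mixture of the two pointer states)
```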
Owing to the special nature of these pointer states of A (from OP "some many-particle systems may well be approximated as classical and can store the information of measurement outcomes") we now have an objective fact about our universe.
Only one of the pointer states has in fact actually occurred in our universe (we can make this statement whether or not a physicist actually reads the pointer and discovers which universe we are actually in).
We can then infer that, for this particular run of the S+A interaction, the system S in fact belongs to the subensemble giving rise to the occurrence of this pointer state. We can make this reduction of the original ensemble based on this objective information about our universe. Restricted to this subensemble, we still have unitary evolution when viewed at the exact microscopic level.
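One compact way to write that reduction (standard notation, added here only to make the step explicit): if $\Pi_i$ projects onto the pointer state that actually occurred, the ensemble described by $\rho_{S+A}$ is conditioned on that objective fact,

$$ \rho_{S+A} \;\longrightarrow\; \frac{(\mathbb{1}_S \otimes \Pi_i)\,\rho_{S+A}\,(\mathbb{1}_S \otimes \Pi_i)}{\operatorname{Tr}\!\big[(\mathbb{1}_S \otimes \Pi_i)\,\rho_{S+A}\big]} , $$

which on this reading is Bayesian updating of the ensemble rather than a new dynamical process.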
Disclaimer: I don't know whether this really makes any sense, but this is what the reference referred to by OP seems to be saying.
No, I think not. Here is the confusion: having banished the need for explicit wave function collapse from the QM formalism, it seems that all we are left with is deterministic unitary evolution of the wavefunction of our closed system; hence, surely, QM is deterministic. But no: the indeterminism in the outcomes of measurements is still present in the wavefunction.
In fact the QM formalism tells us precisely when it is able to be deterministic and when it is not: it is deterministic whenever the quantum state is an eigenvector of the operator corresponding to the measurement in question. Remarkably, from this one postulate it is possible to derive that quantum mechanics is probabilistic (i.e. we can derive the Born rule).
Explicitly, we can show that it is deterministic that, if the evolution of S+A is run $N$ times (with $N \rightarrow \infty$), then the frequencies of the different results will follow precisely the Born rule probabilities. However, for a single run there is no such determinism; it is only determined that there will be an outcome.
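A sketch of the argument I have in mind (the standard "frequency operator" derivation; I believe this is what is being alluded to, but check the references for the details and caveats): write the single-run state as $|\psi\rangle = \sum_k c_k |k\rangle$ and consider $N$ identical runs in the product state $|\psi\rangle^{\otimes N}$. The operator measuring the frequency of outcome $k$ is

$$ F_N^{(k)} \;=\; \frac{1}{N}\sum_{j=1}^{N} \mathbb{1}^{\otimes (j-1)} \otimes |k\rangle\langle k| \otimes \mathbb{1}^{\otimes (N-j)} , $$

and a short computation gives

$$ \Big\|\big(F_N^{(k)} - |c_k|^2\big)\,|\psi\rangle^{\otimes N}\Big\|^2 \;=\; \frac{|c_k|^2\,\big(1-|c_k|^2\big)}{N} \;\xrightarrow{\,N\to\infty\,}\; 0 , $$

so in the limit the $N$-run state becomes an eigenvector of the frequency operator with eigenvalue $|c_k|^2$, and the "eigenvector $\Rightarrow$ deterministic" postulate above then fixes the observed frequencies to be the Born rule values.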
This approach to QM is described by Arkani-Hamed here.
Edit
For a more advanced discussion of these ideas I recommend Is the statistical interpretation of Quantum Mechanics dead?