No, the reasoning is wrong. When you need an N-qubit state, entangled or not, you just initialize N qubits and then apply a unitary (hence reversible) transformation. So the entropy produced during initialization is proportional to the number of qubits N, with no exponential factor.
For an explanation of the perennial question "how many states do N qubits have" (and a refutation of the associated misconceptions), see How many states can a n qubit quantum computer store?, although several other answers on this site make nearly the same point about superpositions/linear combinations.
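The point can be sketched numerically. Below is a minimal illustration (my own toy example, not from the answer above): preparing an entangled N-qubit state is just one unitary acting on |0...0>, and that unitary is exactly reversible.

```python
import numpy as np

# Sketch, assuming N = 3 and a GHZ target state (|000> + |111>)/sqrt(2).
N = 3
I = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                # controlled-NOT gate

def kron(*ops):
    """Tensor product of a sequence of operators."""
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

# Circuit: H on qubit 0, then CNOT(0 -> 1), then CNOT(1 -> 2).
# Matrix products compose right-to-left, so H is applied first.
U = kron(I, CNOT) @ kron(CNOT, I) @ kron(H, I, I)

psi0 = np.zeros(2**N)
psi0[0] = 1.0                                  # |000>
psi = U @ psi0                                 # the entangled GHZ state

print(np.round(psi, 3))
# U is unitary, hence the preparation is reversible:
print(np.allclose(U.conj().T @ U, np.eye(2**N)))  # True
```

The preparation cost here is one fixed-size circuit per qubit pair, linear in N, even though the state vector itself lives in a 2^N-dimensional space.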
Is quantum mechanics on a measurement level a deterministic theory or a probability theory?
Probability theory. Evidence: when physicists perform quantum measurements, they find the results of individual runs are unpredictable. Only the frequencies over many runs are predictable, and these match the theoretical results of quantum mechanics.
How can this possibly be consistent with unitarity as described above?
During a quantum measurement (measuring a system S by an apparatus A) the complete system S+A, viewed at the microscopic level, undergoes unitary evolution. During that evolution the system S becomes entangled with the apparatus A. However, by experimental design, this entanglement, viewed in the macroscopic approximation, is seen to have some simplifying features:
a. The apparatus is in a mixed state of pointer states
b. The possible eigenvectors of some observable of S have coupled to the pointer states
c. Off-diagonal "interference" terms have become suppressed by decoherence due to the many internal degrees of freedom of A.
Owing to the special nature of these pointer states of A (from OP "some many-particle systems may well be approximated as classical and can store the information of measurement outcomes") we now have an objective fact about our universe.
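The suppression of the off-diagonal terms in point c. can be made quantitative in a toy model (my own illustration, with a made-up coupling angle `theta`): a system qubit couples to n apparatus qubits, each of which imperfectly records the system's state, and the interference term in the system's reduced density matrix shrinks exponentially with n.

```python
import numpy as np

# Sketch: system starts in (|0> + |1>)/sqrt(2); after coupling to n
# apparatus qubits the joint state is (|0>|a0>^n + |1>|a1>^n)/sqrt(2).
# The off-diagonal term of the system's reduced density matrix equals
# <A_0|A_1>/2 = cos(theta)^n / 2, which vanishes as n grows.

def offdiag_after_coupling(n, theta=0.3):
    a0 = np.array([1.0, 0.0])                      # pointer state for |0>
    a1 = np.array([np.cos(theta), np.sin(theta)])  # pointer state for |1>
    overlap = np.dot(a0, a1) ** n                  # <A_0|A_1> = cos(theta)^n
    return abs(overlap) / 2

for n in (1, 10, 100):
    print(n, offdiag_after_coupling(n))
# The interference term decays exponentially with the number of
# apparatus degrees of freedom: decoherence in miniature.
```

This is exactly why "many internal degrees of freedom of A" matters: each extra degree of freedom multiplies the pointer-state overlap by another factor less than one.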
Only one of the pointer states has in fact actually occurred in our universe (we can make this statement whether or not a physicist actually reads the pointer and discovers which universe we are actually in).
We can then make the inference that for this particular run of the S+A interaction, the system S in fact belongs to the subensemble giving rise to the occurrence of this pointer state. We can make this reduction of the original ensemble based on this objective information about our universe. Restricted to this subensemble, we still have unitary evolution when viewed at the exact microscopic level.
Disclaimer: I don't know whether this really makes any sense, but this is what the reference referred to by OP seems to be saying.
Follow-up question: so can we say QM is a probability theory for practical purposes but deterministic in principle?
No, I think not. Here is the confusion: having banished the need for explicit wave function collapse from the QM formalism, it seems that all we are left with is deterministic unitary evolution of the wavefunction of our closed system. Hence, surely, QM is deterministic. But no: the indeterminism in the outcome of measurements is still present in the wavefunction.
In fact, the QM formalism tells us precisely when it is able to be deterministic and when it is not: it is deterministic whenever the quantum state is an eigenvector of the operator corresponding to the measurement in question. Remarkably, from this one postulate it is possible to derive that quantum mechanics is probabilistic (i.e. we can derive the Born rule).
Explicitly, we can show that it is deterministic that if the evolution of S+A is run $N$ times (with $N \rightarrow \infty$), then the frequencies of the different results will follow precisely the Born rule probabilities. However, for a single run there is no such determinism; it is only determined that there will be an outcome.
This approach to QM is described by Arkani-Hamed here.
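This frequency statement is easy to see numerically. A minimal sketch (my own toy state, sampled with a pseudorandom generator standing in for nature's randomness): single-run outcomes are unpredictable, but the frequencies over many runs converge to the Born probabilities $|\langle k|\psi\rangle|^2$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy state |psi> = sqrt(0.2)|0> + sqrt(0.8)|1>
psi = np.array([np.sqrt(0.2), np.sqrt(0.8)])
p = np.abs(psi) ** 2                            # Born probabilities [0.2, 0.8]

for N in (10, 1000, 100000):
    outcomes = rng.choice([0, 1], size=N, p=p)  # N independent runs
    freq = np.mean(outcomes == 1)
    print(N, freq)                              # frequency of outcome 1 -> 0.8
```

For small N the observed frequency scatters widely; as N grows it pins down to 0.8, while any individual entry of `outcomes` remains unpredictable.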
Edit
For a more advanced discussion of these ideas I recommend Is the statistical interpretation of Quantum Mechanics dead?
Best Answer
It's pretty simple, and there have been various questions on this site that have had this discussion. It does get controversial, but the physics is straightforward.
The issue is: is the evolution of the wave function, or the state of a system, determined uniquely by its initial conditions and the unitary operator that, in quantum mechanics (or equivalently in quantum field theory), serves as its time evolution operator? The answer is obviously yes: the wave function or system state evolves uniquely in time according to that operator. The evolution is deterministic as far as the quantum state is concerned.
This is labeled unitary time evolution. It means that the quantum information that defines the initial state is not 'lost', but rather simply evolved into the information that defines the evolved state. Quantum states evolve deterministically if they are pure states.
In simple terms it means that quantum theory follows causal laws. Causality is not broken.
There is nothing controversial there. Wave function or quantum state evolution is perfectly deterministic. What happens with statistical mixtures of pure states is statistical mechanics, and that does not contradict the determinism; it only reflects the practical limits on it due to the large number of states and interactions.
The issue comes up when you measure some observable of the state. What exactly you will get is then determined only probabilistically. It is this latter fact that has led quantum theory to be labeled probabilistic. In doing a measurement you place the system in one of the eigenstates of the observable's operator, and it is well known how to compute the probability of measuring any specific eigenvalue. That is what is meant by saying quantum theory is not deterministic.
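The "well known" computation is just the Born rule applied to the observable's eigendecomposition. A minimal sketch (my own toy observable and state): each eigenvalue occurs with probability $|\langle k|\psi\rangle|^2$, and the measurement leaves the system in the corresponding eigenstate.

```python
import numpy as np

S = np.array([[0.0, 1.0], [1.0, 0.0]])          # toy observable (Pauli-x)
psi = np.array([np.cos(0.4), np.sin(0.4)])      # normalized state to measure

vals, vecs = np.linalg.eigh(S)                  # eigenvalues and eigenvectors
probs = np.abs(vecs.conj().T @ psi) ** 2        # Born rule: |<k|psi>|^2
for val, prob in zip(vals, probs):
    print(f"outcome {val:+.0f}: probability {prob:.3f}")
# Probabilities sum to 1; after measuring eigenvalue vals[k], the
# post-measurement state is the eigenvector vecs[:, k].
```

The probabilities are fixed deterministically by the state; only which outcome occurs on a given run is not.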
Note that even then, the quantum state has evolved deterministically; it is only when you measure, or decohere the system by interacting with it through a large number of degrees of freedom, that you get a classical average value with a variance around it.
So if you want to determine classical observables, which means you have to measure rather than simply let the quantum state go its own way, you get probabilistic results with the quantum uncertainties given by the uncertainty principle for the various observable pairs. But that does not mean the state did not evolve in a perfectly unitary and causal way given by the laws of quantum theory. Sometimes it is loosely said that the wavefunction collapsed into its one observed classical value. It could have been another; which one was determined probabilistically.
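The uncertainty for observable pairs mentioned above can be checked numerically via the Robertson relation $\Delta A\,\Delta B \ge |\langle [A,B] \rangle|/2$. A small sketch (my own choice of the pair $\sigma_x, \sigma_y$ and a random qubit state):

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])

def stddev(op, psi):
    """Standard deviation of observable `op` in state `psi`."""
    mean = np.vdot(psi, op @ psi).real
    mean_sq = np.vdot(psi, op @ op @ psi).real
    return np.sqrt(mean_sq - mean ** 2)

rng = np.random.default_rng(1)
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)                      # normalized random state

lhs = stddev(sx, psi) * stddev(sy, psi)         # product of uncertainties
comm = sx @ sy - sy @ sx                        # [sx, sy] = 2i sz
rhs = abs(np.vdot(psi, comm @ psi)) / 2         # Robertson lower bound
print(lhs >= rhs - 1e-12)                       # True for any state
```

The bound holds for every state; it quantifies the irreducible spread in measured values that remains even though the state itself evolved deterministically.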
That quantum information defined by the state of the system before you measure it, i.e. before you (or anything else) interacts with it, is the quantum information that cannot be lost or destroyed. It can be modified only by the deterministic time evolution operator (and of course by interactions with other particles or fields, which would then be represented in the unitary time evolution operator). That quantum information could also be quantum numbers that are conserved in various interactions - for instance total energy, spin, lepton number, fermion number, and others - in those cases, given by what quantities are conserved by the various SM forces.
Now, there is a Black Hole Information paradox, which arises because when particles with specific quantum numbers or quantum states disappear inside a black hole, you can never get them back, and the equivalent information is lost: after the black hole evaporates via Hawking radiation, it is simply gone. Quantum theory says that is impossible; hence the paradox. There has been plenty of discussion and work on it, but no definitive resolution - it will probably have to await a well-accepted quantum gravity theory. Most physicists probably believe that there is a deeper solution, and that quantum theory's causality, and the information, will be preserved.
See the article at Wikipedia https://en.m.wikipedia.org/wiki/Black_hole_information_paradox
So yes, quantum information conservation and quantum state determinism (or quantum causality) are the same thing.