History Overview – John von Neumann’s Remark on Entropy

Tags: ho.history-overview, it.information-theory, quantum-mechanics, soft-question

According to Claude Shannon, von Neumann gave him very useful advice on what to call his measure of information content [1]:

My greatest concern was what to call it. I thought of calling it 'information,' but the word was overly used, so I decided to call it 'uncertainty.' When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage.'

What I am curious about is what von Neumann meant by his last point. I find it particularly surprising given that he had axiomatised what we now call the von Neumann entropy in quantum mechanics around two decades before Shannon developed classical information theory.

Might modern information theorists know of the specific difficulties he had in mind and whether these have been suitably addressed?

References:

  1. Tribus, Myron and McIrvine, Edward C. (1971). "Energy and Information". Scientific American 225(3): 179–190.
  2. von Neumann, John (1932). Mathematische Grundlagen der Quantenmechanik (Mathematical Foundations of Quantum Mechanics). Princeton University Press. ISBN 978-0-691-02893-4.
  3. Shannon, Claude E. (1948). "A Mathematical Theory of Communication". Bell System Technical Journal 27: 379–423. doi:10.1002/j.1538-7305.1948.tb01338.x.
  4. Rioul, Olivier (2018). "This is IT: A Primer on Shannon's Entropy and Information". Séminaire Poincaré.
  5. Jaynes, E. T. (1957). "Information Theory and Statistical Mechanics". The Physical Review.
  6. Wheeler, John A. (1990). "Information, Physics, Quantum: The Search for Links". In W. Zurek (ed.), Complexity, Entropy, and the Physics of Information. Redwood City, CA: Addison-Wesley.

Best Answer

An alternative version of von Neumann's quote says "no one understands entropy very well". At the intuitive level, this makes sense: it is much harder to explain the concept of entropy to a novice than it is to explain energy.

One debate that existed in von Neumann's time$^1$ and continues today$^2$ is whether the information-theoretic entropy $S=-{\rm tr}\,\rho\log\rho$ equals the physical (thermodynamic) entropy. This may or may not have been what von Neumann was thinking about when he made the remark to Shannon, but it is one documented source of confusion.
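
For readers less familiar with the formula, here is a minimal numerical sketch of how $S=-{\rm tr}\,\rho\log\rho$ is evaluated in practice: diagonalise the density matrix and take the Shannon entropy of its eigenvalues (in nats). Python, NumPy, and the function name `von_neumann_entropy` are my own illustrative choices, not anything from the original question or answer.

```python
import numpy as np

def von_neumann_entropy(rho: np.ndarray) -> float:
    """Von Neumann entropy S = -tr(rho log rho), in nats.

    For rho = sum_i p_i |i><i|, S reduces to the Shannon entropy
    -sum_i p_i log p_i of the eigenvalue distribution.
    """
    eigvals = np.linalg.eigvalsh(rho)      # rho is Hermitian
    eigvals = eigvals[eigvals > 1e-12]     # drop numerical zeros (0 log 0 := 0)
    return float(-np.sum(eigvals * np.log(eigvals)))

# Maximally mixed qubit: S = ln 2 (maximal uncertainty for one qubit)
rho_mixed = np.eye(2) / 2
print(von_neumann_entropy(rho_mixed))  # ~0.6931

# Pure state: S = 0 (no uncertainty)
psi = np.array([[1.0], [0.0]])
rho_pure = psi @ psi.conj().T
print(von_neumann_entropy(rho_pure))   # ~0.0
```

Whether this quantity, so computed, is the same thing a thermometer and calorimeter measure is precisely the debate referred to above.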


$^1$ "In the 1950's Jaynes told Wigner that physical entropy is a measure of information and Wigner thought that was absurd, because the information one person possesses differs from that of another, whereas entropy can be measured with thermometers and calorimeters." [source]

$^2$ A Man Misunderstood: Von Neumann did not claim that his entropy corresponds to the phenomenological thermodynamic entropy (2007).