[Physics] Is the amount of entropy subjective?

entropy, information, thermodynamics

From every source I have seen, it follows that the proof that the amount of entropy in the Universe cannot decrease is only statistical – an ordered arrangement is just one of the many ways things can be (with the exception of entropy defined purely in terms of energy/temperature; that case is clear).
So my first question is: is the rule that entropy always increases valid for anything other than entropy defined as the degree of balance of energy in the Universe?
The only way out of that, I think, is to define information as a physical quantity. Then we would know by how much the entropy increased.

Subset paradox

I have read this answer, which defines information as the minimum number of YES/NO questions you have to ask to perfectly specify the object that carries the information. But this would mean that everything (including every subset or superset, which is impossible, as the picture shows) carries the same amount of information – for example, if the only describable physical quantities were position and weight, my question for everything could be: "Is it true that it is there and it weighs that?"

Now, let's consider a closed system consisting only of three particles.
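To make the YES/NO-question definition concrete, here is a small sketch. The numbers (8 positions, 4 possible weights) are my own illustrative assumptions, not from the linked answer; the point is only that the minimum number of binary questions is the base-2 logarithm of the number of distinguishable states:

```python
import math

# Illustrative toy model (assumed numbers, not from the linked answer):
# a particle can sit at one of 8 positions and have one of 4 weights.
positions = 8
weights = 4
microstates = positions * weights  # 32 equally likely states

# The minimum number of YES/NO questions needed to single out one state
# is log2 of the number of states, rounded up (each answer halves the set).
questions = math.ceil(math.log2(microstates))
print(questions)  # 5 questions suffice for 32 states
```

On this counting, the information content depends entirely on how many states you consider distinguishable in the first place, which is exactly where the subjectivity sneaks in.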

Also, following this definition of information, it would be subjective what has more entropy – if I alphabetically order my books, have I increased entropy more through the change in the balance of energy in the room than I decreased it by ordering the books?

So how does one define information correctly?
(Btw, this blew my mind – if the system had no spin, polarisation, or local imbalance (say, the electron sitting more to one side), I would have no idea how to describe the particles' positions in an otherwise empty universe other than: "It's here.")

Best Answer

Entropy is subjective in the sense that you get to pick which macroscopic observables you care about keeping track of (usually, for instance, you care about things like temperature, pressure, etc.). Once you've defined the macroscopic observables, entropy is defined as the logarithm of the number of possible microstates that give rise to those macroscopic observables. If you define your "macroscopic" observables to be the position and momentum of every particle in your system (ignore, for the moment, the fact that quantum mechanics says that it is impossible to observe both at the same time), then the entropy is zero.

If you include quantum mechanics, then there is inherent uncertainty about the trajectory through Hilbert space of you and the system you're interested in, so the entropy of any system you're interested in can never be zero regardless of how you define your "macroscopic" observables.
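The dependence on the chosen observables can be sketched numerically. In this toy model (my own illustrative setup, not from the answer above), the same system of ten two-state particles has nonzero entropy when you only track the total number of "up" states, and zero entropy when your "macroscopic" observable is the full microstate:

```python
import math

# Toy system: N two-state particles (up/down).
N = 10

# Coarse-graining A: the observable is just the total number of "up"
# particles. Entropy (in units of k_B) = ln(number of compatible microstates).
k_up = 5
omega_A = math.comb(N, k_up)   # 252 microstates share this macrostate
S_A = math.log(omega_A)

# Coarse-graining B: the observable is the exact state of every particle.
# Only one microstate is compatible, so the entropy vanishes.
omega_B = 1
S_B = math.log(omega_B)

print(round(S_A, 3), S_B)  # 5.529 0.0
```

Same system, same physics; the entropy differs only because the two observers chose different macroscopic descriptions.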

So to answer your question about rearranging your books: if the only macroscopic observable you care about is whether or not your books are alphabetized, then the entropy defined in terms of that macroscopic observable would decrease slightly when you arrange your books, although it would only decrease by an absurdly small amount because most of the entropy of your system of books comes from the microstates of the molecules in the books themselves, and not from the positions of the books. Furthermore, defining such an arbitrary macroscopic observable would really only be useful for your particular case and would not generalize to other systems of interest, which defeats the entire purpose of developing physical equations that can produce repeatable predictions in many different systems of interest.
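The "absurdly small amount" can be estimated with back-of-the-envelope numbers (the book count, mass, and molar entropy below are my own rough assumptions, not figures from the answer). Alphabetizing removes at most the ln(N!) uncertainty in the books' ordering:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Entropy removed by fixing the order of N books: k_B * ln(N!)
N_books = 100
dS_books = k_B * math.log(math.factorial(N_books))  # ~5e-21 J/K

# Rough molecular entropy of the books themselves: assume a few kg of
# paper, ~100 mol of material at ~100 J/(K*mol) standard molar entropy.
S_molecular = 100 * 100  # ~1e4 J/K

print(dS_books / S_molecular)  # ratio on the order of 1e-25
```

Even with generous assumptions, the configurational entropy of the book ordering is some twenty-five orders of magnitude below the entropy stored in the books' molecules.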
