[Physics] How do different definitions of entropy connect with each other

cosmology · entropy · information · statistical mechanics · thermodynamics

In many places over the Internet, I have tried to understand entropy.

Many definitions are presented, among which I can formulate four (please correct me if any definition is wrong):

  1. Entropy = disorder, and systems tend toward the greatest possible disorder
  2. Entropy = energy distribution, and systems tend toward the most even possible distribution of energy
  3. Entropy = information needed to describe the system, and systems tend to be describable in fewer lines
  4. Entropy = statistical mode, and the system tends to go to a microscopic state that is one of the most abundant possible states it can be in.

Now, I have these contrary examples in my mind:

  1. Disorder => how about a snowflake? What is disorder? How do we agree on what is ordered and what is disordered? To me, a snowflake is a perfect example of order.
  2. Energy distribution => then why did the Big Bang happen at all? As they say, the universe was one tiny point with energy distributed equally. Now the universe is an alternation of energy-dense regions and voids.
  3. Information => we could describe the universe before the Big Bang in one simple sentence: an energy point at X kelvin. But we need billions upon billions of lines of description to describe the present universe.
  4. Mode => again, before the Big Bang, or even in the early epochs of the universe, we had uniform states that were the most abundant possible states.

I'm stuck at this very fundamental philosophical definition.

I can understand the "cup of coffee" example, of course, or the "your room gets messy over time" example. Those are very clear. But I'm stuck on the examples above. Can you clarify this for me, please?

Best Answer

Your concern about the many definitions of entropy is well-founded. Unfortunately, there is an embarrassing amount of confusion on this issue, even in the scientific literature. The answers you may find on the SE sites simply mirror this state of affairs.

The short answer is that there is nothing like a unique concept of entropy. There are many different but related concepts, which could well have been named differently. They all have some direct or indirect relation with the thermodynamic entropy, although they usually do not coincide with it without additional assumptions.

A partial list of different concepts, all named entropy, contains:

  1. Thermodynamic entropy.
  2. Dynamical system entropy.
  3. Statistical mechanics entropy.
  4. Information theory entropy.
  5. Algorithmic entropy.
  6. Quantum mechanics (von Neumann) entropy.
  7. Gravitational (and Black Holes) entropy.

Although all these quantities are named entropy, they are not entirely equivalent. A schematic list of the systems to which they can be applied, together with some of their mutual relations, could help organize a mental map of such a confusing conceptual landscape.

Let me add a preliminary disclaimer. I am not going to write a comprehensive treatise on each possible entropy. The list is intended as an approximate map. However, even if I may be missing some important relations (I do not claim to be an expert on every form of entropy!), the overall picture should be correct, and it should give an idea of the generic non-equivalence between different entropies.

1. Thermodynamic entropy

It can be applied to macroscopic systems at thermodynamic equilibrium, or even to non-equilibrium systems, provided some sort of local thermodynamic equilibrium (LTE) can be justified for small regions of the system. LTE requires that each subregion is large enough to neglect the effect of relative fluctuations (local thermodynamic quantities have to be well defined), and that the relaxation times are shorter than the typical dynamical evolution times. Usual thermodynamics requires the possibility of controlling the work and heat exchanged by the system, and it crucially depends on some underlying microscopic dynamics able to drive the system towards equilibrium.
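
For reference, the classical (Clausius) definition for a reversible transformation is $$ dS = \frac{\delta Q_{rev}}{T}, $$ so entropy differences are obtained by integrating the exchanged heat over the temperature along a reversible path connecting two equilibrium states.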

2. Dynamical system entropy

The present item, like some of the others, should really contain a sublist. Under this name, one can find entropies for abstract dynamical systems (for example, the metric entropy introduced by Kolmogorov and Sinai) and for continuous chaotic dynamical systems. Here, the corresponding entropy does not require an equilibrium state, and recent proposals for non-equilibrium entropies (an example is here) can be classified under this title.
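
As a sketch of one representative definition in this family: for a measure-preserving transformation $T$ and a finite partition $\xi$ of the phase space, the Kolmogorov-Sinai (metric) entropy is $$ h_{KS}(T) = \sup_{\xi}\;\lim_{n\to\infty}\frac{1}{n}\,H\!\left(\bigvee_{i=0}^{n-1}T^{-i}\xi\right), $$ where $H$ is the Shannon entropy (item 4 below) of the refined partition with respect to the invariant measure. It measures how fast the dynamics generates information, with no reference to equilibrium.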

3. Statistical mechanics entropies

Initially, they were introduced in each statistical mechanics ensemble to provide a connection to the thermodynamic concept. In principle, there is a different entropy for each ensemble. These different expressions coincide, for a broad class of Hamiltonians, only in the so-called thermodynamic limit (TL), i.e., for systems with a macroscopically large number of degrees of freedom. Notice that the Hamiltonians have to satisfy some conditions for the TL to exist. Apart from the coincidence of entropies in different ensembles, the TL is also required to ensure that the statistical mechanics entropies satisfy some key properties of the thermodynamic entropy, such as convexity and extensiveness. Therefore, one could say that the statistical mechanics entropy is a generalization of the thermodynamic entropy, rather than being equivalent to it.
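
Two standard expressions, quoted here only as an illustration, are the microcanonical (Boltzmann) and the Gibbs ensemble entropies, $$ S_B = k_B \ln \Omega(E), \qquad S_G = -k_B\sum_i p_i \ln p_i, $$ where $\Omega(E)$ is the number of microstates compatible with energy $E$ and $p_i$ is the ensemble probability of microstate $i$. Only in the TL, and for a suitable class of Hamiltonians, do such expressions coincide with each other and with the thermodynamic entropy.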

4. Information theory entropy

This entropy is the well-known Shannon's formula $$ S_{info}= -\sum_i p_i \log p_i $$ where $p_i$ are the probabilities of a complete set of events.

It is clear that $S_{info}$ requires only a probabilistic description of the system. There is no requirement of thermodynamic equilibrium or of an energy associated with each state, and in general no connection exists with work and heat. $S_{info}$ could be considered a generalization of the statistical mechanics entropy, coinciding with it only in the case of an equilibrium probability distribution of the thermodynamic variables. However, $S_{info}$ can be defined even for systems without any intrinsic dynamics.
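
A minimal numerical illustration of Shannon's formula (the function name and the example distributions are mine, not part of the answer):

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy -sum_i p_i log(p_i) of a discrete distribution.

    `probs` must be non-negative and sum to 1; zero-probability events
    contribute nothing (the limit p log p -> 0 is used).
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries 1 bit of entropy; a strongly biased coin carries much less.
print(shannon_entropy([0.5, 0.5]))    # 1.0
print(shannon_entropy([0.99, 0.01]))  # ~0.081
```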

5. Algorithmic entropy

In the present list, it is the only entropy that can be assigned to an individual (microscopic) configuration. Its definition does not require large systems, a probability distribution, intrinsic dynamics, or equilibrium. It is a measure of the complexity of a configuration, expressed by the length of its shortest description.

The relation between algorithmic entropy and information entropy is that, given an ensemble of configurations, the ensemble average of the algorithmic entropy provides a good estimate of the information entropy. However, one has to take into account that the algorithmic entropy is a non-computable function.
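
Since the shortest description is non-computable, in practice one can only bound the algorithmic entropy from above, for instance by the length of a compressed encoding. A rough sketch of this idea (the use of `zlib` as a stand-in for the shortest description is my choice, not part of the answer):

```python
import random
import zlib

def compressed_length(config: bytes) -> int:
    """Upper bound on the algorithmic entropy of a single configuration:
    the length of one particular (far from optimal) compressed description.
    The true shortest description is non-computable."""
    return len(zlib.compress(config, 9))

ordered = b"01" * 500                                           # highly regular configuration
random.seed(0)
disordered = bytes(random.getrandbits(8) for _ in range(1000))  # random configuration

print(compressed_length(ordered))     # small: the pattern has a short description
print(compressed_length(disordered))  # close to 1000 bytes: hardly compressible
```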

6. Quantum mechanics (von Neumann) entropy

Although formally different, it can be considered a generalization of Shannon's ideas to the description of a quantum system. Here, too, concepts like thermal equilibrium or heat do not play any role.
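
A minimal sketch of the standard von Neumann expression $S = -\mathrm{Tr}\,(\rho \ln \rho)$, evaluated from the eigenvalues of a density matrix (the example states are mine):

```python
import numpy as np

def von_neumann_entropy(rho: np.ndarray) -> float:
    """S = -Tr(rho ln rho), computed from the eigenvalues of rho.
    `rho` must be a density matrix: Hermitian, positive semi-definite, unit trace."""
    eigenvalues = np.linalg.eigvalsh(rho)
    eigenvalues = eigenvalues[eigenvalues > 1e-12]  # use the limit 0 ln 0 -> 0
    return float(-np.sum(eigenvalues * np.log(eigenvalues)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])   # pure state: zero entropy
mixed = np.eye(2) / 2                       # maximally mixed qubit: ln 2
print(von_neumann_entropy(pure))   # 0.0
print(von_neumann_entropy(mixed))  # ~0.693
```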

7. Gravitational (and Black Holes) entropies

A set of stars in a galaxy can be thought of as a system, at least in LTE. However, its thermodynamics is quite peculiar. First of all, it is not extensive (the energy grows faster than the volume). The equivalence of ensembles does not hold, and it is well known that the microcanonical specific heat is negative. A similar, but not precisely equal, behavior is found for the Black Hole entropy proposed by Bekenstein. In this case, the quantity that plays the role of entropy is the area of the event horizon of the Black Hole. Although it has been shown that this entropy shares many properties of the thermodynamic entropy, and it can be evaluated within String Theory by counting the degeneracy of suitable states, its connection with the thermodynamic entropy remains to be established.
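
For reference, the Bekenstein-Hawking expression is $$ S_{BH} = \frac{k_B c^3 A}{4 G \hbar} = k_B\,\frac{A}{4\,\ell_P^2}, $$ where $A$ is the horizon area and $\ell_P$ the Planck length. The proportionality to an area rather than a volume is one way to see the non-extensive character of gravitational entropies.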

What about disorder?

It remains to discuss the relation between entropies (plural) and disorder.

It is possible to associate with each entropy a specific concept of disorder. But it is easy to guess that, in general, it won't be the same for all of them.

The only disorder associated with thermodynamic entropy is the disorder connected to how the extensive quantities are distributed among the different subsystems of the same macroscopic state. Within thermodynamics, a well-ordered macroscopic state is one where the extensive quantities are spatially concentrated. Maximum disorder coincides with the extensive state variables being spread out so as to ensure the same temperature, pressure, and chemical potential in each subvolume.

Within classical statistical mechanics, one can associate disorder with the number of available microstates in phase space. Notice, however, that this disorder, in general, has nothing to do with the usual notion of spatial order. The reason is connected with the non-intuitive role of inter-particle interactions and with the fact that the statistical mechanics entropy is related to counting the number of microstates.

Probably, the entropy with the closest connection to the usual meaning of disorder is the algorithmic entropy. But it is also the most difficult to evaluate and the farthest from the thermodynamic entropy.


A small postscript

A pedagogical illustration of the complete decoupling between configurational order and entropy comes from the Sackur-Tetrode formula for the classical ideal gas entropy. It shows that the entropy depends explicitly on the atoms' mass (it grows with $\ln m$), while the accessible configuration space and the probability of each spatial configuration are the same.
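
For a monatomic ideal gas the Sackur-Tetrode formula reads $$ \frac{S}{N k_B} = \ln\!\left[\frac{V}{N}\left(\frac{4\pi m U}{3 N h^2}\right)^{3/2}\right] + \frac{5}{2}, $$ which makes the dependence on the atomic mass $m$ visible at fixed $V$, $N$, and $U$, even though the accessible spatial configurations are exactly the same.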