Thermodynamics – How to Understand the Equivalence Between Boltzmann Entropy and Shannon Entropy

entropy, holographic-principle, information, thermodynamics

I have already read other posts on this subject, but none of them seems to answer my question completely.

[This question is to be read in relation with this one: Is there an equivalence between information, energy and matter?]

Indeed, according to Bekenstein:

the thermodynamic entropy and Shannon entropy are conceptually equivalent.

the number of arrangements that are counted by Boltzmann entropy reflects the amount of Shannon information that would be needed to implement any particular arrangement of matter and energy.

However, views differ:

For some, thermodynamic entropy can be seen as a specific instance of Shannon entropy. In short, thermodynamic entropy is a Shannon entropy, but not necessarily vice versa.

For others, Shannon entropy is a mathematical quantity defined on "abstract systems" and has nothing to do with thermodynamic entropy.

So, is there a consensus answer to the question:

Is there an equivalence between Boltzmann entropy and Shannon entropy?

Best Answer

Boltzmann's entropy formula can be derived from the Shannon entropy formula when all states are equally probable.

Say you have $W$ equiprobable microstates, each with probability $p_i = 1/W$. Then:

$S = -k\sum_i p_i \ln p_i = -k\sum_i \frac{1}{W}\ln\frac{1}{W} = k\,\frac{\ln W}{W}\sum_i 1 = k\ln W$

Another way to obtain this result is to maximise $S$ subject to the constraint $\sum_i p_i = 1$, using a Lagrange multiplier $\lambda$. One maximises the Lagrangian

$\mathcal{L} = -k\sum_i p_i \ln p_i - \lambda\left(\sum_i p_i - 1\right)$

over the $p_i$.
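Setting $\partial\mathcal{L}/\partial p_i = 0$ makes the intermediate step explicit (a quick sketch of the standard calculation):

$\frac{\partial\mathcal{L}}{\partial p_i} = -k\left(\ln p_i + 1\right) - \lambda = 0 \;\Longrightarrow\; p_i = e^{-1-\lambda/k},$

i.e. every $p_i$ takes the same value; normalisation $\sum_i p_i = 1$ then gives $p_i = 1/W$ and hence $S = k\ln W$.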

Adding more constraints will result in a lower entropy distribution (such as the canonical entropy when adding the energy constraint and the grandcanonical when adding energy and particle constraints).
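For instance, adding the mean-energy constraint $\sum_i p_i E_i = \langle E\rangle$ to the maximisation above (a sketch of the standard canonical case, with $\beta$ the multiplier conjugate to energy) yields

$p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_j e^{-\beta E_j},$

the canonical (Gibbs) distribution, which is non-uniform in general and therefore has entropy at most $k\ln W$.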

As a side note, it can also be shown that the Boltzmann entropy is an upper bound on the entropy a system can have for a fixed number of microstates $W$:

$S\leq k \ln W$

This can also be interpreted as saying that the uniform distribution is the one with the highest entropy (or the least information, if you prefer); someone was kind enough to prove this for me here: https://math.stackexchange.com/questions/2748388/proving-that-shannon-entropy-is-maximal-for-the-uniform-distribution-using-conve
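A quick numerical check of this bound (a minimal sketch in Python; the non-uniform distributions are drawn at random purely for illustration):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy -sum(p ln p), in units of k (natural log)."""
    p = p[p > 0]              # drop zero-probability states (0 ln 0 -> 0)
    return -np.sum(p * np.log(p))

W = 10                        # number of microstates
uniform = np.full(W, 1.0 / W) # equiprobable microstates
print(shannon_entropy(uniform), np.log(W))  # equal: S = ln W

rng = np.random.default_rng(0)
for _ in range(5):
    p = rng.random(W)
    p /= p.sum()              # random normalised distribution
    assert shannon_entropy(p) <= np.log(W) + 1e-12  # S <= ln W
```

Equality holds only for the uniform distribution; every other distribution over the same $W$ states has strictly lower entropy.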
