[Physics] Why does $S = k_B \ln W$ not always apply?

entropy, statistical mechanics, temperature

I thought for a long time that the Boltzmann formula for entropy, $S = k_B \ln W$, was a universally true statement, or rather the definition of entropy from the perspective of statistical mechanics. However, I have since come to understand that it is only applicable for an isolated system (i.e. the microcanonical ensemble) for which all microstates of the system are equally likely. The more general statement is the Gibbs entropy

$$S = -k_B \sum_i P_i \ln P_i \,.$$
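
Indeed, for an isolated system in which all $W$ accessible microstates are equally likely, $P_i = 1/W$ for every $i$, and the Gibbs formula reduces to the Boltzmann one:

$$S = -k_B \sum_{i=1}^{W} \frac{1}{W}\ln\frac{1}{W} = k_B \ln W \,.$$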

However, I have seen a derivation of the Boltzmann formula that leaves me unable to see why it shouldn't apply in general. I was hoping somebody could point out the error in the following reasoning.

From classical thermodynamics we know that

$$T = \left(\frac{\partial U}{\partial S}\right)_V \,.$$

Now let us consider two systems in thermal contact, allowed to exchange energy while the total energy $E_1 + E_2$ is held fixed. If we suppose that all microstates of the joint system are equally likely, we argue that equilibrium will be attained for the division of energy that maximises the number of corresponding microstates of the joint system. So we have

$$ \frac{d}{dE_1}\big(W_1(E_1)W_2(E_2)\big) = 0 \,. $$

Working through this (the intermediate steps are spelled out below), we get the condition that

$$ \frac{d \ln W_1(E_1)}{d E_1} = \frac{d \ln W_2(E_2)}{d E_2} \,,$$

but since equilibrium corresponds to equal temperatures, we make the natural definition that

$$\frac{1}{k_B T} = \frac{d \, \ln W(E)}{d E}\,,$$
where we choose this particular form of the expression on the left since it results in energy flowing in the right direction (hot to cold) for two systems with very similar temperatures.
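
Explicitly, since the total energy is fixed we have $dE_2/dE_1 = -1$, so the maximisation condition expands as

$$\frac{d}{dE_1}\big(W_1 W_2\big) = W_2\frac{dW_1}{dE_1} + W_1\frac{dW_2}{dE_1} = W_2\frac{dW_1}{dE_1} - W_1\frac{dW_2}{dE_2} = 0 \,,$$

and dividing through by $W_1 W_2$ gives the equal-derivative condition quoted above.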

Putting this together with the thermodynamic relation $T = (\partial U/\partial S)_V$ above, and identifying $U = E$, we must have that

$$ S = k_B \ln W \,.$$
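
(More explicitly, the identification gives $\left(\frac{\partial S}{\partial E}\right)_V = k_B\,\frac{d\ln W}{dE}$, which integrates to $S = k_B \ln W$ up to an additive constant, conventionally set to zero.)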

Question: where in this argument have I made any assumptions or mistakes such that the formula applies only to a specific class of systems? Why can I not use it to determine the entropy of, say, one of the two systems placed in thermal contact in the discussion of temperature above? Why is the Gibbs formula the correct one for systems allowed to exchange energy? We also use this definition of temperature, I believe, in the derivation of the canonical (Boltzmann) distribution (see e.g. here), and yet in that case the reservoir is not an isolated system, so I would have thought that this expression would not apply; I sketch the argument I mean below. Thank you.
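
For reference, the reservoir argument I have in mind goes roughly as follows (my own sketch, so it may be imprecise): for the system in a microstate $s$ of energy $E_s$, in contact with a large reservoir such that the combined system is isolated with fixed total energy $E_{\text{tot}}$,

$$P_s \propto W_{\text{res}}(E_{\text{tot}} - E_s) \approx \exp\!\left(\ln W_{\text{res}}(E_{\text{tot}}) - E_s\,\frac{d\ln W_{\text{res}}}{dE}\right) \propto e^{-E_s/(k_B T)}\,,$$

where the derivative is exactly the quantity defined above, but evaluated for the reservoir rather than for an isolated system.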

Best Answer

The problem with the Boltzmann definition is, as you have neatly shown, that its usefulness depends on the assumption that your system is in equilibrium with its surroundings. Without first assuming equilibrium and then setting the temperatures equal, one cannot show that the Boltzmann entropy satisfies the First Law and hence cannot meaningfully define it as the entropy. However, the Gibbs entropy still provides a meaningful definition because, for example, it can be related to the partition function via

$$S \equiv -k_B\sum_s P_s\ln P_s = k_B \left(\ln Z - \beta\frac{\partial \ln Z}{\partial \beta}\right) = \frac{\partial}{\partial T}\big(k_B T \ln Z\big)$$

and hence to use it to calculate other variables like the Helmholtz free energy.
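
That identity follows in two lines from the canonical probabilities $P_s = e^{-\beta E_s}/Z$: since $\ln P_s = -\beta E_s - \ln Z$,

$$-k_B\sum_s P_s \ln P_s = k_B\beta\langle E\rangle + k_B\ln Z = k_B\left(\ln Z - \beta\frac{\partial \ln Z}{\partial \beta}\right),$$

using $\langle E\rangle = -\partial \ln Z/\partial \beta$.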

Furthermore, you are assuming that $W$ is large enough to be approximated as a continuous quantity. $W$ is, however, an integer, and the quantity $\mathrm d\ln W$ is not well defined for a system with a small number of microstates, so the justification you have provided would break down there. This problem never arises if you start from the definition of the Gibbs entropy and work from there.
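
As a concrete illustration of working directly from the Gibbs definition, here is a minimal numerical sketch for a small discrete system; the three-level spectrum, the temperature, and the variable names are arbitrary choices made purely for demonstration. It checks that $-k_B\sum_s P_s\ln P_s$ computed from the canonical probabilities agrees with $k_B(\ln Z + \beta\langle E\rangle)$, which is the partition-function expression above rewritten using $\langle E\rangle = -\partial\ln Z/\partial\beta$.

```python
import numpy as np

k_B = 1.380649e-23                             # Boltzmann constant, J/K
energies = np.array([0.0, 1.0e-21, 2.5e-21])   # hypothetical three-level spectrum, J
T = 300.0                                      # illustrative temperature, K
beta = 1.0 / (k_B * T)

boltzmann_factors = np.exp(-beta * energies)
Z = boltzmann_factors.sum()                    # partition function
P = boltzmann_factors / Z                      # canonical probabilities P_s

S_gibbs = -k_B * np.sum(P * np.log(P))         # Gibbs entropy, computed directly
mean_E = np.sum(P * energies)                  # <E> = -d(ln Z)/d(beta)
S_from_Z = k_B * (np.log(Z) + beta * mean_E)   # k_B (ln Z + beta <E>)

print(S_gibbs, S_from_Z)                       # the two agree to floating-point precision
```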
