Thermodynamics – Understanding Heat Capacity in Statistical Mechanics

Tags: fluctuation-dissipation, statistical-mechanics, thermodynamics

A known result from statistical mechanics is the following fluctuation-dissipation relation:

\begin{equation}
\frac{\partial^2 S}{\partial E^2} = -\frac{1}{C_v T^2} \tag{1}
\end{equation}

where $C_v$ is the heat capacity of the system.

This equation looks a bit odd to me, since $S$ is additive and $C_v$ is also supposed to be additive in some generalized sense (for instance, through Kopp's law).
This behaviour is made explicit, for instance, in exercise 3.8 of "Entropy, Order Parameters and Complexity" by J. P. Sethna, where the final result to be shown is that, for two subsystems (1) and (2),

\begin{equation}
\frac{1}{C_v^{(1)}}+\frac{1}{C_v^{(2)}}=-T^2\left(\frac{\partial^2S_1}{\partial E_1^2}+\frac{\partial^2S_2}{\partial E_2^2}\right) \tag{2}
\end{equation}

Here the additivity of the entropy is clear, but the rule for summing the heat capacities of the two subsystems seems to conflict with Kopp's law. Also, supposing the two subsystems are non-interacting (or only very weakly interacting), so that $E_{tot} = E_1 + E_2$, and recalling the definition:

\begin{equation}
C_v^{(i)} = \left(\frac{\partial E_i}{\partial T}\right)_V \tag{3}
\end{equation}

one would really expect that:

\begin{equation}
C_v^{(tot)}=C_v^{(1)}+C_v^{(2)} \tag{4}
\end{equation}

while equation (2), combined with the additivity of the entropy, seems to suggest that:

\begin{equation}
\frac{1}{C_v^{(tot)}}=\frac{1}{C_v^{(1)}}+\frac{1}{C_v^{(2)}} \tag{5}
\end{equation}
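To make the tension concrete, here is a quick numerical sketch (my own toy model, not from any reference: two weakly coupled monatomic ideal gases with $k_B = 1$, so $E_i = \frac{3}{2} N_i T$, with arbitrarily chosen particle numbers). It shows that eq. (4) follows directly from definition (3), while eq. (5) would give a very different number:

```python
# Toy check: two monatomic ideal gases with k_B = 1, so E_i(T) = (3/2) N_i T
# and C_v^{(i)} = (3/2) N_i.  Particle numbers are arbitrary choices.
N1, N2 = 10.0, 30.0
C1, C2 = 1.5 * N1, 1.5 * N2

def E_tot(T):
    # Weak coupling: the total energy is just the sum of the two energies.
    return 1.5 * N1 * T + 1.5 * N2 * T

# Heat capacity of the whole from definition (3), C_v = dE/dT (central difference).
T, dT = 2.0, 1e-6
C_tot = (E_tot(T + dT) - E_tot(T - dT)) / (2 * dT)

print(C_tot, C1 + C2)           # eq. (4): these agree (about 60)
print(1 / (1 / C1 + 1 / C2))    # eq. (5) would instead give about 11.25
```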

I'm really confused by this, so I'm looking for a full explanation of this puzzle, i.e. why eq. (4) is wrong (if it is) and what is going on in eq. (2).

Best Answer

As already stated in the comments, your equations (2) and (4) both hold. But the conclusion (5) is wrong, because it assumes that $\partial_E$ is interchangeable with $\partial_{E_1}$ and $\partial_{E_2}$.

The entropy of the total system is, of course, the sum of the entropies of the individual systems. However, specifying the energy of the total system does not fix $E_1$ and $E_2$ a priori (a priori, all energies $E_1$ and $E_2$ such that $E_1 + E_2 = E$ are permissible).

Now, for the fluctuation-dissipation theorem to hold for the combined system, that system must be in equilibrium. This is only the case if the two systems had the same temperature before being brought into weak energetic contact (note also that otherwise it does not make sense to speak of the heat capacity of the resulting system, as it would not have a well-defined temperature).

This constraint means that the following relation must hold: $\frac{\partial S_1}{\partial E_1} = \frac{\partial S_2}{\partial E_2}$ (which just equates the inverse temperatures of the individual systems).

Now the total entropy can be written as $$ S(E) = S_1\big(E_1(E)\big) + S_2\big(E_2(E)\big) = S_1\big(E_1(E)\big) + S_2\big(E-E_1(E)\big), $$ where $E_1$ and $E_2$ are functions of $E$ (determined by the equilibrium condition together with the constraint $E = E_1 + E_2$).

That means that $$ \partial_E S(E) = (\partial_E E_1) \partial_{E_1} S_1 + (\partial_E E_2) \partial_{E_2} S_2. $$

This first derivative has a lucid structure. Due to the constraint $E = E_1 + E_2$ we know that $$ \partial_E E_1 + \partial_E E_2 = 1, $$ and since in equilibrium $\partial_{E_1} S_1 = \partial_{E_2} S_2 = 1/T$, it follows that $\partial_E S = 1/T$: for two systems in equilibrium at temperature $T$, the temperature of the total system is also $T$.
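This can be illustrated with a small sketch (my own construction, assuming two monatomic ideal gases with $S_i(E_i) = \frac{3N_i}{2}\ln E_i$ and $k_B = 1$): it solves the equilibrium condition numerically by bisection and checks that the first derivative of the total entropy is indeed $1/T$.

```python
# Sketch: two monatomic ideal gases, S_i(E_i) = (3 N_i / 2) ln(E_i), k_B = 1.
# Particle numbers are my own arbitrary choices.
import math

N1, N2 = 10.0, 30.0

def T1(E1): return E1 / (1.5 * N1)   # from 1/T_i = dS_i/dE_i = (3 N_i / 2) / E_i
def T2(E2): return E2 / (1.5 * N2)

def E1_of_E(E):
    """Equilibrium split: bisect T1(E1) = T2(E - E1); T1 grows, T2(E - E1) falls."""
    lo, hi = 1e-9, E - 1e-9
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if T1(mid) < T2(E - mid):
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def S(E):
    E1 = E1_of_E(E)
    return 1.5 * N1 * math.log(E1) + 1.5 * N2 * math.log(E - E1)

E, dE = 100.0, 1e-4
dS_dE = (S(E + dE) - S(E - dE)) / (2 * dE)   # numerical dS/dE
T = T1(E1_of_E(E))                           # common temperature of the parts

print(dS_dE, 1 / T)   # the two numbers agree: dS/dE = 1/T
```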

The second derivative then gives $$ - \frac{1}{C_V T^2} = \partial_E^2 S(E) = (\partial_E^2 E_1)\, \partial_{E_1} S_1 + (\partial_E E_1)^2\, \partial_{E_1}^2 S_1 + (\partial_E^2 E_2)\, \partial_{E_2} S_2 + (\partial_E E_2)^2\, \partial_{E_2}^2 S_2$$

This is clearly not the same as $\partial_{E_1}^2 S_1 + \partial_{E_2}^2 S_2$. The additivity result does follow from it, though: differentiating $\partial_E E_1 + \partial_E E_2 = 1$ gives $\partial_E^2 E_1 + \partial_E^2 E_2 = 0$, so the two terms involving $\partial_{E_i} S_i = 1/T$ cancel; differentiating the equilibrium condition gives $\partial_E E_i = C_i/(C_1 + C_2)$, and inserting $\partial_{E_i}^2 S_i = -1/(C_i T^2)$ yields $\partial_E^2 S = -1/\big((C_1 + C_2) T^2\big)$, i.e. $C_V = C_1 + C_2$.
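The second-derivative relation can also be checked numerically. In a toy model of my own (two monatomic ideal gases, $k_B = 1$), the equilibrium split is known in closed form, $E_1 = E\,N_1/(N_1+N_2)$, and a finite difference confirms eq. (1) for the total system with $C_V = C_1 + C_2$, while the naive sum of subsystem second derivatives reproduces eq. (2) instead:

```python
# Numerical check with two monatomic ideal gases (k_B = 1), where the
# equilibrium split is known in closed form: E_1 = E * N1 / (N1 + N2).
import math

N1, N2 = 10.0, 30.0
C1, C2 = 1.5 * N1, 1.5 * N2          # C_i = (3/2) N_i for this model

def S(E):
    E1 = E * N1 / (N1 + N2)          # equal-temperature split
    E2 = E - E1
    return 1.5 * N1 * math.log(E1) + 1.5 * N2 * math.log(E2)

E, h = 100.0, 0.1
d2S = (S(E + h) - 2 * S(E) + S(E - h)) / h**2   # numerical second derivative
T = E / (1.5 * (N1 + N2))                       # common temperature at energy E

print(d2S, -1 / ((C1 + C2) * T**2))  # these agree: the total C_V is C1 + C2

# The naive sum of subsystem second derivatives is a different quantity;
# it reproduces eq. (2) instead:
E1, E2 = E * N1 / (N1 + N2), E * N2 / (N1 + N2)
naive = -1.5 * N1 / E1**2 - 1.5 * N2 / E2**2
print(-T**2 * naive, 1 / C1 + 1 / C2)
```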


Another view you can take is that your systems are prepared in a microcanonical ensemble. Combining them then means that the energy $E$ of the combined system can be distributed in an arbitrary fashion between the parts, subject to $E_1 + E_2 = E$.

The entropy of each single system is the logarithm of the degeneracy $\nu(E_i)$ of the energy level at $E_i$. Here you see even more clearly why $\partial_E S(E)$ is not easily expressed in terms of the subsystem entropies.
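A toy microcanonical example (my own, using two Einstein solids, where $\nu_i(q_i) = \binom{q_i + N_i - 1}{q_i}$ counts microstates with $q_i$ quanta) makes the convolution structure explicit: the total degeneracy is a sum over all splits, and $S = \ln\Omega$ is close to, but not exactly, $S_1 + S_2$ at the most probable split.

```python
# Toy microcanonical example: two Einstein solids with N_i oscillators;
# the number of microstates with q_i quanta is binom(q_i + N_i - 1, q_i).
from math import comb, log

N1, N2, q = 10, 30, 100

def nu1(q1): return comb(q1 + N1 - 1, q1)
def nu2(q2): return comb(q2 + N2 - 1, q2)

# Total degeneracy: a convolution over all ways to split q between the parts.
Omega = sum(nu1(q1) * nu2(q - q1) for q1 in range(q + 1))

# Vandermonde's identity: the convolution equals binom(q + N1 + N2 - 1, q).
assert Omega == comb(q + N1 + N2 - 1, q)

# The most probable split dominates the sum, but S = ln(Omega) is not
# exactly S_1 + S_2 evaluated there (it exceeds it by roughly the log of
# the peak width).
best = max(range(q + 1), key=lambda q1: nu1(q1) * nu2(q - q1))
print(log(Omega), log(nu1(best) * nu2(q - best)))
```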
