Statistical-Mechanics – Ambiguity in the Definition of Entropy

Tags: definition, entropy, statistical-mechanics

The entropy $S$ of a system is defined as $$S = k\ln \Omega.$$ What precisely is $\Omega$? It refers to "the number of microstates" of the system, but is this the number of all accessible microstates, or just the number of microstates corresponding to the system's current macrostate? Or is it something else that eludes me?

Best Answer

Entropy is a property of a macrostate, not a system. So $\Omega$ is the number of microstates that correspond to the macrostate in question.

Putting quantization aside, it might appear that a classical system has infinitely many microstates, and thus infinite entropy. But at any fixed level of resolution the count is finite, and refining the resolution simply multiplies the number of microstates by a constant factor. Since it is almost always the change in entropy, not its absolute value, that matters, and since we take the log of $\Omega$, a multiplicative ambiguity in $\Omega$ becomes an additive constant in $S$, which cancels when we take $\mathrm{d}S$. So with a little hand-waving (a.k.a. "normalization"), we can ignore the apparent infinity of the entropy.
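The cancellation argument can be sketched numerically. A minimal illustration in Python (the microstate counts and the refinement factor below are made up purely for demonstration):

```python
import math

# Boltzmann constant in J/K
k = 1.380649e-23

def entropy(omega):
    """Boltzmann entropy S = k ln(Omega) for a macrostate with Omega microstates."""
    return k * math.log(omega)

# Hypothetical microstate counts for two macrostates at some chosen resolution
omega_a = 1e20
omega_b = 1e25

dS = entropy(omega_b) - entropy(omega_a)

# Refining the resolution multiplies every Omega by the same constant c,
# shifting each entropy by k ln(c) ...
c = 1e6
dS_refined = entropy(omega_b * c) - entropy(omega_a * c)

# ... so the *change* in entropy between the macrostates is unaffected:
print(math.isclose(dS, dS_refined))
```

Since $k\ln(c\,\Omega) = k\ln\Omega + k\ln c$, the $k\ln c$ term is the same for both macrostates and drops out of the difference.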
