How large does $N$ need to be for statistical mechanics to be a good approximation?

Tags: approximations, error analysis, statistical mechanics, statistics

About how many components ($N$) does a system need before statistical mechanics applies to it?

I took stat mech and biophysics from the same professor in undergrad, and I distinctly remember him saying that part of the reason biophysics was so intractable was that its systems were large, but not so large that the thermodynamic limit made sense. I think he said biophysical systems often had $N \sim 10^2-10^4$ components, while stat mech really only made sense for systems with $N>10^6$(?), but I really don't remember the exact values for $N$.

Best Answer

I think it's unfair to ask for a single value of $N$ that justifies all of statistical mechanics. There are many different problems and applications of stat mech; some have intrinsically low variance and work fine for relatively small $N$, whereas others really require $N \rightarrow \infty$.

With that said, it's easy to see why statistical mechanics works so much better for (say) a gas of particles than for many biological systems. As you mention in your post, biological systems often deal with $N \sim 100$ or $1{,}000$ degrees of freedom, which is firmly in the "mesoscopic" regime. On the other hand, a reasonably sized box of air contains something on the order of a mole of particles, i.e. $6\times 10^{23}$, which is roughly twenty orders of magnitude more than the biological system. So you can see why you typically don't need to scratch your head over whether $10^6$ or $10^7$ particles is enough to justify stat mech: the number of particles is usually so unimaginably huge that you can often (but not always!) approximate the system by taking the thermodynamic limit $N \rightarrow \infty$.
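To make the gas-side number concrete, here is a quick back-of-the-envelope estimate using the ideal gas law $PV = N k_B T$. The specific box size (~25 litres) and the "room conditions" of 1 atm and 300 K are illustrative assumptions on my part, not numbers from the question:

```python
# Rough estimate of how many particles sit in an everyday box of air,
# assuming an ideal gas at roughly room conditions (~1 atm, ~300 K).
k_B = 1.380649e-23   # Boltzmann constant, J/K
P = 101_325.0        # pressure, Pa (~1 atm)
T = 300.0            # temperature, K
V = 0.025            # volume, m^3 (a ~25-litre box; arbitrary illustrative choice)

N = P * V / (k_B * T)            # ideal gas law: PV = N * k_B * T
print(f"N ~ {N:.1e} particles")  # ~6e23, i.e. about a mole
```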

Another nice semi-quantitative heuristic: very often one approximates particles as non-interacting in statistical mechanics. In that case, many observable quantities are sums of independent and identically distributed random variables, so by the central limit theorem the sum is well approximated as Gaussian distributed. You then expect the relative fluctuations of such an observable (its standard deviation divided by its mean) to fall as $1/\sqrt{N}$ as $N$ becomes large. From this, you can get a rough idea of how good statistical mechanics will be as $N$ increases: for instance, $1/\sqrt{100} = 0.1$ for a typical biological system, while $1/\sqrt{10^{23}} \approx 3 \times 10^{-12}$ for a box of gas particles.
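If you want to see that scaling directly, here is a minimal numerical sketch: it draws many independent copies of a sum of $N$ i.i.d. variables and compares the measured relative fluctuation to $1/\sqrt{N}$. The exponential distribution (as a stand-in for per-particle energies) and the particular sample sizes are arbitrary choices for illustration, not anything specific to the argument above:

```python
import numpy as np

rng = np.random.default_rng(0)
trials = 500  # independent "copies" of the system sampled for each N

for N in (100, 10_000, 1_000_000):
    # Draw `trials` sums of N i.i.d. exponential variables (mean 1), one sum per
    # copy, without holding all N * trials samples in memory at once.
    totals = np.array([rng.exponential(scale=1.0, size=N).sum()
                       for _ in range(trials)])
    rel_fluct = totals.std() / totals.mean()  # relative fluctuation of the observable
    print(f"N = {N:>9,d}:  measured {rel_fluct:.1e}   "
          f"predicted 1/sqrt(N) = {1 / np.sqrt(N):.1e}")
```

The measured relative fluctuations track $1/\sqrt{N}$ closely, which is the entire content of the heuristic: by the time $N$ reaches macroscopic values, the fluctuations are far below anything you could measure.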
