[Physics] Entropy of two expanding and mixing ideal gases

entropy, homework-and-exercises, thermodynamics

I am currently stuck with the following problem:

 _________________________         _________________________
|                 |       |    \  |                         | 
|        V1       |   V2  | ====> |        V1+V2            |
|                 |       |    /  |                         |      
|_________________|_______|       |_________________________| 

In a box there are two different gases in two compartments separated by a solid wall. The temperature $T$ of both gases is the same and each gas consists of $1\,\mathrm{mol}$ of particles. Therefore, on the left side of the box you have a volume $V_1$ at a pressure $p_1$, and on the right-hand side we have volume $V_2$ and pressure $p_2$. Now we remove the separating wall.
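To make the given quantities explicit: since each side holds $n = 1\,\mathrm{mol}$ at the same temperature $T$, the ideal gas law ties them together as

$p_1 V_1 = p_2 V_2 = nRT$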

Let's first assume that we have two different gases on the left and on the right side. My goal is to calculate the change of entropy which occurs due to the removal of the wall. My first Ansatz was:

$\Delta S_1 = nR\ln(\frac{V_1+V_2}{V_1})$

$\Delta S_2 = nR\ln(\frac{V_1+V_2}{V_2})$

where $\Delta S$ is the change of entropy of the system, $n$ the number of moles and $R$ the universal gas constant. The total change of entropy would now be the sum of both:

$\Delta S_{tot} = \Delta S_1 + \Delta S_2 = nR\ln(\frac{(V_1+V_2)^2}{V_1V_2})$
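Here I am using the standard isothermal ideal-gas result: at constant $T$ the internal energy does not change, so $T\,dS = p\,dV$, and with $p = nRT/V$ this integrates to

$\Delta S = \int_{V_i}^{V_f}\frac{nR}{V}\,dV = nR\ln(\frac{V_f}{V_i})$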

After a while I thought about the problem in a different way: first I move the wall isothermally so that both compartments have the same volume, $\frac{V_1+V_2}{2}$ each. The entropy change due to the movement would be:

$\Delta S_{mov1} = nR\ln(\frac{V_1+V_2}{2V_1})$

$\Delta S_{mov2} = nR\ln(\frac{V_1+V_2}{2V_2})$

$\Delta S_{mov} = \Delta S_{mov1} + \Delta S_{mov2} = nR\ln(\frac{(V_1+V_2)^2}{4V_1V_2})$

Now I remove the wall and get an entropy change due to the mixing of the gases (with $n_1 = n_2 = n = 1\,\mathrm{mol}$):

$\Delta S_{rem} = R\left(n_1 \ln(\frac{n_1+n_2}{n_1}) + n_2 \ln(\frac{n_1+n_2}{n_2})\right) = 2nR\ln(2)$

so

$\Delta S_{tot} = \Delta S_{rem} + \Delta S_{mov} = nR\ln(\frac{(V_1+V_2)^2}{4V_1V_2}) + 2nR\ln(2)$

which is clearly different from the result I got from my first Ansatz. However, since entropy is a state variable, this should not happen. Now I am stuck. I tend toward the second result, but I am far from sure. It would be great if you could explain which Ansatz is correct (if any ^^).

Moreover, if I had identical gases in $V_1$ and $V_2$, how would the entropy behave? Since the pressure changes after the removal of the wall, I would say that the entropy also changes, just like in my second Ansatz, but without the mixing term. Is this correct?


EDIT: Oh, dear. I just realized that $2nR\ln(2)$ can be written as $nR\ln(4)$. Therefore

$nR\ln(\frac{(V_1+V_2)^2}{4V_1V_2}) + 2nR\ln(2) = nR\ln(\frac{(V_1+V_2)^2}{V_1V_2})$

so both Ansätze give the same result. Since this is cleared up, my second question remains: in the case of two identical gases, does the entropy change according to my second Ansatz (without the mixing entropy)?

Thanks in advance

ftiaronsem

Best Answer

What you've stumbled upon is called the "Gibbs paradox", and the resolution is to divide the phase-space volume in statistical-mechanics entropy calculations by the $N!$ permutations of identical particles, which reduces the number of distinct configurations.

Since the temperature is unchanged in the process, the momentum distribution of the atoms is unimportant: it is the same before and after, and the entropy change is entirely spatial, as you realized. The volume of configuration space for the left part is:

${V_1^N \over N!}$

and for the right part is:

${V_2^N\over N!}$

and the total volume of the $2N$-particle configuration space is:

$(V_1V_2)^N\over (N!)^2$

When you lift the barrier, you get the spatial volume of configuration space

$(V_1 + V_2)^{2N} \over (2N)!$
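In units where $k_B = 1$, and dropping the (unchanged) momentum contribution, the corresponding configurational entropies before and after are therefore

$S_{before} = \log({V_1^N V_2^N \over (N!)^2}), \qquad S_{after} = \log({(V_1 + V_2)^{2N} \over (2N)!})$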

When $V_1$ and $V_2$ are equal, you would naively expect zero entropy gain. But you do gain a tiny bit of entropy by removing the wall: before the wall was removed, the numbers of particles on the left and on the right were exactly equal, whereas afterwards they can fluctuate a little. This is a negligible amount of extra entropy in the thermodynamic limit, as you can see:

${(2V)^{2N}\over (2N)!} = {2^{2N}(N!)^2\over (2N)!}{V^{2N}\over (N!)^2}$

So the extra entropy from lifting the barrier is equal to:

$-\log \left({(2N)!\over 2^{2N}(N!)^2}\right)$

You might recognize the thing inside the log: it is the probability that a symmetric $\pm 1$ random walk returns to the origin after $2N$ steps, i.e. the biggest term of Pascal's triangle at stage $2N$ divided by the sum of all the terms at that stage. From the Brownian-motion identity (or, equivalently, directly from Stirling's formula) you can estimate its size as ${1\over \sqrt{\pi N}}$, so the extra entropy grows only like $\log(N)$: it is sub-extensive and negligible compared to the extensive part in the thermodynamic limit.
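Spelled out, with Stirling's formula $N! \approx \sqrt{2\pi N}\,(N/e)^N$:

${(2N)!\over 2^{2N}(N!)^2} \approx {\sqrt{4\pi N}\,(2N/e)^{2N} \over 2^{2N}\cdot 2\pi N\,(N/e)^{2N}} = {1\over\sqrt{\pi N}}$

so the extra entropy is about ${1\over 2}\log(\pi N)$.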

The entropy change in the general case is then exactly given by the logarithm of the ratio of the configuration-space volumes after and before removing the wall:

$e^{\Delta S} = { (V_1 + V_2)^{2N} \over (2N)! }\,{ (N!)^2 \over V_1^N V_2^N } = { ({V_1 + V_2 \over 2})^{2N} \over V_1^N V_2^N }\,{2^{2N}(N!)^2\over (2N)!}$

Ignoring the thermodynamically negligible last factor, the macroscopic change in entropy, the part proportional to $N$, is:

$\Delta S = N\log({(V_1 + V_2)^2 \over 4 V_1 V_2})$

which is exactly what you calculated ($N$ here playing the role of $nR$, since $k_B$ has been set to 1).
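As a concrete sanity check with made-up numbers: taking $V_2 = 2V_1$ gives

$\Delta S = N\log({9 V_1^2 \over 8 V_1^2}) = N\log({9\over 8}) \approx 0.12\,N$

a positive, extensive entropy increase, as expected for this irreversible expansion.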

Additional comments

You might think that it is weird to gain a little bit of entropy just from the fact that, before you lift the wall, you knew the particle numbers on each side were exactly $N$, even if that entropy is subextensive. Wouldn't that mean that when you lower the wall again, you reduce the entropy by a tiny subextensive amount, by preventing mixing of the right and left halves? Even if the entropy decrease is tiny, it would still violate the second law.

There is no entropy decrease, because when you lower the barrier, you don't know how many molecules are on the left and how many are on the right. If you add this entropy of ignorance to the entropy of the lowered-wall system, it exactly cancels the subextensive entropy loss. If you try to find out how many molecules are on the right vs. how many are on the left, you produce more entropy in the process of learning the answer than you gain from the knowledge.