[Physics] Calculating the ideal mixing entropy using Gibbs’ entropy formula

entropy, statistical mechanics, thermodynamics

Two distinguishable gases occupy separate volumes $xV$ and $(1-x)V$ (with $x\in [0,1]$), and the number of particles on each side is $xN$ and $(1-x)N$ respectively. The volumes are initially separated by a tap. If we open the tap, the gases mix, and the ideal mixing entropy is given by
$\Delta S=-Nk_B(x\ln x + (1-x)\ln(1-x))$.

I'm trying to get this result using Gibbs' entropy formula $S=-k_B\sum_i P_i\ln P_i$.
My first approach is to consider the case of a single particle, which has only 2 states, with probabilities $x$ and $1-x$, so $S=-k_B(x\ln x + (1-x)\ln(1-x))$. Then, using that entropy is extensive, we get the answer. Is this correct?
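This one-particle argument is easy to check numerically. A minimal Python sketch, assuming some made-up values for $x$ and $N$ and taking $k_B=1$:

```python
import math

# Illustrative values (not from the question).
x = 0.3   # fraction of the total volume on the left
N = 1000  # total number of particles

# Gibbs entropy for a single particle with two states (left/right),
# occupied with probabilities x and 1 - x.
s_one = -(x * math.log(x) + (1 - x) * math.log(1 - x))

# Extensivity: N independent particles carry N times this entropy,
# which matches Delta S = -N k_B (x ln x + (1-x) ln(1-x)).
delta_S = N * s_one
print(delta_S)
```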

Now I'm trying to derive the entropy using the binomial distribution: the probability that $n$ particles are in the volume $xV$ is $P(n)=\binom{N}{n}x^n(1-x)^{N-n}$. Inserting this in Gibbs' formula yields:

$S=-k_B\sum_nP(n)\ln\left(\binom{N}{n}x^n(1-x)^{N-n}\right)$, then expanding the log gives
$S=-k_B\sum_nP(n)\left(\ln\left(\binom{N}{n}\right)+n\ln x +(N-n)\ln(1-x)\right)$, then using that $\langle n\rangle=xN$ we get
$S=-Nk_B(x\ln x + (1-x)\ln(1-x))-k_B\sum_nP(n)\ln\left(\binom{N}{n}\right)$.

How do I get rid of this last term? Or is the problem that I calculated $S$ rather than $\Delta S$?
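The discrepancy can be probed numerically. A small sketch with illustrative values (taking $k_B=1$): the leftover term is strictly positive, so the entropy of the distribution over $n$ undershoots the mixing entropy.

```python
import math

# Illustrative values (not from the question).
N = 50
x = 0.3

# P(n): probability that n of the N particles sit in the left volume.
def P(n):
    return math.comb(N, n) * x**n * (1 - x)**(N - n)

# Gibbs entropy of the coarse-grained distribution over n.
S_coarse = -sum(P(n) * math.log(P(n)) for n in range(N + 1))

# The target mixing entropy.
S_mix = -N * (x * math.log(x) + (1 - x) * math.log(1 - x))

# The leftover term sum_n P(n) ln C(N, n) from the expansion above.
leftover = sum(P(n) * math.log(math.comb(N, n)) for n in range(N + 1))

# The identity from the question: S_coarse = S_mix - leftover.
print(S_coarse, S_mix - leftover, leftover)
```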

Best Answer

The problem you are having is that you are not applying the formula $S = -\sum_i p_i \log p_i$ correctly. (Here I will take $k_B=1$.) In the formula, $i$ is supposed to index the set of all microscopic states. I will first say what you did wrong, then I will say how to do the problem correctly.

To summarize the problem: you have a volume of gas with left and right parts separated by a partition, so that the left part occupies a fraction $x$ of the volume. Then $V_L = xV$, where $V$ is the total volume and $V_L$ is the volume of the left part. The densities are the same, so $N_L = xN$, where $N_L$ is the number of particles in the left part and $N$ is the total number of particles. The volume and number of particles on the right side are $V_R = (1-x)V$ and $N_R = (1-x)N$.

You said the probability of there being $n$ particles on the left is $p(n)=\binom{N}{n}x^n(1-x)^{N-n}$. This is a perfectly true statement. The next thing you did was to plug this into the formula $S = -\sum_i p_i \log p_i$. The result would give the right entropy if there were exactly one microscopic state with $n$ particles on the left and $N-n$ on the right. However, this is not the case: there are many such microscopic states.

Now let's see how to do the problem correctly. For the systems considered here, we assume each of the $N_s$ microscopic states is equally likely, so the probability of any given microscopic state is $p_i = 1/N_s$. Our formula for the entropy becomes $S=-\sum_i \frac{1}{N_s} \log\frac{1}{N_s} = N_s \cdot \frac{1}{N_s} \log N_s = \log(N_s).$
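This collapse to $\log(N_s)$ can be verified directly; a toy sketch with an arbitrary made-up state count:

```python
import math

# For N_s equally likely microstates, Gibbs' formula
# S = -sum_i p_i ln p_i reduces to ln(N_s)  (k_B = 1).
N_s = 10_000  # arbitrary illustrative number of microstates
p = 1.0 / N_s
S = -sum(p * math.log(p) for _ in range(N_s))
print(S, math.log(N_s))
```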

The number of microscopic states is equal to the volume of available phase space. Since there is no spatially varying potential, the total volume of available phase space can be written as a product $V_C V_P$ of the available volume in configuration space and the available volume in momentum space. Then $S=\log(N_s) = \log(V_C) + \log(V_P)$. For our problem, the volume available in momentum space does not change when the wall is removed, so its contribution to the entropy does not change; we need only consider the change in $\log(V_C)$.

What is $V_C$ initially? Well, $V_C$ for a gas of $N$ indistinguishable particles in a volume $V$ is $V^N/N!$. The $N!$ arises because the particles are indistinguishable, so two states that differ only by a permutation of the particles should not be counted separately. Initially we have two independent systems: the left system has configurational volume $V_L^{N_L}/N_L!$, and the right system has configurational volume $V_R^{N_R}/N_R!$. The total configurational volume is the product of these two, $V_C=\frac{V_L^{N_L}}{N_L!}\frac{V_R^{N_R}}{N_R!}$. Then the initial configurational entropy is $S_i = \log(V_C) = \log\frac{V_L^{N_L}}{N_L!} + \log\frac{V_R^{N_R}}{N_R!} = N_L\log V_L - \log N_L! + N_R\log V_R - \log N_R!$.

After the barrier has been removed, each particle has the full volume $V$ available to it instead of just $V_L$ or $V_R$. (The two species remain distinguishable from each other, so the factorials are unchanged.) Thus the final configurational entropy is $S_f = N_L\log V - \log N_L! + N_R\log V - \log N_R!$.

The change in entropy is (the $\log N_L!$ and $\log N_R!$ terms cancel) $S_f - S_i = N_L\log\frac{V}{V_L} + N_R\log\frac{V}{V_R} \\ = N_L\log(1/x) + N_R\log(1/(1-x))\\ = xN\log(1/x) + (1-x)N\log(1/(1-x))\\ = N\left(x\log(1/x) + (1-x)\log(1/(1-x))\right)\\ = -N\left(x\log x + (1-x)\log(1-x)\right).$

This is the answer you wanted.
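As a sanity check, the counting argument can be evaluated numerically. A sketch with illustrative values, using `math.lgamma(n + 1)` for $\ln n!$ and $k_B=1$:

```python
import math

# Illustrative values (not from the question).
V, x, N = 2.0, 0.3, 400
N_L, N_R = round(x * N), round((1 - x) * N)
V_L, V_R = x * V, (1 - x) * V

# ln of the initial configurational volume (V_L^N_L / N_L!) (V_R^N_R / N_R!);
# lgamma(n + 1) = ln(n!).
S_i = (N_L * math.log(V_L) - math.lgamma(N_L + 1)
       + N_R * math.log(V_R) - math.lgamma(N_R + 1))

# After removing the wall each particle roams the full volume V; the two
# species stay distinguishable from each other, so the factorials are unchanged.
S_f = (N_L * math.log(V) - math.lgamma(N_L + 1)
       + N_R * math.log(V) - math.lgamma(N_R + 1))

# The factorials cancel and Delta S reduces to the mixing entropy.
delta_S = S_f - S_i
S_mix = -N * (x * math.log(x) + (1 - x) * math.log(1 - x))
print(delta_S, S_mix)
```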
