[Physics] How to show that Entropy is a State Function

carnot-cycle, energy, entropy, thermodynamics

Intro:

Thank you for reading.

I know there are lots of questions on Stack Exchange about entropy. The following is the one that most closely addressed my confusion:

Entropy as a state property

However, I'm still a beginner, that answer went way over my head, and sadly, I'm still quite confused.

I appreciate any and all help!

Thank you.


How my Confusion Arose:

I've been watching Shankar's lectures on YouTube, in which he shows that the entropy changes of the gas in the isothermal expansion and the isothermal compression of a Carnot cycle cancel out.

I'm referring to the following lecture:

https://www.youtube.com/watch?v=ouSLRgkPzbI&list=PLFE3074A4CB751B2B&index=24


That is, $\frac{\Delta Q_H}{T_H}=-\frac{\Delta Q_C}{T_C}$. The entropy change in the upper isotherm cancels out with the entropy change in the lower isotherm.

Conceptually, that part kind of makes sense to me: at any fixed temperature, the work that we must do on the gas (and therefore the heat the gas gives off) for a little compression is proportional to that temperature, since the pressure of the gas is proportional to the temperature.

Therefore, the ratio of a small heat exchange to the temperature, $\frac{\delta Q}{T}$, in the upper isotherm will cancel with the corresponding ratio in the lower isotherm.
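To convince myself of this, I wrote a quick numeric check (my own addition, not from the lecture), assuming one mole of a monatomic ideal gas with made-up reservoir temperatures and volumes. The adiabats force the two isotherms to span the same volume ratio, which is exactly why the two contributions cancel:

```python
# Sanity check: for a monatomic ideal gas running a Carnot cycle,
# the entropy changes of the two isotherms cancel.
# All the numbers below are arbitrary, illustrative values.
import math

n, R = 1.0, 8.314          # moles, gas constant (J/(mol K))
gamma = 5.0 / 3.0          # monatomic ideal gas
T_H, T_C = 500.0, 300.0    # hot and cold isotherm temperatures (K)
V_a, V_b = 1.0, 2.0        # volumes at the ends of the hot isotherm (m^3)

# The adiabats T * V**(gamma - 1) = const fix the cold-isotherm volumes:
V_c = V_b * (T_H / T_C) ** (1.0 / (gamma - 1.0))
V_d = V_a * (T_H / T_C) ** (1.0 / (gamma - 1.0))

# Heat absorbed on each isotherm: Q = n R T ln(V_final / V_initial)
Q_H = n * R * T_H * math.log(V_b / V_a)   # > 0, absorbed from the hot reservoir
Q_C = n * R * T_C * math.log(V_d / V_c)   # < 0, dumped into the cold reservoir

print(Q_H / T_H + Q_C / T_C)              # ~ 0, up to floating-point error
```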

However, from the above, Shankar somehow concludes that the entropy change over the entire Carnot cycle must have been zero.

Then, he claims that entropy must be a state function, dependent only on the point at which the gas is on the $pV$ diagram.

I don't understand either of these claims.

  1. I don't understand how Shankar got to the conclusion that in the closed-loop of the Carnot cycle, the entropy change must've been zero. He showed that the entropy change in the lower isotherm cancelled out with the entropy change in the upper isotherm. However, he’s forgetting to pay attention to the entropy changes in the adiabatic expansions and contractions. From his explanation, I could’ve just as easily come to the conclusion that points $d$ and $a$ have the same entropy, and points $b$ and $c$ have the same entropy, and then entropy wouldn’t be a state function.

Edit – The above question (1) was answered: the entropy change in an adiabatic process is zero, since no heat is exchanged ($\delta Q = 0$), so only the two isotherms contribute. However, the following questions still hold:

  2. Additionally, after this, I don't understand how he gets to the conclusion that in any closed loop the entropy change is zero. I understand that for the Carnot cycle it was zero. But why for any closed loop?

  3. Finally, why does the fact that the entropy change around any closed loop is zero imply that entropy is a state function, so that we can assign each point on the diagram a unique entropy, even though we had not yet discovered an equation for it (at that point in time, we only seemed to have an equation for entropy changes)? Perhaps I can accept this so long as we DO move along a specific path on the $pV$ diagram. But we calculate entropy changes for processes that can't even be drawn on the $pV$ diagram, like the free expansion of a gas into a vacuum. Why are we allowed to do this? (Yes, I know, because it's a state function…but why is it a state function?)

I feel like if I understood the above, I could follow the rest of his lecture, and understand entropy quite a bit better! But…I'm super confused 😔.

Thank you.

(As for what "the rest of his lecture" is: after he deduces that entropy is a state function, he calculates the entropy change of a gas expanding into a vacuum by pretending it instead expanded along an isotherm, claiming this makes no difference to the entropy change as long as the starting point and endpoint are the same; he then shows that processes that happen spontaneously correspond to entropy increases of the universe, and from there arrives at the second law of thermodynamics.)
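(To make sure I at least see what that last calculation looks like mechanically, here is my own sketch of it, assuming one mole of an ideal gas whose volume doubles; the numbers are made up and this is not Shankar's code:)

```python
# Entropy change of an ideal gas in free expansion, evaluated along a
# replacement reversible isotherm between the same two end states.
# One mole, doubling its volume: both are illustrative assumptions.
import math

n, R = 1.0, 8.314      # moles, gas constant (J/(mol K))
V1, V2 = 1.0, 2.0      # initial and final volumes (m^3)

# Along the reversible isotherm, delta_Q = p dV = n R T dV / V, so
# dS = delta_Q / T = n R dV / V, which integrates to:
delta_S_gas = n * R * math.log(V2 / V1)
print(delta_S_gas)     # ~ 5.76 J/K for a doubling of the volume

# In the actual free expansion no heat is exchanged with the surroundings,
# so their entropy is unchanged and the entropy of the universe
# increases by this same amount.
```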


Edit – Directly Addressing the Question Linked In The Intro:

Addressing the question linked in the introduction to this post…I love the answer that @joshphysics gives…but it's going way over my head. I'm hoping that perhaps, in answering this question, people could elaborate on his answer there.

How can we show that entropy is a state function, when, at the time it was discovered as a good way to characterize the state of a gas, there didn't even seem to be a definite expression for it, only for its changes?

That is, referencing the question linked above, why is it that (and maybe it's all obvious to some…but I don't really understand it):

Fact. (Physics) $\oint_\gamma \frac{\delta Q}{T} = 0$ for any closed path $\gamma$ in thermodynamic state space.

Why does that imply that…

Claim 1. $\delta Q/T$ is conservative, namely $\int_{\gamma_1} \delta Q/T = \int_{\gamma_2} \delta Q/T$ for any two path segments $\gamma_1$ and $\gamma_2$ with the same endpoints.

Why in the world does…

Claim 2. A $1$-form (this is just a fancy term for the kind of mathematical object $\delta Q/T$ is) is conservative if and only if it is exact (exact means it can be written as the differential of a scalar function)

And how the heck does all that mean…

Desired Result. There exists a scalar function $S$ such that $dS = \delta Q/T$.
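Edit – To make these claims concrete for myself, I checked the one special case I can actually compute, the ideal gas, where $\frac{\delta Q}{T} = \frac{n c_v}{T}\,dT + \frac{nR}{V}\,dV$. This is just my own sketch (using sympy), not part of @joshphysics's answer:

```python
# Exactness check for delta_Q / T in the ideal-gas case, where
# delta_Q / T = (n c_v / T) dT + (n R / V) dV.
# A 1-form M dT + N dV (on a simply connected region) is exact iff dM/dV = dN/dT.
import sympy as sp

T, V, n, R, c_v = sp.symbols('T V n R c_v', positive=True)

M = n * c_v / T          # coefficient of dT in delta_Q / T
N = n * R / V            # coefficient of dV in delta_Q / T

print(sp.simplify(sp.diff(M, V) - sp.diff(N, T)))   # 0 -> the 1-form is exact

# And a candidate entropy S with dS = delta_Q / T:
S = n * c_v * sp.log(T) + n * R * sp.log(V)
print(sp.simplify(sp.diff(S, T) - M), sp.simplify(sp.diff(S, V) - N))   # 0 0
```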

Thank you!


Best Answer

This is more a mathematical problem than a physical one.

As I understand, you accept that $ \oint_\gamma \frac{\delta Q}{T} = 0$ if $\gamma$ is a Carnot cycle.

Now, any cycle in the thermodynamic state space can be approximated by fragments of Carnot cycles with arbitrary precision. It can be shown that for such cycles you also have $ \oint_\gamma \frac{\delta Q}{T} = 0$, because the region enclosed by the curve can be divided into many subregions whose boundaries are Carnot cycles, and $\oint_\gamma \frac{\delta Q}{T} = \sum_i \oint_{\gamma_i} \frac{\delta Q}{T}$: the contributions from the shared interior boundaries cancel, since each interior segment is traversed once in each direction.

Since the approximation can be made as good as we want, by continuity we conclude that $ \oint_\gamma \frac{\delta Q}{T} = 0$ for any cycle.
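If you want to see this numerically, here is a sketch of mine for the special case of an ideal gas, with an arbitrary smooth, non-Carnot loop in the $(V, T)$ plane and purely illustrative numbers. Using $\delta Q = n c_v\,dT + \frac{nRT}{V}\,dV$, the cyclic integral of $\delta Q/T$ indeed comes out to zero:

```python
# Numerical check that the cyclic integral of delta_Q / T vanishes for an
# arbitrary (non-Carnot) reversible cycle of an ideal gas.
# delta_Q = dU + p dV = n c_v dT + (n R T / V) dV, so
# delta_Q / T = n c_v dT / T + n R dV / V.
import numpy as np

n, R = 1.0, 8.314
c_v = 1.5 * R                          # monatomic ideal gas

# An arbitrary smooth closed loop in the (V, T) plane (illustrative numbers):
s = np.linspace(0.0, 2.0 * np.pi, 100_000)
V = 2.0 + 0.5 * np.cos(s)              # m^3
T = 400.0 + 80.0 * np.sin(s)           # K

# Discretize the line integral using midpoint values of T and V:
dV, dT = np.diff(V), np.diff(T)
V_mid, T_mid = 0.5 * (V[1:] + V[:-1]), 0.5 * (T[1:] + T[:-1])

loop_integral = np.sum(n * c_v * dT / T_mid + n * R * dV / V_mid)
print(loop_integral)                   # ~ 0, numerical error only
```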

Now, consider two curves (processes) going from a point $A$ of the state space to a point $B$; call them $\gamma_1$ and $\gamma_2$. If you follow one of the curves and then backtrack along the other, you get a cycle $\gamma$. You have $$ 0 = \oint_\gamma \frac{\delta Q}{T} =\int_{\gamma_1} \frac{\delta Q}{T} -\int_{\gamma_2} \frac{\delta Q}{T}, $$ that is, $$ \int_{\gamma_1} \frac{\delta Q}{T} =\int_{\gamma_2} \frac{\delta Q}{T}. $$

The curves $\gamma_1$ and $\gamma_2$ were chosen arbitrarily, which means that the integral $\int_\gamma \frac{\delta Q}{T}$ over a curve (process) $\gamma$ linking two states $A$ and $B$ does not depend on the specifics of the process; it can only depend on the starting and ending points.
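Here is the same path independence checked numerically for an ideal gas (my own sketch; the end states and the two paths are arbitrary illustrative choices):

```python
# Path independence of the integral of delta_Q / T for an ideal gas:
# compare a straight-line path in the (V, T) plane with a
# "constant-volume heating, then isothermal expansion" path
# between the same two states.
import numpy as np

n, R = 1.0, 8.314
c_v = 1.5 * R                          # monatomic ideal gas
V_A, T_A = 1.0, 300.0                  # state A (illustrative)
V_B, T_B = 3.0, 450.0                  # state B (illustrative)

# Path 1: straight line from A to B, integrated numerically.
lam = np.linspace(0.0, 1.0, 100_000)
V = V_A + lam * (V_B - V_A)
T = T_A + lam * (T_B - T_A)
dV, dT = np.diff(V), np.diff(T)
V_mid, T_mid = 0.5 * (V[1:] + V[:-1]), 0.5 * (T[1:] + T[:-1])
path1 = np.sum(n * c_v * dT / T_mid + n * R * dV / V_mid)

# Path 2: heat at constant V from T_A to T_B, then expand at constant T
# from V_A to V_B; both pieces integrate in closed form.
path2 = n * c_v * np.log(T_B / T_A) + n * R * np.log(V_B / V_A)

print(path1, path2)                    # the two results agree
```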

Let us choose an arbitrary state $O$. For $A$ being another state, let us define a function $$ S(A) = \int_O^A \frac{\delta Q}{T}$$ where the integration is performed over any curve linking state $O$ to state $A$ (as we have proven, it doesn't matter which one). This is what we call entropy. By construction, it is a function of state.

It can also be proven that $$ \Delta S_{AB} = S(B)-S(A) = \int_A^B \frac{\delta Q}{T}$$ where the integration is over any process from state $A$ to state $B$.

There is some arbitrariness in the definition of the entropy function (choosing state $O$) but choosing a different state $O'$ would only change the entropy function by a constant.
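For the ideal-gas case this construction can be written out explicitly (again just a sketch of mine with illustrative numbers): integrating from a reference state $O = (T_O, V_O)$ along a constant-volume segment and then a constant-temperature segment gives $S(T, V) = n c_v \ln(T/T_O) + n R \ln(V/V_O)$, and one can check directly that switching to a different reference state only shifts $S$ by a constant:

```python
# The entropy function obtained from S(A) = integral from O to A of delta_Q / T,
# written out for an ideal gas (reference states and gas data are illustrative).
import math

n, R = 1.0, 8.314
c_v = 1.5 * R                          # monatomic ideal gas

def S(T, V, T_O=300.0, V_O=1.0):
    """Entropy relative to the reference state O = (T_O, V_O)."""
    return n * c_v * math.log(T / T_O) + n * R * math.log(V / V_O)

A = (300.0, 1.0)                       # (T, V) of state A
B = (450.0, 3.0)                       # (T, V) of state B

# Delta S between two states does not depend on the reference state ...
print(S(*B) - S(*A))
print(S(*B, T_O=500.0, V_O=2.0) - S(*A, T_O=500.0, V_O=2.0))

# ... because changing O shifts S by the same constant for every state:
print(S(*A) - S(*A, T_O=500.0, V_O=2.0))
print(S(*B) - S(*B, T_O=500.0, V_O=2.0))
```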