Thermodynamics – How Temperature Changes Affect Entropy in a System

Tags: entropy, heat-engine, temperature, thermodynamics

I was trying to understand why there is an increase in entropy in a heat engine. If heat (lower-quality energy) is converted to work (higher-quality energy), even if only partially, why wouldn't entropy decrease?!

And I found a website that explained (or at least, what I gathered from it was) that high-temperature heat, i.e. heat from a higher-temperature source, has more quality, a greater ability to be turned into work, than heat from a lower-temperature source. [I realize it is most likely wrong to state it like this, but that is the reasoning I made.]

And so, high-quality heat would come into the machine; some would be turned into work and the rest would leave as much lower-quality heat, so that the resulting balance would be an increase in entropy: less entropy leaving with the work, but much more leaving with the lower-quality heat, resulting in more entropy overall.
(Whatever other system the work was done on would have its entropy decreased, gaining mechanical energy, but the surroundings would end up with much higher entropy: lower-quality thermal energy at a lower temperature.)
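Here is a minimal numerical sketch of that bookkeeping (the temperatures, heat, and work below are made-up illustrative values of my own, not from the source I read):

```python
# Entropy bookkeeping for a heat engine (illustrative, made-up values).
# The engine draws Q_hot from a hot reservoir, delivers work W, and
# rejects Q_cold = Q_hot - W to the cold surroundings.
T_hot, T_cold = 500.0, 300.0   # reservoir temperatures, in kelvin
Q_hot = 1000.0                 # heat drawn from the hot reservoir, in joules
W = 300.0                      # work extracted (below the Carnot limit of 400 J)
Q_cold = Q_hot - W             # heat rejected to the cold reservoir

dS_hot = -Q_hot / T_hot        # the hot reservoir loses entropy
dS_cold = Q_cold / T_cold      # the cold reservoir gains entropy (work carries none)
dS_total = dS_hot + dS_cold

print(f"dS_hot   = {dS_hot:+.3f} J/K")   # -2.000 J/K
print(f"dS_cold  = {dS_cold:+.3f} J/K")  # +2.333 J/K
print(f"dS_total = {dS_total:+.3f} J/K") # +0.333 J/K: a net increase
```

The heat rejected at low temperature creates more entropy than was removed from the hot source, which is exactly the balance described above.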

Here’s the picture they have:

[Image: entropy and the heat engine, from energyeducation.ca; see the reference below.]

Which made sense to me because at lower temperatures energy is much more “spread” and not so “concentrated”.

But then I read somewhere else (and in many other places too, so this must be correct) that: "I understand that as temperature increases, entropy increases as well, as there are more quanta of energy and more thermal states (energy levels) available."

Which also makes sense to me: a macrostate has more entropy if it has a greater number of microstates associated with it. Higher temperature -> more possible energy levels -> more possible microstates -> higher entropy.
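To make that chain concrete, here is a small sketch of my own (not from either source) that counts microstates for a toy model: an Einstein solid of N oscillators sharing q energy quanta, whose multiplicity is the binomial coefficient C(q + N - 1, q):

```python
from math import comb, log

def multiplicity(N, q):
    # Einstein solid: the number of ways to distribute q indistinguishable
    # quanta among N oscillators is C(q + N - 1, q).
    return comb(q + N - 1, q)

N = 10  # number of oscillators in the toy system
for q in (0, 5, 10, 20, 40):
    omega = multiplicity(N, q)
    # Boltzmann entropy S = k_B * ln(Omega), shown here in units of k_B.
    print(f"q = {q:2d}  microstates = {omega:15,d}  S/k_B = {log(omega):6.2f}")
```

More energy to distribute (which goes along with higher temperature) means more microstates and therefore higher entropy, just as the quote says.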

So how can I make sense of both things? I get the feeling they are both right in some way and I just don’t know how to connect them.
Does entropy increase with a decrease or with an increase in a system’s temperature?

I apologize for the long question, for the most likely wrong statements in it, and if it would have been easy to just "google it by myself". I have already read some other questions here, but I haven't understood it yet. Thank you.

References:

To the image: http://energyeducation.ca/encyclopedia/Entropy

To the quote: https://www.physicsforums.com/threads/effect-of-temperature-on-entropy.517807/

Best Answer

Change in entropy is the heat added divided by the temperature at which it is added: dS = dQ/T. So a lower temperature does mean less total entropy in a given system, but a higher temperature means less entropy gained per unit of energy added. All else being equal, adding heat to a cold object increases entropy more than adding it to a hot one.

Say you have a cold reservoir at 100 K and a hot one at 500 K, each with one unit of heat capacity. Let heat flow from the hot reservoir to the cold one, without extracting any work, until they are both at 300 K. For a constant heat capacity, entropy goes as the logarithm of temperature, so you started with log(100) + log(500) units of entropy and ended with 2 log(300), an increase of about 0.255 (using base-10 logarithms). Most of the energy started in the hot reservoir, where it "counted less" toward entropy, and moved to the cold reservoir, where it "counts more".
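Spelling that arithmetic out (a sketch; base-10 logarithms to match the 0.255 figure above):

```python
from math import log10, sqrt

C = 1.0                        # heat capacity of each reservoir (one unit)
T_cold, T_hot = 100.0, 500.0   # initial temperatures, in kelvin

# Direct thermal contact, no work extracted: energy conservation puts the
# final common temperature at the arithmetic mean.
T_final = (T_cold + T_hot) / 2              # 300 K

# For constant heat capacity, S = C * log(T) + const, so only differences
# of logs matter; base-10 logs reproduce the 0.255 quoted above.
S_initial = C * (log10(T_cold) + log10(T_hot))
S_final = 2 * C * log10(T_final)
print(f"entropy increase (no work): {S_final - S_initial:.3f}")  # ~0.255

# A reversible engine run between the reservoirs instead leaves the total
# entropy unchanged, which forces the meeting point to the geometric mean;
# the energy difference leaves the system as work.
T_rev = sqrt(T_cold * T_hot)
print(f"reversible meeting point: {T_rev:.1f} K")  # ~223.6 K
```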

Another way of thinking about it: for energy, it's the sum that matters, but for microstates, it's the product. The total number of microstates of the combined system is not the sum of the microstates of the hot and cold reservoirs but their product: if the cold reservoir has n microstates and the hot one has m, there are n*m possible combinations. So when energy moves from the hot reservoir to the cold one, the hot reservoir's microstate count shrinks by a small factor while the cold reservoir's grows by a larger one, and the product, the total number of microstates, increases.
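A toy demonstration of that product rule (a sketch of my own; Einstein solids, where N oscillators sharing q quanta have C(q + N - 1, q) microstates, stand in for the two reservoirs): move one quantum of energy from the "hot" solid, the one holding many quanta, to the "cold" one, and the product of microstate counts grows.

```python
from math import comb

def multiplicity(N, q):
    # Einstein solid: C(q + N - 1, q) microstates for q quanta among N oscillators.
    return comb(q + N - 1, q)

N = 50                     # oscillators in each solid
q_cold, q_hot = 10, 90     # the "cold" solid holds few quanta, the "hot" one many

before = multiplicity(N, q_cold) * multiplicity(N, q_hot)
after = multiplicity(N, q_cold + 1) * multiplicity(N, q_hot - 1)

print(f"total microstates before: {before:.3e}")
print(f"total microstates after : {after:.3e}")
print(f"ratio after/before      : {after / before:.2f}")  # > 1: entropy went up
```

Moving the quantum shrinks the hot solid's count by a factor of 90/139 while growing the cold solid's by a factor of 60/11, so the product increases by a factor of about 3.5.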