Thus, the air molecules contribute a small portion of their kinetic energy to the paddle, which is then dissipated as heat on the other side of the partition. The air molecules on the left get colder while the air molecules on the right heat up. Doesn't this mean a decrease in entropy?
Yes it does.
However, we need to take the thermal noise of the resistor into account.
Hot resistors make noise
As discovered by John B. Johnson in 1928 and theoretically explained by Harry Nyquist, a resistor at temperature $T$ exhibits a nonzero open-circuit voltage.
This voltage is stochastic and characterized by a (single-sided) spectral density
$$S_V(f) = 4 k_b T R \frac{h f / k_b T}{\exp \left(h f / k_b T \right) - 1} \, . \tag{1}$$
At room temperature we find $k_b T / h = 6 \times 10^{12} \, \text{Hz}$, which is a ridiculously high frequency for electrical systems.
Therefore, for the loop-of-wire-and-resistor circuit in the device under consideration, every relevant frequency satisfies $h f \ll k_b T$, so we can expand the exponential to first order,
$$\exp(h f / k_b T) \approx 1 + h f /k_b T$$
so that
$$S_V(f) \approx 4 k_b T R \tag{2}$$
which we traditionally call the "Johnson noise" formula.
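To get a feel for how good this approximation is, here is a quick numerical check (a sketch with arbitrarily chosen values $f = 1\,\text{GHz}$ and $T = 300\,\text{K}$, not from the original answer):

```python
import math

h = 6.62607015e-34   # Planck constant, J*s
kb = 1.380649e-23    # Boltzmann constant, J/K

T = 300.0            # room temperature, K
f = 1e9              # 1 GHz, already very fast for a lumped circuit

x = h * f / (kb * T)                # dimensionless frequency, ~1.6e-4
nyquist_factor = x / math.expm1(x)  # the correction factor in Eq. (1)

print(f"h f / kb T = {x:.3e}")
print(f"Nyquist correction factor = {nyquist_factor:.6f}")
# The factor is within 0.01% of 1, so S_V(f) = 4 kb T R is an excellent
# approximation for any ordinary electrical circuit at room temperature.
```

Even at 1 GHz the quantum correction factor differs from 1 by less than one part in $10^4$, which is why the Johnson formula works so well in practice.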
If we short-circuit the resistor, connecting its ends by a simple wire as in the diagram, then the current noise spectral density is (just divide by $R^2$)
$$S_I(f) = 4 k_b T / R \, .\tag{3}$$
Another way to think about this is that the resistor generates a random current that is Gaussian distributed with standard deviation $\sigma_I = \sqrt{4 k_b T B / R}$, where $B$ is the bandwidth of whatever circuit is connected to the resistor.
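As a concrete illustration of Eqs. (2) and (3), here is a short calculation with made-up but typical values (a $1\,\text{k}\Omega$ resistor at room temperature seen through a $10\,\text{kHz}$ bandwidth):

```python
import math

kb = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0           # temperature, K
R = 1e3             # resistance, ohms
B = 1e4             # measurement bandwidth, Hz

S_V = 4 * kb * T * R      # Eq. (2): voltage noise density, V^2/Hz
S_I = 4 * kb * T / R      # Eq. (3): current noise density, A^2/Hz

sigma_V = math.sqrt(S_V * B)  # RMS open-circuit voltage over bandwidth B
sigma_I = math.sqrt(S_I * B)  # RMS short-circuit current over bandwidth B

print(f"sigma_V = {sigma_V:.2e} V")  # roughly 0.4 microvolts
print(f"sigma_I = {sigma_I:.2e} A")  # roughly 0.4 nanoamps
```

Note that $\sigma_I = \sigma_V / R$, as it must be for a short-circuited resistor.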
Johnson noise keeps the system in equilibrium
Anyway, the point is that the little resistor in the machine actually generates random currents in the wire!
These little currents cause the rod to twist back and forth for exactly the same reason that the twists in the rod induced by air molecules crashing into the paddles caused currents in the resistor (i.e. Faraday's law).
Therefore, the thermal noise of the resistor shakes the paddles and heats up the air.
So, while heat travels from the air on the left side to the resistor on the right, precisely the opposite process also occurs: heat travels from the resistor on the right to the air on the left.
The heat flow is always occurring in both directions.
By definition, in equilibrium the left-to-right flow has the same magnitude as the right-to-left flow and both sides just sit at equal temperature; no entropy flows from one side to the other.
Fluctuation-dissipation
Note that the resistor is both dissipative and noisy.
The resistance $R$ means that the resistor turns current/voltage into heat; the power dissipated by a resistor is
$$P = I^2 R = V^2 / R \, . \tag{4}$$
The noise is characterized by a spectral density given in Eq. (1).
Note the conspicuous appearance of the dissipation parameter $R$ in the spectral density.
This is no accident.
There is a profound link between dissipation and noise in all physical systems.
Using thermodynamics (or actually even quantum mechanics!) one can prove that any physical system which acts as a dissipator of energy must also be noisy.
The link between noisy fluctuations and dissipation is described by the fluctuation-dissipation theorem, which is one of the most interesting laws in all of physics.
The machine originally looked like it moved entropy from the left to the right because we assumed the resistor was dissipative without being noisy. As the fluctuation-dissipation theorem shows, that is entirely impossible: all dissipative systems exhibit noisy fluctuations.
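The equilibrium described above can be sketched numerically. Below is a minimal Langevin simulation (my own illustration, not part of the original answer) of a single degree of freedom with damping $\gamma$ and the matching fluctuation-dissipation noise strength $2 \gamma k_b T$; in scaled units ($m = k_b T = \gamma = 1$) the average of $v^2$ settles at the equipartition value $k_b T / m = 1$, i.e. the noise replenishes exactly what the damping dissipates:

```python
import random

random.seed(0)

kT = 1.0      # k_b T in scaled units
gamma = 1.0   # damping coefficient (the "dissipation")
m = 1.0       # mass
dt = 0.01     # time step
n_steps = 200_000

# Fluctuation-dissipation: the noise variance per step must be
# 2 * gamma * kT * dt for the system to equilibrate at temperature T.
noise_amp = (2 * gamma * kT * dt) ** 0.5

v = 0.0
v2_sum = 0.0
n_kept = 0
for step in range(n_steps):
    # Euler-Maruyama step for m dv = -gamma v dt + sqrt(2 gamma kT) dW
    v += (-gamma * v * dt + noise_amp * random.gauss(0.0, 1.0)) / m
    if step > 10_000:          # discard the initial transient
        v2_sum += v * v
        n_kept += 1

mean_v2 = v2_sum / n_kept
print(f"<v^2> = {mean_v2:.3f}  (equipartition predicts kT/m = {kT / m})")
```

If you remove the noise term, $v$ simply decays to zero (the "dissipative but not noisy" resistor we wrongly assumed); if you remove the damping, $v^2$ grows without bound. Only the matched pair holds the system at temperature $T$.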
P.S. I really, really like this question.
We can consider the whole system {ball + gas + room} to be isolated, so that its total energy is constant in time. Even then, with the total energy fixed, the system can evolve towards a macrostate of higher entropy than the initial one. Entropy is to be understood here as (the logarithm of) the number of microstates of the system {ball + gas + room} compatible with a given macrostate. The macrostate for such a system is characterised by the velocity distribution of the center of mass of the ball and the temperature of the whole system (the temperature of the ball, the temperature of the gas, and possibly the temperature of the walls of the room).
Upon colliding with the molecules of the gas and the walls, the ball gives up energy to its surroundings, roughly until it has about the same kinetic energy as a single molecule. As it gives up energy, the ball does not really lose entropy (first because its own entropy does not change much, and second because the motion of its center of mass does not contribute much to the entropy of the system as a whole), but it does increase the entropy of the gas, which now has more energy. Since the gas has more kinetic energy, more microstates are compatible with its new energy state, and thus the entropy of the gas is greater than before.
Now to comment on the point:

> In the end, the ball will have lost all its kinetic energy and will be in thermal balance with the room. It has lost all the entropy it could have lost, and that is the reason why it doesn't keep acting. If it had some entropy to lose, it would definitely keep doing something (everything happens because everything wants to provide entropy to the universe, and it cannot say no until it has completely been robbed of its entropy?).
I am not sure this is the right way of formulating it. The ball on its own does not have to lose entropy for the entropy of the universe to increase. Entropy is not something that is conserved but something that gets created. It just so happens that a ball of mass $m_B$ moving at velocity $\vec{v}_B$ such that $m_B ||\vec{v}_B|| \gg \sqrt{m k_B T}$ (where $m$ is the mass of a molecule) is bound to create a situation that opens up states of motion for the gas and wall molecules that were inaccessible before the ball released its energy; that is how entropy is created in this case.
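To get a feel for how extreme the inequality $m_B ||\vec{v}_B|| \gg \sqrt{m k_B T}$ is, here is a back-of-the-envelope check with made-up but representative numbers (a 10 g ball at 1 m/s in room-temperature nitrogen; these values are my own, not from the original answer):

```python
import math

kB = 1.380649e-23      # Boltzmann constant, J/K
T = 300.0              # room temperature, K
m_molecule = 4.65e-26  # mass of an N2 molecule, kg

m_ball = 0.010         # a 10 g ball
v_ball = 1.0           # moving at 1 m/s

p_ball = m_ball * v_ball                    # ball momentum, kg m/s
p_thermal = math.sqrt(m_molecule * kB * T)  # thermal momentum scale of one molecule

print(f"ball momentum    = {p_ball:.2e} kg m/s")
print(f"thermal momentum = {p_thermal:.2e} kg m/s")
print(f"ratio            = {p_ball / p_thermal:.2e}")
# The ratio is of order 10^20: even a slow macroscopic ball carries
# enormously more momentum than the thermal scale of a single molecule,
# so releasing its energy opens up vastly more gas microstates.
```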
Best Answer
To differentiate itself from its surroundings, any living organism (no matter how simple) must decrease its entropy. Or, at least, it must ensure that its entropy increases more slowly than that of its surroundings. This takes energy, which generates heat. The organism must excrete this heat into its surroundings. And this means that the total entropy of the organism plus its environment increases, so the second law of thermodynamics is not broken.
We know that the presence of living organisms has significantly affected the entropy of the Earth’s atmosphere throughout its history. However, it is not clear whether all the living organisms on Earth are sufficient to significantly affect the entropy of the whole Earth. Remember the Earth is pretty big: it is roughly a billion times as massive as the global biomass. That’s an awful lot of entropy.