The third law does not say that "if the entropy of a system approaches a minimum, its temperature approaches absolute zero." It says that if the temperature approaches absolute zero, the entropy approaches a minimum. These two statements are logical converses of one another.
The second law of thermodynamics says that the entropy of a closed system can only increase, so if the early universe had been in a state of maximum entropy, then the cosmos would have experienced its heat death immediately after being born. This contradicts the observation that the present universe contains burning stars, heat engines, and life. These observations imply that the early universe was in a very low-entropy state, which shows that its initial conditions were extremely finely tuned. The reasons for this fine-tuning are not explained by general relativity or the standard model. Adding inflation to the model does not cure this fine-tuning problem.[Penrose 2005]
These ideas are strongly counterintuitive to most people, since we picture the early universe as an undifferentiated soup of hot gas, very much like what we might imagine a heat-dead universe to be like. In what way is the early universe not equilibrated?
We observe that the cosmic microwave background radiation's spectrum is a blackbody curve, which would normally be interpreted as evidence of thermal equilibrium. However, this observation only really tells us that the matter degrees of freedom were in thermal equilibrium. The gravitational degrees of freedom were not. In standard cosmological models, which are constructed to be as simple as possible, there are no gravitational waves. Although the real universe presumably does have gravitational waves in it, they are apparently very weak. In a maximum-entropy universe, the gravitational modes would be equilibrated with the matter degrees of freedom, and they would be very strong, as in models like Misner's mixmaster universe.[Misner 1969]
Even in Newtonian mechanics, gravitating systems violate most people's intuition about entropy. If we psssssht a bunch of helium atoms into a box through an inlet valve, they will quickly reach a maximum-entropy state in which their density is nearly constant everywhere. But in an imaginary Newtonian "box" full of gravitating particles, the maximum-entropy state is one in which the particles have all glommed onto each other in a single blob. This is because of the attractive nature of the gravitational force.
Charles W. Misner, "Mixmaster Universe", Physical Review Letters 22 (1969) 1071. http://astrophysics.fic.uni.lodz.pl/100yrs/pdf/07/036.pdf
Roger Penrose, 2005 talk at the Isaac Newton Institute, http://www.newton.ac.uk/webseminars/pg+ws/2005/gmr/gmrw04/1107/penrose/
Thus, the air molecules transfer a small portion of their kinetic energy to the paddle, and that energy is then dissipated as heat on the other side of the partition, making the air molecules on the left colder while the air molecules on the right heat up. Doesn't this mean a decrease in entropy?
Yes it does.
However, we need to take the thermal noise of the resistor into account.
Hot resistors make noise
As discovered by John B. Johnson in 1928 and explained theoretically by Harry Nyquist, a resistor at temperature $T$ exhibits a non-zero open-circuit voltage.
This voltage is stochastic and characterized by a (single sided) spectral density
$$S_V(f) = 4 k_b T R \frac{h f / k_b T}{\exp \left(h f / k_b T \right) - 1} \, . \tag{1}$$
At room temperature we find $k_b T / h \approx 6 \times 10^{12} \, \text{Hz}$, which is a ridiculously high frequency for electrical systems.
Therefore, for the loop-of-wire-and-resistor circuit in the device under consideration, we have $f \ll k_b T / h$, so we can expand the exponential to first order,
$$\exp(h f / k_b T) \approx 1 + h f /k_b T$$
so that
$$S_V(f) \approx 4 k_b T R \tag{2}$$
which we traditionally call the "Johnson noise" formula.
If we short-circuit the resistor, as in the diagram where its ends are connected by a simple wire, then the current noise spectral density is (just divide $S_V$ by $R^2$)
$$S_I(f) = 4 k_b T / R \, .\tag{3}$$
Another way to think about this is that the resistor generates a random current which is Gaussian distributed with standard deviation $\sigma_I = \sqrt{4 k_b T B / R}$, where $B$ is the bandwidth of whatever circuit is connected to the resistor.
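To get a feel for the numbers, here is a quick sketch evaluating Eqs. (1)–(3). The values of $R$, $T$, $B$, and $f$ are my own illustrative choices, not from the original:

```python
import math

kB = 1.380649e-23   # Boltzmann constant, J/K
h = 6.62607015e-34  # Planck constant, J*s

T = 300.0   # temperature, K (assumed)
R = 1e3     # resistance, ohms (assumed)
B = 10e3    # measurement bandwidth, Hz (assumed)
f = 1e6     # a typical electrical frequency, Hz (assumed)

# Quantum (Planck) correction factor from Eq. (1); it is ~1 for f << kB*T/h
x = h * f / (kB * T)
planck_factor = x / math.expm1(x)

S_V = 4 * kB * T * R * planck_factor     # voltage noise PSD, V^2/Hz, Eq. (1)
v_rms = math.sqrt(S_V * B)               # rms voltage within bandwidth B
sigma_I = math.sqrt(4 * kB * T * B / R)  # rms current for the shorted resistor, Eq. (3)

print(f"kB*T/h = {kB*T/h:.2e} Hz")        # ~6e12 Hz, as quoted above
print(f"Planck factor at 1 MHz = {planck_factor:.9f}")  # ~1, justifying Eq. (2)
print(f"v_rms = {v_rms*1e6:.2f} uV")
print(f"sigma_I = {sigma_I*1e9:.2f} nA")
```

For these values the noise comes out to a few tenths of a microvolt and a few tenths of a nanoampere: tiny, but not zero.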
Johnson noise keeps the system in equilibrium
Anyway, the point is that the little resistor in the machine actually generates random currents in the wire!
These little currents cause the rod to twist back and forth for exactly the same reason that the twists in the rod induced by air molecules crashing into the paddles caused currents in the resistor (i.e. Faraday's law).
Therefore, the thermal noise of the resistor shakes the paddles and heats up the air.
So, while heat travels from the air on the left side to the resistor on the right, precisely the opposite process also occurs: heat travels from the resistor on the right to the air on the left.
The heat flow is always occurring in both directions.
By definition, in equilibrium the left-to-right flow has the same magnitude as the right-to-left flow and both sides just sit at equal temperature; no entropy flows from one side to the other.
Fluctuation-dissipation
Note that the resistor is both dissipative and noisy.
The resistance $R$ means that the resistor turns current/voltage into heat; the power dissipated by a resistor is
$$P = I^2 R = V^2 / R \, . \tag{4}$$
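The two expressions in Eq. (4) are equivalent via Ohm's law, $V = IR$; a trivial numerical check (with arbitrary illustrative values):

```python
I = 2.0e-3   # current, A (arbitrary illustrative value)
R = 1.0e3    # resistance, ohms (arbitrary illustrative value)
V = I * R    # Ohm's law

P_from_I = I**2 * R   # power from the current form of Eq. (4)
P_from_V = V**2 / R   # power from the voltage form of Eq. (4)
print(P_from_I, P_from_V)  # identical: 0.004 W each
```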
The noise is characterized by a spectral density given in Eq. (1).
Note the conspicuous appearance of the dissipation parameter $R$ in the spectral density.
This is no accident.
There is a profound link between dissipation and noise in all physical systems.
Using thermodynamics (or actually even quantum mechanics!) one can prove that any physical system which acts as a dissipator of energy must also be noisy.
The link between noisy fluctuations and dissipation is described by the fluctuation-dissipation theorem, which is one of the most interesting laws in all of physics.
The machine originally looked like it moved entropy from left to right because we assumed the resistor was dissipative without being noisy. As the fluctuation-dissipation theorem explains, that is impossible: every dissipative system exhibits noisy fluctuations.
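A minimal numerical sketch of this balance (my own toy model, not from the original): short the resistor through an inductor $L$, so that $L \, \dot I = -R I + V_n(t)$, where $V_n$ is the Johnson noise of Eq. (2). The fluctuation-dissipation theorem then guarantees equipartition, $\tfrac12 L \langle I^2 \rangle = \tfrac12 k_b T$, whatever the values of $R$ and $L$:

```python
import numpy as np

kB = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0          # temperature, K (assumed)
R = 1.0            # resistance, ohms (toy value)
L = 1e-3           # inductance, henries (toy value)

rng = np.random.default_rng(0)
dt = 1e-5           # time step, s (much smaller than L/R = 1 ms)
n_steps = 400_000   # total time = 4 s = 4000 relaxation times

# Euler-Maruyama integration of L dI/dt = -R I + V_n(t), where V_n is
# white noise with single-sided spectral density S_V = 4 kB T R (Eq. 2).
I = np.zeros(n_steps)
kicks = rng.standard_normal(n_steps) * np.sqrt(2 * kB * T * R * dt) / L
for n in range(1, n_steps):
    I[n] = I[n-1] * (1 - R * dt / L) + kicks[n]

I2_avg = np.mean(I[n_steps // 10:]**2)  # discard the initial transient
print(I2_avg, kB * T / L)  # both ~4.1e-18 A^2: equipartition holds
```

The dissipation (the $-RI$ drag) removes exactly as much energy, on average, as the noise kicks in; turn either one off and the steady state is no longer thermal.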
P.S. I really, really like this question.
Best Answer
Entropy can be understood as the degree to which a system's microstate (the details of exactly what all its component parts are doing) is not fixed by the constraints imposed by the system's surroundings. In the case of a gas (which is a good way to think of the early universe, ignoring the fact that it is really a plasma) the microstate is fixed by specifying the combination of position and momentum for each particle. We can then measure the entropy by the area of the region of a momentum/position plot that is filled by the possible states of the motion of all the particles. The following diagram shows four example cases. In each plot, the ellipse is intended to show, approximately, the range of position and momentum values of the gas particles.
I have labelled each example with a comment on temperature $T$ and density $\rho$. A low spread of momentum ($p$) values indicates a low temperature. A low spread of position ($x$) values indicates a high density. The area of the ellipse indicates the entropy. The four cases are self-explanatory (I hope).
It is true that, other things being equal, a high temperature will lead to a high entropy, because the range of $p$ values increases. However, this does not always happen, because a high density brings the range of $x$ values down, and this lowers the entropy again.
The early universe is an example of case D.
The subsequent evolution of the universe is largely a movement from case D to case C. This is called an adiabatic expansion, in which the entropy does not change, even though the temperature falls.
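One can check this claim numerically using the Sackur-Tetrode entropy of an ideal monatomic gas (my own illustrative check, not part of the original answer; the particle mass and volumes are made-up values). Under an adiabatic expansion, $T \propto V^{-2/3}$, and the entropy per particle is unchanged even though $T$ falls:

```python
import math

kB = 1.380649e-23   # Boltzmann constant, J/K
h = 6.62607015e-34  # Planck constant, J*s
m = 1.67e-27        # particle mass, kg (roughly a hydrogen atom; assumed)

def entropy_per_particle(v, T):
    """Sackur-Tetrode entropy per particle (in units of kB) for an ideal
    monatomic gas; v = V/N is the volume per particle."""
    lam = h / math.sqrt(2 * math.pi * m * kB * T)  # thermal de Broglie wavelength
    return math.log(v / lam**3) + 2.5

v1, T1 = 4e-26, 300.0        # initial volume per particle and temperature (assumed)
v2 = 8 * v1                  # expand the volume eightfold...
T2 = T1 * (v1 / v2)**(2/3)   # ...adiabatically: T drops by a factor of 4

s1 = entropy_per_particle(v1, T1)
s2 = entropy_per_particle(v2, T2)
print(s1, s2)  # equal: the expansion is isentropic even though T fell
```

The growth of the $x$ range exactly compensates the shrinkage of the $p$ range, which is just the statement that the phase-space area (case D to case C) is preserved.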
To summarise, there is no general rule that hot things must have either high or low entropy when they can also be dense. High density tends to bring the entropy down.
Finally, a comment on the early universe. There is no need to bring in the word 'infinity'. All we know is that the very early universe was very hot and very dense. At sufficiently early times it was in a parameter regime where all our knowledge of physics runs out, but this does not mean we know it was infinite in any respect (whether in terms of kinetic energy or density or volume or other such parameters).