First of all, nantennas in general don't violate the second law of thermodynamics, so they are not perpetual motion machines of the second kind. As long as the total entropy goes up, the second law is obeyed. In other words, part of the incoming heat has to heat the nantenna up, but there may still be a lot of energy left for power production, much like in any other heat engine.
The Wikipedia suggestion that nantennas could violate the second law only referred to a particular application hypothesized by Mr Novack. If he could cool the room while extracting energy from it, and if the gadget cooling the room were not connected to any colder heat bath, then it would indeed be a perpetual motion machine of the second kind, and it would be impossible.
The reason why Nature makes it impossible is kind of trivial. If the room has temperature $T$, then the nantenna or "power plant" may only be kept at the same temperature $T$ if there's equilibrium. But if that's the case, the nantenna emits thermal radiation, too. So even if it absorbs some incoming radiation, it still radiates its own. The two are balanced and the net energy gain is zero. Solar cells and "legitimate applications" of nantennas can only produce energy because they work with incoming light whose "own" temperature is higher than the temperature of the solar cell or nantenna itself. For example, solar radiation has a temperature of about 5,500 degrees Celsius.
The solar cells are effectively heat engines operating between this high temperature and the much lower temperature of the ground. The same is really true about life on Earth, too. The energy from the Sun may be converted, and often is converted, to useful energy or work because the high-energy photons from the Sun – which correspond to a high temperature and therefore a low entropy per unit energy (since $E\sim TS$, the entropy per unit energy scales as $1/T$) – are processed on Earth and the energy is finally emitted in much lower-temperature "infrared" thermal photons, which carry a higher entropy. So the entropy can go up even if a part of the incoming energy is converted to useful work. The temperature difference between the solar surface (and the solar radiation) on one hand and the cold outer space on the other is necessary for the Sun to play this often praised beneficial role.
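The heat-engine picture above puts a hard cap on how much of the incoming solar heat can become work. A minimal numerical sketch (the 5,500 °C figure from the text corresponds to roughly 5,800 K; the 300 K ground temperature is an assumed round number):

```python
# Carnot bound for a heat engine running between the temperature of solar
# radiation and the temperature of the ground (both in kelvin).
T_hot = 5800.0    # effective temperature of solar radiation, approx.
T_cold = 300.0    # rough temperature of the Earth's surface (assumed)

# At most this fraction of the absorbed solar heat can be turned into work;
# the rest must be re-emitted as lower-temperature (infrared) radiation.
eta_carnot = 1.0 - T_cold / T_hot
print(f"Carnot limit for sunlight used at Earth temperature: {eta_carnot:.1%}")
```

The bound comes out close to 95%, which is why sunlight is such a thermodynamically "high-quality" energy source despite arriving as heat.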
The second law holds on average for systems of any size, large or small. If you have an isolated contraption containing just a few atoms, and you run it through some procedure (maybe as simple as waiting 5 seconds, or maybe more complicated), there is some probability that the atoms will wind up in a lower-entropy configuration at the end of the procedure than the start. That's what your professor was referring to.
However, the probability of random entropy reduction is not high, and certainly not 100%. The key point is that the average change in entropy, upon many repetitions of the procedure, cannot be negative for any procedure.
When people hear this, they get an idea: "I'll run the procedure 100 times, and check each time whether or not the entropy got randomly lowered, and somehow I'll only use that 1 successful run to power the perpetual motion machine while throwing away the 99 unsuccessful runs." Unfortunately, it doesn't work that way! One would refer to this whole process (involving 100 runs of the original procedure) as being just one run of a different (more complicated) procedure, which also now involves extra effort / energy to check whether or not the entropy was lowered each time. (Otherwise you wouldn't know which of the 100 runs to use). That checking process creates enough entropy to undo the benefit of repeated runs.
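How unlikely are those "lucky" runs? A standard illustrative estimate (not from the answer above) is the chance that $N$ independent gas molecules all spontaneously gather in the left half of a box, which is $(1/2)^N$, i.e. $e^{-\Delta S/k_B}$ for the entropy drop $\Delta S = N k_B \ln 2$:

```python
# Probability that N independent molecules are all found in the left half
# of a box: (1/2)**N.  Tiny systems fluctuate noticeably; macroscopic ones
# (N ~ 10**23) effectively never do.
for N in (3, 10, 100):
    p = 0.5 ** N
    print(f"N = {N:>3}:  P(all left) = {p:.3e}")
```

Already at $N = 100$ the probability is below $10^{-30}$, which is why entropy-lowering fluctuations matter only for systems of a few atoms.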
This kind of stuff is commonly discussed under the heading of Maxwell's Demon.
If you have a small magnet diffusing around by Brownian motion, it will indeed transfer energy to a stationary metal nearby via eddy currents. Unfortunately, the reverse process happens at exactly the same rate: the electrons in that metal randomly jiggle, creating currents that create magnetic fields that push on the diffusing magnet, thus transferring energy from the metal to the magnet. The energy transfer rates in the two directions are equal; more specifically, as long as both parts start at the same temperature, they stay at the same temperature.
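The balance described above can be caricatured with a deliberately simple model: treat the magnet and the metal as two heat capacities exchanging energy at a rate proportional to their temperature difference (an illustrative linear relaxation model, not an eddy-current calculation). Starting at equal temperatures, the net flow is exactly zero:

```python
# Toy two-body heat exchange: net flow is proportional to the temperature
# difference, so equal temperatures persist forever (an assumed linear
# model for illustration only).
def step(T_magnet, T_metal, k=0.1):
    flow = k * (T_magnet - T_metal)   # net energy flow, magnet -> metal
    return T_magnet - flow, T_metal + flow

T1, T2 = 300.0, 300.0                 # start in mutual equilibrium
for _ in range(100):
    T1, T2 = step(T1, T2)
print(T1, T2)                         # both still 300.0
```

No matter how many steps you take, neither side heats up at the other's expense, which is the equilibrium statement in the paragraph above.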
UPDATE WITH MORE DETAILS
A better way to define entropy is to say: "We don't know exactly what the microstate (microscopic configuration) of a system is; instead, our best information is that there is a probability distribution of possible microstates." Then the entropy is $S = -k_B \sum_n p_n \log p_n$, where $p_n$ is the probability of microstate $n$.
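That definition is short enough to compute directly. A minimal sketch in units with $k_B = 1$ (the minus sign matters: each $p_n \le 1$ makes $\log p_n \le 0$, so $S$ comes out non-negative):

```python
import math

# Gibbs/Shannon entropy S = -sum_n p_n log p_n, in units with k_B = 1.
def gibbs_entropy(probs):
    return -sum(p * math.log(p) for p in probs if p > 0)

uniform = [0.25] * 4    # four equally likely microstates: maximal ignorance
certain = [1.0]         # the microstate is known exactly

print(gibbs_entropy(uniform))   # log(4), about 1.386
print(gibbs_entropy(certain))   # 0.0: no missing information, zero entropy
```

The two extremes bracket the observer-dependence discussed next: the entropy you assign is a property of *your* probability distribution, not of the molecules alone.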
Note that entropy is observer-dependent, in the sense that one observer may have more information about the probability distribution than another. In more concrete terms, a system might be disordered to one observer, but a different observer knows a "magic recipe" for undoing that disorder. For example, I could create a seemingly-unpolarized beam of light by switching the polarization of a laser every nanosecond according to a pseudo-random sequence. I know the sequence, and therefore I can use a waveplate to get back a perfectly polarized beam with no intensity losses. But for somebody who doesn't know my pseudo-random sequence, the beam really needs to be treated as unpolarized, and they cannot polarize it without losses, according to the 2nd law.
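The pseudo-random polarization example can be modeled in a few lines. The sketch below is a toy version under assumed conventions (polarizations encoded as bits 0/1, the shared seed playing the role of the "magic recipe"); it is not an optics simulation:

```python
import random

SEED = 1234  # the secret "magic recipe" (illustrative value)

def transmit(n_pulses, seed):
    """Flip the beam between two polarizations (0 = H, 1 = V) pseudo-randomly."""
    rng = random.Random(seed)
    return [rng.randint(0, 1) for _ in range(n_pulses)]

def undo(pulses, seed):
    """An observer who knows the seed regenerates the sequence and undoes it."""
    rng = random.Random(seed)
    return [p ^ rng.randint(0, 1) for p in pulses]   # XOR cancels each flip

beam = transmit(1000, SEED)
recovered = undo(beam, SEED)
print(all(p == 0 for p in recovered))   # True: perfectly polarized again
print(sum(beam) / len(beam))            # ~0.5: looks unpolarized without the seed
```

To the seed-holder the beam is zero-entropy (perfectly predictable); to anyone else the same pulses are a 50/50 mixture, exactly the observer-dependence being described.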
A configuration of a small number of molecules might randomly become "more ordered" in some sense, but that doesn't mean it has a lower entropy. Until you measure it, you don't know that it became more ordered, and therefore you cannot make use of that order. All you have is a probability distribution for what the microstate is. As time passes the probability distribution changes, and the entropy that you calculate from it either stays the same or goes up. Well, in a certain sense, it stays exactly constant, by Liouville's theorem in classical mechanics or unitarity in quantum mechanics ... but it often turns out that you wind up with useless information about the microstate, i.e. information that cannot be translated into a "magic recipe" for undoing the apparent disorder as in the polarization example above. In that case, you might as well just forget that information and accept a higher entropy. See this question.
When you make a measurement, there is a probability distribution for the possible measurement results and the possible microstates following the measurement. Some of the measurement results may leave you with a low-entropy configuration (you pretty much know what the microstate is from the measurement). But if you appropriately average over all possible measurement results, the overall entropy increases on average as a result of the full measurement process.
I might see part of the problem here. There are processes in which energy is extracted via heating from a thermal reservoir, the system does positive work on the environment, and all of the energy coming in via heating is transformed into work. There are many canonical examples in classical thermodynamics: the main one is an ideal gas undergoing an isothermal expansion.
So when you say
you are correct. This doesn't violate the Second Law at all, for the reasons you have expounded: either the system and the reservoir have the same temperature while they are exchanging energy via heat, in which case the net change in entropy is zero, or the system has a lower temperature, in which case it is straightforward to show that the system's entropy increases by more than the reservoir's entropy decreases.
So what is the actual statement of the Second Law here? It is the Kelvin–Planck statement: no process is possible whose sole result is that a system operating in a cycle absorbs heat from a single reservoir and converts all of it into work.
The operative word there is "cycle": if the system has to operate on a cycle, then the entropy increase of the system caused by heat flow from the hot thermal reservoir must be offset by an entropy decrease, as I explain in this answer. This means that the system must expel energy via heating to a cold thermal reservoir, and that is exactly the reason why a perpetual motion machine doesn't exist: some of the energy must be wasted.
This is what people talk about when they talk about perpetual motion machines of the second kind: in order to have "perpetual motion", the system must repeat its motion over and over and over again, forever. In the processes I discussed above where all of the heat is converted into work, the system doesn't reset (it doesn't operate on a cycle!), and so such a machine must eventually stop. On the other hand, if the system does reset (i.e. if it does operate on a cycle), then some of the available energy is wasted every cycle, and so the machine must eventually run down and stop.
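The per-cycle bookkeeping above can be made concrete. Over one full cycle the engine's own entropy change is zero, so the entropy dumped into the cold reservoir must at least offset what was drawn from the hot one, $Q_c/T_c \ge Q_h/T_h$, giving a minimum wasted heat $Q_{c,\min} = Q_h \, T_c/T_h > 0$. A sketch with illustrative numbers:

```python
# Entropy balance for one cycle of a heat engine between two reservoirs
# (temperatures in kelvin, heats in joules; values chosen for illustration).
T_hot, T_cold = 600.0, 300.0
Q_hot = 1000.0                        # heat drawn from the hot reservoir per cycle

Q_cold_min = Q_hot * T_cold / T_hot   # least heat that must be expelled per cycle
W_max = Q_hot - Q_cold_min            # best possible work per cycle

print(f"minimum wasted heat per cycle: {Q_cold_min:.0f} J")
print(f"maximum work per cycle:        {W_max:.0f} J")
```

Since $T_c > 0$, the wasted heat $Q_{c,\min}$ is strictly positive: a cyclic engine can never convert all of $Q_h$ into work, which is exactly why the machine of the second kind fails.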