Energy – Is Using a Heating Element Instead of Doing Useful Work Wasteful?

Tags: computational-physics, energy

Consider a computer CPU consuming electrical energy to perform calculations and consequently emitting heat.

Assumption: a CPU consuming x watts of power emits the same amount of heat as a heating element that consumes the same amount of power.

The rest of the question assumes that this is true: the energy has to go somewhere, and I can't see that the outcome of the calculations counts as "somewhere".

I've long repeated the claim that "wasteful" appliances such as incandescent lightbulbs, over-full kettles, computers left running, etc. are not wasting power if we make the assumptions:

  • the appliance is in a room which we aim to maintain at above a certain temperature, warmer than the external temperature (cf. a home in winter)
  • the room leaks heat to the outside faster than the "wasteful" appliances emit heat.
  • the room contains an electric heater
  • the electric heater is regulated by a thermostat

In these circumstances, I think, if a light bulb produces excess heat, its only effect is to cause the thermostat to switch the heater off sooner, so whatever energy you've "wasted" in the light bulb is "saved" at the heater.
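To make that concrete, here's a minimal sketch of the energy balance with made-up numbers for the room's heat loss (the specific wattages are assumptions purely for illustration):

```python
# Minimal sketch of the thermostat energy balance (illustrative numbers only).
# Assumption: the room loses heat_loss_w watts to the outside at the target
# temperature, and the thermostat modulates the heater to make up the balance.

def heater_power_needed(heat_loss_w: float, appliance_heat_w: float) -> float:
    """Heater output the thermostat must supply, given other heat sources in the room."""
    return max(heat_loss_w - appliance_heat_w, 0.0)

heat_loss_w = 1500.0  # room leaks 1.5 kW to the outside (assumed)
for bulb_w in (0.0, 60.0, 200.0):
    heater_w = heater_power_needed(heat_loss_w, bulb_w)
    total_w = heater_w + bulb_w
    print(f"bulb {bulb_w:5.0f} W -> heater {heater_w:6.0f} W, total input {total_w:.0f} W")

# Total electrical input is 1500 W in every case: the bulb's "waste" heat is
# exactly offset by the heater switching off sooner.
```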

I don't think that's controversial.

But my question is:

  • If a CPU does useful work and emits all the input energy as heat as a side effect
  • But a heating element emits the same heat, without doing useful work
  • Is it therefore wasteful to use heating elements? Is it more efficient to (for example) run folding@home on a CPU, to get "free" computation as a side-effect of heating?

One reason to heat a room with a heating element rather than with CPUs is that the heating element is much cheaper to produce. But ignoring the initial cost of the hardware, would a "heater" made of 20 CPUs, each running at full capacity and consuming 50 W, output any less heat than a 1 kW element that does nothing else useful?

To be useful, of course, the CPUs would need to output the result of their computations, which would account for a tiny amount of energy. Let's assume that's negligible.
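As a sanity check on the arithmetic, here's a short sketch assuming both devices dump essentially all of their electrical input into the room as heat (and ignoring the negligible energy carried out by the results, as above):

```python
# Quick check: 20 CPUs at 50 W each vs. a single 1 kW heating element.
# By conservation of energy, essentially all electrical input ends up as heat
# in the room either way.

cpu_count = 20
cpu_power_w = 50.0
element_power_w = 1000.0

cpu_heat_w = cpu_count * cpu_power_w   # 1000 W of heat, plus useful computation
print(cpu_heat_w, element_power_w)     # 1000.0 1000.0 -- same heat output, same running cost
```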

I'm sure there's a flaw in there somewhere; probably that the assumption above is false. What have I missed?

Best Answer

There's no flaw in your argument. A computer heats the room just as effectively as an electric heater of the same power and you could use the computer to do something useful (Bitcoin mining?) while it's heating your room.

There are some practical considerations, though I think these have been sufficiently discussed in the comments. Computers would make for noisy and bulky heaters, though anyone who has had to refrigerate a server room will tell you they can be very effective heaters.

If you want to be really, really pedantic, the computer may not be quite as effective as a heater, because an infinitesimally small amount of its power may go into the data on its disk drive. See Is a hard drive heavier when it is full? for a discussion of this, but note that I mention this for its amusement value rather than because I seriously believe it's an issue.
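For a rough sense of how small that effect is, here's a back-of-envelope estimate of my own (not from the linked question), using the Landauer limit of k_B·T·ln 2 per bit as the minimum energy associated with setting a bit, and a hypothetical 1 TB drive written completely full:

```python
# Back-of-envelope scale estimate (assumption: Landauer limit per bit, 1 TB of data).
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
T = 300.0               # room temperature, K (assumed)
bits = 8 * 1e12         # a hypothetical 1 TB drive, written completely full

energy_j = k_B * T * math.log(2) * bits
print(f"{energy_j:.2e} J")   # ~2e-8 J, utterly negligible next to the kWh scale of room heating
```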
