[Physics] Is it possible for an electrical device to generate more heat than another using the same power or less?

thermodynamics

People are claiming that a plasma TV draws more power, which explains why it gives off more heat. (By power, I mean wattage.) Countering this: is it possible for a device to draw the same wattage, or less, and still produce more heat? Basically, can something create more heat than something else with the same or less drawn power?
Related Solutions
A ceramic heater works by convective heat transfer: the energy it consumes goes directly into increasing the kinetic energy of air molecules, which are then blown into the surrounding space. The warmth you feel from such a heater comes from the molecules in that warm air colliding with your skin; the collisions increase the random motion of the molecules in your skin.
In contrast, an infrared heater, with a tungsten filament or other resistive element that glows red hot, works by emitting electromagnetic radiation in the infrared range; most of the energy it consumes goes into producing that radiation. When the radiation strikes the molecules of your skin or another surface, it heats them by driving the molecules into higher-energy vibrational states, which is to say it increases their random motion.
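To put rough numbers on the two mechanisms, here is a minimal sketch in Python (every figure is an illustrative assumption, not a measurement of any particular heater): the radiative element's output is estimated with the Stefan-Boltzmann law, and the convective element's output from the mass flow and temperature rise of the air it blows out.

    # Illustrative comparison of radiative vs. convective power transfer.
    # All numbers below are assumptions chosen for the example.

    SIGMA = 5.670e-8     # Stefan-Boltzmann constant, W / (m^2 K^4)

    # Radiative element: a glowing filament/tube at ~1100 K (assumed)
    emissivity = 0.9
    area = 0.01          # m^2, emitting surface area (assumed)
    t_element = 1100.0   # K
    t_room = 293.0       # K
    p_radiated = emissivity * SIGMA * area * (t_element**4 - t_room**4)

    # Convective element: a fan blowing air heated by ~25 K (assumed)
    mass_flow = 0.02     # kg/s of air through the heater (assumed)
    c_p_air = 1005.0     # J / (kg K), specific heat of air
    delta_t = 25.0       # K, temperature rise of the air
    p_convected = mass_flow * c_p_air * delta_t

    print(f"radiated power:  {p_radiated:7.1f} W")   # ~743 W
    print(f"convected power: {p_convected:7.1f} W")  # ~503 W

Both routes can move comparable amounts of power; what differs is where that power is delivered first.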
Which one will feel warmer sooner depends on the circumstances. An infrared heater can be more efficient because it transfers heat directly to your skin, which is where you want it, instead of wasting energy warming air molecules that may never reach your skin at all, for example if you're in a drafty garage. On the other hand, the electromagnetic radiation emitted by an infrared heater travels in straight lines, so you won't be heated directly if there isn't an unobstructed line of sight between the heating element and your skin.
If you're measuring heater effectiveness by how long it takes to bring the air in a room up to a certain temperature, a convective heater is likely to win, because heating air is what a convective heater does directly. An infrared heater heats whatever the radiation hits, which means you could, for example, waste energy heating a wall directly instead of putting nearly all of the energy into the air.
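As a rough illustration of the "time to warm the room air" criterion, here is a back-of-the-envelope estimate (room size, heater power, and temperature rise are all assumed) for a heater that puts essentially all of its power into the air:

    # Back-of-envelope time to warm the air in a room (assumed numbers).
    room_volume = 4 * 5 * 2.5   # m^3, a small room (assumed)
    air_density = 1.2           # kg/m^3
    c_p_air = 1005.0            # J / (kg K), specific heat of air
    delta_t = 5.0               # K, desired temperature rise (assumed)
    heater_power = 1500.0       # W, typical space heater rating

    air_mass = room_volume * air_density           # kg of air in the room
    energy_needed = air_mass * c_p_air * delta_t   # J
    time_seconds = energy_needed / heater_power    # ignores wall losses

    print(f"~{time_seconds / 60:.0f} minutes, ignoring losses")  # ~3 min

Any power an infrared heater deposits in walls or furniture instead of the air simply lengthens that time.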
Clausius' statement of the second law says, roughly, that no process is possible whose sole result is the transfer of heat from a colder body to a hotter body. The device you describe goes further: it transfers heat from a colder body to a hotter body and does positive net work on the surroundings during each cycle. Although this statement of the second law is rather imprecise, the device you describe (which I will call a refrigeration engine) is indeed forbidden by it.
We already know of functioning devices that take in work and convert it entirely to thermal energy; call them, generically, heaters. So you could take the work output of your refrigeration engine and feed it into a heater sitting in the hot reservoir, further increasing the heat delivered there. You would then wind up with a combined system whose sole effect is to move heat from the colder body to the hotter one, in violation of Clausius' statement.
Now, since we know that heaters exist, and by Clausius’ statement the combined refrigeration engine/heater cannot exist, we can conclude that the refrigeration engine cannot exist.
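To make the bookkeeping explicit (a small sketch; the symbols $Q_c$, $Q_h$, and $W$ are introduced here for illustration): suppose the refrigeration engine absorbs heat $Q_c$ from the cold reservoir each cycle, rejects heat $Q_h$ to the hot reservoir, and delivers work $W > 0$ to the surroundings. The first law over one cycle gives

$$Q_c = Q_h + W.$$

Feed the work $W$ into a heater in the hot reservoir, which converts it entirely into heat. Per cycle, the hot reservoir then gains $Q_h + W = Q_c$, the cold reservoir loses $Q_c$, and nothing else has changed: heat $Q_c$ has passed from the colder body to the hotter body as the sole result, which is precisely what Clausius' statement forbids.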
Best Answer
Conservation of energy says that the net energy flowing into a system (energy in minus energy out) must equal the change in that system's internal energy. So if the system (say, your TV) remains in the same state (it doesn't keep getting warmer over time), then the power in must equal the power out.
Now, for an electrical device that gets all of its power in through the electrical supply, there are broadly three ways for power to come out: as heat, as electromagnetic radiation (visible light, radio waves), and as mechanical energy (sound, or moving air).
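As a sanity check on that bookkeeping, here is a minimal sketch (the wattages are assumptions chosen to be loosely TV-like, not measured values): whatever power does not leave as light or sound must show up as heat.

    # Steady-state energy balance for an electrical device.
    # All wattages below are assumptions for illustration.
    p_in = 300.0    # W, electrical power drawn from the wall
    p_light = 6.0   # W, visible light actually leaving the screen
    p_sound = 1.0   # W, acoustic output (a generous figure)

    # Conservation of energy: what doesn't leave as light or sound
    # ends up as heat in the device and, eventually, the room.
    p_heat = p_in - p_light - p_sound
    print(f"dissipated as heat: {p_heat:.0f} W")  # -> 293 W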
Very often, "heat" is an unwanted side effect of the conversion of electrical power into "something useful" - for example, an incandescent light bulb converts a lot of the power into heat, and only a little bit into visible light.
It is entirely possible for different devices to have different efficiencies, and therefore to produce different amounts of heat - it all depends on what fraction of the incoming electrical energy is converted into the other types of output.
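Here is that point as arithmetic, for two light sources drawing the same wall power (the efficiency figures are ballpark assumptions, not datasheet values): identical power draw, different heat output.

    # Two devices drawing identical power but with different efficiencies.
    # Efficiency figures are ballpark assumptions, not datasheet values.
    p_in = 60.0  # W, same wall power for both devices

    for name, efficiency in [("incandescent bulb", 0.05), ("LED lamp", 0.40)]:
        p_useful = p_in * efficiency   # power leaving as visible light
        p_heat = p_in - p_useful       # the remainder becomes heat
        print(f"{name}: {p_useful:.0f} W light, {p_heat:.0f} W heat")

Same wattage in, yet one device sheds roughly 57 W as heat and the other only 36 W - which is exactly the situation the question asks about.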
For the case of the plasma TV, the mechanism for generating light is somewhat inefficient - see for example http://electronics.howstuffworks.com/plasma-display2.htm . Ionizing the gas so it can emit light takes quite a lot of power - so although the plasma TV excels in its range of intensities, color, and response time, all of this comes at the expense of power consumption. By contrast, an LCD-based screen uses a relatively efficient light source (for example, LED or fluorescent backlighting) and then uses the LCD to block some or all of that light from reaching the viewer. So an LCD screen is "always on" - that is, always generating light - but for a dark scene it will absorb most of the light it generates. A plasma screen will actually use less energy for a dark scene - it simply isn't generating as much light to begin with.
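A toy model of that last point (every number here is an assumption, chosen only to mirror the qualitative behavior described above): plasma power tracks how bright the picture is, while an always-on backlight makes LCD power roughly constant.

    # Toy model of panel power vs. scene brightness (APL = average
    # picture level, 0 = all black, 1 = full white). Numbers assumed.
    def plasma_power(apl, idle=60.0, peak=340.0):
        # Plasma generates light per pixel, so draw tracks brightness.
        return idle + (peak - idle) * apl

    def lcd_power(backlight=130.0, electronics=30.0):
        # Backlight is always on; dark scenes just block the light.
        return backlight + electronics

    for apl in (0.1, 0.5, 0.9):
        print(f"APL {apl:.1f}: plasma ~{plasma_power(apl):.0f} W, "
              f"LCD ~{lcd_power():.0f} W")

On this model, the plasma draws less than the LCD for dark scenes and considerably more for bright ones - and since nearly all of that drawn power ends up as heat, the brighter-running set is also the warmer one.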