[Physics] Do microwaves (photons) lose energy as they heat

dissipation, electromagnetic-radiation, microwaves

I understand that microwaves are able to heat water because their alternating electromagnetic fields flip water molecules (dipoles) back and forth, which increases the water's kinetic energy (thus increasing its temperature).
Doesn't that mean the microwave photons are losing energy, by the law of conservation of energy?

In other words, if I took a single microwave photon (~30 cm wavelength) and passed it through a cup of water, would it come out the other side with a longer wavelength (because it lost energy interacting with the H2O molecules)?

Best Answer

First, let's calculate the energy per photon for 30 cm radiation:

$$E = \frac{hc}{\lambda} = \frac{(6.62\times10^{-34}\ {\rm Js})(3\times10^8\ {\rm m/s})}{30\times10^{-2}\ {\rm m}} = 6.62\times10^{-25}\ {\rm J}$$
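As a quick check, this photon energy can be reproduced numerically (values as in the answer, with Planck's constant taken to a slightly higher precision):

```python
# Photon energy E = h*c/lambda for 30 cm microwave radiation.
h = 6.626e-34      # Planck's constant, J*s
c = 3.0e8          # speed of light, m/s
wavelength = 0.30  # 30 cm, in meters

E_photon = h * c / wavelength
print(f"E = {E_photon:.2e} J")  # on the order of 6.6e-25 J
```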

That means that a 1000 W microwave oven is producing in the neighborhood of $1.5\times10^{27}$ photons per second (or $6.1\times10^{26}$ photons per second at the more typical microwave oven wavelength of 12 cm), and the energy associated with each individual photon is minuscule.
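The photon rates quoted above follow directly by dividing the oven's power by the energy per photon; a short sketch of that arithmetic (the 1000 W figure and both wavelengths are from the text):

```python
# Photon emission rate: power divided by energy per photon.
h, c = 6.626e-34, 3.0e8  # Planck's constant (J*s), speed of light (m/s)
power = 1000.0           # oven power, W

for wavelength in (0.30, 0.12):  # 30 cm and the more typical 12 cm
    E_photon = h * c / wavelength
    rate = power / E_photon
    print(f"{wavelength * 100:.0f} cm: {rate:.1e} photons/s")
```

This reproduces the $\sim10^{27}$ photons-per-second scale of the answer (small differences in the last digit come from rounding the constants).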

When the water absorbs energy from the electromagnetic wave, it absorbs whole photons rather than taking only part of the energy from a photon. The vast number of photons involved in the interaction means that the energy in the EM wave can essentially change continuously (the water can absorb 10% of the energy or 10.01% or 10.10203910% of the energy without having to only "partially absorb" a photon).
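To put a number on how nearly continuous this is: absorbing or not absorbing a single photon changes one second's worth of the beam's energy by about one part in $10^{27}$, far below anything measurable. A minimal sketch of that estimate, using the same values as above:

```python
# Fractional "graininess" of the energy: one photon relative to
# the total number of photons emitted in one second.
h, c = 6.626e-34, 3.0e8
E_photon = h * c / 0.30              # J per 30 cm photon
photons_per_second = 1000.0 / E_photon
step = 1.0 / photons_per_second      # smallest fractional change per second
print(f"one photon is a fraction {step:.1e} of one second's energy")
```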