[Physics] Does the temperature of water determine how much heat will be removed from the air used to evaporate it?

cooling, evaporation, thermodynamics, water

This is a question about evaporative cooling as used in residential evaporative cooling appliances. This type of cooling draws in hot, dry ambient outside air, uses the heat in that air to evaporate water (which cools the air), and then pushes the cooled air inside. The equation to predict the temperature of the resulting air after it has given up heat to evaporate the water is:

$$T_{output} = T_{dry} - (T_{dry} - T_{wet}) \cdot \epsilon$$

where $T_{output}$ is the output air temperature, $T_{dry}$ is the dry-bulb air temperature, $T_{wet}$ is the wet-bulb air temperature, and $\epsilon$ is the cooling efficiency.

For example, on a very dry summer day (dry bulb 95 °F, wet bulb 60 °F), my evaporative cooler with 90% efficient media is capable of cooling the air to 63.5 °F.
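
As a quick check, here is a minimal Python sketch of that formula (the function name `evaporative_output_temp` is just illustrative, not from any particular library):

```python
def evaporative_output_temp(t_dry, t_wet, efficiency):
    """Predicted supply-air temperature from an evaporative cooler.

    t_dry, t_wet -- dry-bulb and wet-bulb temperatures (same unit, e.g. degrees F)
    efficiency   -- media/saturation efficiency as a fraction, 0 to 1
    """
    return t_dry - (t_dry - t_wet) * efficiency


# Example from above: 95 F dry bulb, 60 F wet bulb, 90% efficient media
print(evaporative_output_temp(95.0, 60.0, 0.9))  # prints 63.5
```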

However, this equation does not seem to take into account the temperature of the water itself. Does it matter? Intuitively, it would seem to make sense to me that hotter water would be easier to evaporate, since it's closer to its boiling point. Or maybe colder water is better because it will absorb more heat from the air? Or maybe it's a wash because the same amount of heat is required, but with hotter water, more is needed because it will evaporate faster? Help me understand this.

Best Answer

No, the temperature of the water is not that important for the performance of an evaporative cooler. This is basically because the energy needed to increase the temperature of liquid water (its specific heat capacity) is very small compared to the energy needed to evaporate the same amount of water (its enthalpy of evaporation).

At room temperature the specific heat of liquid water is 4.18 J/(g$\cdot$K), while the enthalpy of evaporation is 44.0 kJ/mol. Since the molar mass of water is roughly 18 g/mol, approximately 585 times as much energy is needed to evaporate an amount of water as to increase the temperature of the same amount of water by 1 K. So even if the water starts at freezing temperature, is heated to 40 $^\circ$C (104 $^\circ$F) and then evaporates, less than 7% of the energy absorbed by it is used for increasing the temperature.
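
Spelling that arithmetic out (my own check of the numbers above, with $\Delta H_{vap}$ the enthalpy of evaporation, $M$ the molar mass, $c$ the specific heat, and $\Delta T = 40$ K):

$$\frac{\Delta H_{vap}/M}{c} = \frac{(44{,}000\ \mathrm{J/mol})/(18\ \mathrm{g/mol})}{4.18\ \mathrm{J/(g\,K)}} \approx \frac{2440\ \mathrm{J/g}}{4.18\ \mathrm{J/(g\,K)}} \approx 585\ \mathrm{K}$$

$$\frac{c\,\Delta T}{c\,\Delta T + \Delta H_{vap}/M} \approx \frac{167\ \mathrm{J/g}}{167\ \mathrm{J/g} + 2440\ \mathrm{J/g}} \approx 6.4\%$$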

The temperature of the water might affect the rate at which evaporation occurs, but I would guess evaporative coolers are designed to bring the air to 100% relative humidity (i.e., down to the wet-bulb temperature) regardless of the temperature of the water, so this probably doesn't affect the performance.