No, the temperature of the water is not that important for the performance of an evaporative cooler. This is basically because the energy needed to raise the temperature of liquid water (governed by its specific heat capacity) is very small compared to the energy needed to evaporate the same amount of water (its enthalpy of evaporation).
At room temperature the specific heat of liquid water is 4.18 J/(g$\cdot$K), while the enthalpy of evaporation is 44.0 kJ/mol. Since the molar mass of water is roughly 18 g/mol, approximately 585 times as much energy is needed to evaporate an amount of water as to raise the temperature of the same amount by 1 K. So even if the water starts at freezing temperature, is heated to 40 $^\circ$C (104 $^\circ$F) and then evaporates, less than 7% of the energy it absorbs is used for raising its temperature.
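For reference, the arithmetic behind those two figures is
$$\frac{44.0\ \text{kJ/mol}}{18\ \text{g/mol}} \approx 2.44\ \text{kJ/g}, \qquad \frac{2440\ \text{J/g}}{4.18\ \text{J/(g}\cdot\text{K)} \times 1\ \text{K}} \approx 585,$$
and for the 40 K temperature rise mentioned above the heating share is
$$\frac{40\ \text{K} \times 4.18\ \text{J/(g}\cdot\text{K)}}{40\ \text{K} \times 4.18\ \text{J/(g}\cdot\text{K)} + 2440\ \text{J/g}} \approx 6.4\% < 7\%.$$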
The temperature of the water might affect the rate at which evaporation occurs, but I would guess evaporative coolers are designed to saturate the air (100% relative humidity) regardless of the temperature of the water, so this probably doesn't affect the performance.
Firstly, it would be better to use actual, accurately measured numbers rather than human 'experience': the human body is a poor thermometer and the mind plays tricks on us.
But that hot water droplets lose heat and thus cool down in cooler air is an established fact and a consequence of the laws of thermodynamics.
Regarding your first three bullet points: despite the limitations you point out, those do not mean a hot droplet of water doesn't cool in air. It does, and partly in accordance with Newton's law of cooling.
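For reference, Newton's law of cooling says the convective heat-loss rate is proportional to the temperature difference between the droplet and the surrounding air,
$$\frac{dT}{dt} = -k\,\bigl(T - T_{\text{air}}\bigr),$$
so the droplet relaxes exponentially towards the air temperature, with the positive constant $k$ depending on the droplet's size, surface area and the airflow around it.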
As regards work done by the droplet (overcoming the viscous drag), if anything that would lead to heat generation, not cooling (but the effect is truly minuscule).
The kinetic or potential energy of the droplet has no effect on the droplet's temperature. Temperature is a measure of the average kinetic energy of the random motion of the water molecules, and that is not affected by the bulk motion of the droplet. Spinning water in an ultra-centrifuge does not make its temperature rise, for instance.
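Quantitatively, kinetic theory relates temperature to the average translational kinetic energy of the molecules,
$$\left\langle \tfrac{1}{2} m v^2 \right\rangle = \tfrac{3}{2} k_B T,$$
where the average is taken over the random thermal motion of individual molecules, not over any bulk (centre-of-mass) motion of the droplet.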
You have, however, overlooked one major cause of heat loss: evaporation. Your shower 'steams up' because hot water evaporates, and that costs energy, known as the enthalpy of vaporisation.
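As a rough back-of-the-envelope illustration (taking the enthalpy of vaporisation as about 2.4 kJ/g and the specific heat as 4.18 J/(g$\cdot$K), with the temperatures chosen purely for illustration): cooling shower water from 40 $^\circ$C to a 22 $^\circ$C room requires removing
$$18\ \text{K} \times 4.18\ \text{J/(g}\cdot\text{K)} \approx 75\ \text{J per gram},$$
which is supplied by evaporating only about $75/2400 \approx 3\%$ of the water's mass.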
Millions of tons of water are cooled this way every day in power plants worldwide: the cooling towers drop hottish water from the top of the towers and evaporative heat loss cools the water (the evaporated water escapes as steam clouds).
If your shower has been in operation for a long time, the bathroom's temperature is equal to the shower head's water temperature, and the air is saturated with water vapour, then no cooling of the shower water would take place.
Best Answer
The wet bulb temperature is defined as the temperature of a parcel of air cooled to saturation (100% relative humidity) by the evaporation of water into it, with the latent heat supplied by the parcel. So if the air is supplying the heat to evaporate water in the cooling tower, it can only cool to its wet bulb temperature, but no lower. This wet bulb temperature therefore represents the lowest temperature to which the water in the tower can possibly be cooled.
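If you want to put numbers on that limit, here is a minimal sketch (not part of the answer above) using Stull's 2011 empirical fit, which estimates the wet bulb temperature from the dry bulb temperature and relative humidity and is typically good to within about 1 $^\circ$C for ordinary atmospheric conditions:

```python
import math

def wet_bulb_stull(t_dry_c: float, rh_percent: float) -> float:
    """Approximate wet-bulb temperature (deg C) from dry-bulb temperature (deg C)
    and relative humidity (%), using Stull's (2011) empirical fit.
    Reasonable for roughly -20..50 deg C and RH above about 5%."""
    return (
        t_dry_c * math.atan(0.151977 * math.sqrt(rh_percent + 8.313659))
        + math.atan(t_dry_c + rh_percent)
        - math.atan(rh_percent - 1.676331)
        + 0.00391838 * rh_percent ** 1.5 * math.atan(0.023101 * rh_percent)
        - 4.686035
    )

# Example: hot, fairly dry air entering a cooling tower or evaporative cooler.
# 35 deg C air at 30% RH has a wet-bulb temperature of about 22 deg C,
# which is the coldest the water can possibly get by evaporation alone.
print(f"{wet_bulb_stull(35.0, 30.0):.1f} deg C")
```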