Heat of vaporization is related to the enthalpy change, while the dew point is related to the free-energy change, i.e. enthalpy and entropy combined (G = H − TS). That's why they behave so differently with respect to relative humidity.
The enthalpy of a gas is more-or-less independent of pressure or partial pressure, because gas molecules barely interact with each other. At insanely high pressures there would be some effect on enthalpy, of course, but the effect at everyday pressures is very small. Pressure affects a gas mainly through entropy, not enthalpy.
The enthalpy of a liquid is somewhat dependent on total pressure: a high pressure pushes the molecules closer together and therefore changes their interaction energies. But obviously the enthalpy of the liquid doesn't depend on what the gas partial pressures are; it can only depend on the total pressure the liquid itself experiences.
So the answer is: the heat of vaporization, being related to enthalpy rather than entropy, has essentially no dependence on relative humidity (given a constant total air pressure).
-- UPDATE --
Oops, whenever I wrote "enthalpy" I should have said "enthalpy per molecule" or "enthalpy per mole" ["molar enthalpy"]. You can check for yourself that the enthalpy per molecule of an ideal gas is independent of pressure or partial pressure. For a real-world gas, it's approximately independent. The "per mole" quantities are what matter for dew point etc.
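The "check for yourself" above is a one-liner: for an ideal gas, H = U + PV per mole, and the ideal gas law makes PV = RT, so the pressure cancels. A minimal sketch (assuming a monatomic ideal gas, so U = (3/2)RT per mole; the function name is mine):

```python
R = 8.314  # J/(mol·K), gas constant

def molar_enthalpy_monatomic(T, P):
    """Molar enthalpy of an ideal monatomic gas: H = U + P*V.
    U = (3/2) R T, and by the ideal gas law P*V = R T per mole,
    so the pressure cancels out entirely."""
    U = 1.5 * R * T   # molar internal energy: depends only on T
    V = R * T / P     # molar volume from P V = R T
    return U + P * V  # = (5/2) R T, independent of P

# Same temperature, pressures spanning five orders of magnitude:
for P in [1e2, 1e5, 1e7]:
    print(f"P = {P:>10.0f} Pa  ->  H = {molar_enthalpy_monatomic(300, P):.1f} J/mol")
```

Every line prints the same H ≈ 6235.5 J/mol, which is just (5/2)RT at 300 K.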
The name of the property is itself a clue here: enthalpy of vaporization. By definition, enthalpy takes into account the work required to push back against the atmosphere.
You can see the impact of increasing the pressure on the enthalpy of vaporization on a Mollier diagram. Increasing the pressure has the overall effect of reducing the enthalpy of vaporization, until it becomes zero at the critical point. At this stage, there is no longer a phase change associated with vaporization.
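The trend toward zero at the critical point can be sketched numerically with Watson's empirical correlation. The numbers below are round-number values for water (reference heat of vaporization at the normal boiling point, critical temperature) and the exponent 0.38 is the commonly quoted one; treat this as an illustration, not authoritative steam-table data:

```python
def watson_hvap(T, T_ref=373.15, h_ref=2257.0, T_c=647.1):
    """Watson's empirical correlation for the heat of vaporization,
    here with approximate water values: h_ref (kJ/kg) at T_ref (K),
    and critical temperature T_c (K)."""
    if T >= T_c:
        return 0.0  # no distinct liquid/vapor phases above T_c
    return h_ref * ((T_c - T) / (T_c - T_ref)) ** 0.38

# Heat of vaporization shrinks as T climbs toward the critical point:
for T in [373.15, 450, 550, 620, 645, 647.1]:
    print(f"T = {T:6.2f} K  ->  h_vap ~ {watson_hvap(T):7.1f} kJ/kg")
```

The printed values fall monotonically and hit exactly zero at the critical temperature, matching what the Mollier diagram shows.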
Liquids evaporate at any temperature - not just at their boiling point. This is the reason, for example, why wearing a wet shirt on a windy day makes you so cold: the water evaporates, and in the process "takes some heat with it".
The explanation for this is simple when you think about statistical thermodynamics. You have a lot of molecules whose energies follow a statistical distribution. The fastest of these molecules can escape the forces holding the liquid together; but in the process, they take away "more than the average amount of energy". It's like the smartest person leaving the room: the average of the room just got a little dumber.
Now the rate at which this happens is a strong function of the temperature (and the relative humidity of the air): however, it does happen at all temperatures. Therefore, you can express the average "surplus energy" that each molecule takes with it as a function of temperature - and when you scale this to unit mass, you get the latent heat of evaporation. At any temperature.
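The "smartest person leaves the room" argument is easy to check with a toy Monte Carlo model. Purely for illustration, assume molecular energies follow a simple Boltzmann-like exponential distribution (mean kT) and that only molecules above a fixed escape barrier leave; the function name, barrier value, and units are all made up:

```python
import random

random.seed(0)

def evaporation_step(n=100_000, kT=1.0, barrier=3.0):
    """Toy evaporation model: draw n molecular energies from an
    exponential (Boltzmann-like) distribution with mean kT; molecules
    with energy above `barrier` escape. Returns the mean energy of the
    liquid before, the mean after the escapees leave, and the mean
    energy each escapee carries away (all in units of kT)."""
    energies = [random.expovariate(1.0 / kT) for _ in range(n)]
    escaped = [e for e in energies if e > barrier]
    remaining = [e for e in energies if e <= barrier]
    before = sum(energies) / len(energies)
    after = sum(remaining) / len(remaining)
    carried = sum(escaped) / len(escaped)
    return before, after, carried

before, after, carried = evaporation_step()
print(f"mean energy before escape: {before:.3f} kT")
print(f"mean energy after escape:  {after:.3f} kT   (the liquid cooled)")
print(f"mean energy per escapee:   {carried:.3f} kT  (well above average)")
```

Each escapee carries much more than the average energy, so the molecules left behind are cooler on average: that per-molecule surplus, scaled to unit mass, is exactly the latent heat the answer describes.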
Note this appears to be in contradiction to LDC3's answer. One of us is wrong... I believe heat of vaporization is a perfectly valid concept when you are not at the boiling point.