[Physics] Why do radio waves spread out while higher frequency waves travel in beams?

electromagnetic-radiation, electromagnetism

Why is it that radio waves spread out, their intensity falling off as the square of the distance, while higher-frequency electromagnetic waves (microwaves, infrared, visible light, etc.) can propagate as beams? What fundamental property allows higher-energy waves to travel differently from lower-energy ones?

Best Answer

Due to diffraction, wave effects become more important as the size of the wave source becomes comparable to the wavelength. Visible light has micrometer-scale wavelengths, so a millimeter-sized light source is thousands of wavelengths across and diffraction isn't a big deal. But radio wavelengths can be many meters; producing similar collimation for a radio "beam" would require an emitting antenna hundreds or thousands of kilometers across.
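To put rough numbers on that scaling, here is a small back-of-the-envelope sketch in Python using the standard small-angle diffraction estimate $\theta \sim \lambda/D$; the aperture sizes are illustrative choices, not values from the answer above.

```python
# Rough diffraction-angle estimate: theta ~ lambda / D (radians).
# Aperture sizes below are illustrative, chosen only to show the scaling.

cases = {
    "green light, 1 mm aperture": (550e-9, 1e-3),   # (wavelength m, aperture m)
    "microwave, 1 m dish":        (3e-2,   1.0),
    "FM radio, 10 m antenna":     (3.0,    10.0),
}

for name, (wavelength, aperture) in cases.items():
    theta = wavelength / aperture                    # beam spread in radians
    print(f"{name:28s} theta ~ {theta:.1e} rad (~{theta * 57.3:.3g} deg)")
```

The millimeter aperture gives a spread of a few hundredths of a degree for visible light, while the 10 m antenna spreads an FM-band wave over tens of degrees, which is why the radio emission fills space rather than forming a beam.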

You can use the same logic to think about shadows. A hair that's less than a millimeter across can cast a well-defined shadow in visible light, while radio waves diffract around buildings. However, larger objects can cast well-defined radio shadows: for instance, astrophysical radio sources disappear when they are covered by the Moon or the Sun, which are both very many wavelengths across.

Note that even "collimated" light eventually diverges. Any focusing optical system will produce a beam waist at some finite distance from the final focusing element (mirror or lens or whatever); beyond that beam waist the intensity of the light falls off like $1/r^2$, just as if a point source were at that location. A perfectly collimated beam of light is prohibited by the uncertainty principle, unless the beam is infinitely wide.
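As a concrete illustration of that last point, here is a short Python sketch using the textbook Gaussian-beam relations, with the far-field half-angle $\theta = \lambda/(\pi w_0)$ and the Rayleigh range $z_R = \pi w_0^2/\lambda$; the laser wavelength and waist radius are assumed values for illustration only.

```python
import math

# Gaussian-beam divergence: theta = lambda / (pi * w0) (far-field half-angle).
# The wavelength and waist radius below are assumed, laser-pointer-scale values.
wavelength = 633e-9    # HeNe laser wavelength (m), illustrative
w0 = 0.5e-3            # beam waist radius (m), illustrative

theta = wavelength / (math.pi * w0)              # far-field divergence (rad)
rayleigh_range = math.pi * w0**2 / wavelength    # distance over which the beam stays roughly collimated (m)

print(f"divergence     ~ {theta:.1e} rad")
print(f"Rayleigh range ~ {rayleigh_range:.2f} m")

# Making w0 larger shrinks theta but never to zero: a perfectly parallel beam
# (theta = 0) would require an infinitely wide waist, as noted above.
```

Beyond a few Rayleigh ranges the beam behaves like light from a point source at the waist, which is where the $1/r^2$ intensity falloff takes over.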
