Electromagnetic Radiation – Why Do Electromagnetic Waves Become Weaker with Distance?

electromagnetic-radiation

From what I know, in electromagnetic waves the electric and magnetic field components generate each other: the magnetic field produced by a moving electric charge induces an electric field, which in turn generates another magnetic field, and so on, creating a propagating electromagnetic wave. So each of them gets weaker with distance, which results in a weaker electromagnetic field. Am I correct here?

Best Answer

The weakening or otherwise of EM radiation is purely a matter of geometry.

If you have an isolated point charge, the field lines travel radially outwards from the charge. At a distance $r$ from the charge the field lines are spread out over a sphere of area $4 \pi r^2$, so the field line density, i.e. the field strength, is proportional to $1/r^2$. Hence the inverse square law.
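To make the scaling concrete (a standard consequence of the geometry above, not something extra from the answer): a source radiating total power $P$ isotropically produces an intensity

$$I(r) = \frac{P}{4\pi r^2}, \qquad \text{so} \qquad \frac{I(2r)}{I(r)} = \frac{1}{4},$$

i.e. doubling the distance quarters the intensity.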

On the other hand, if you had a perfectly collimated laser, its brightness would be unchanged no matter how far the beam travelled. A perfectly collimated laser is impossible for various reasons, but if you've been to any rock festival you'll know that laser beams stay bright over large distances (though they may be attenuated by the herbal smoke they travel through). The sketch below contrasts the two cases.
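As a rough illustration (a minimal Python sketch under the idealisations above, not part of the original answer; the power and beam-area values are made-up numbers), here is the inverse-square falloff of an isotropic point source next to an ideal collimated beam whose intensity stays constant:

```python
import math

def point_source_intensity(power_w: float, r_m: float) -> float:
    """Intensity (W/m^2) of an isotropic source: power spread over a sphere of radius r."""
    return power_w / (4.0 * math.pi * r_m ** 2)

def collimated_intensity(power_w: float, beam_area_m2: float) -> float:
    """Intensity of a perfectly collimated beam: the same at every distance."""
    return power_w / beam_area_m2

power = 1.0        # watts (hypothetical source)
beam_area = 1e-6   # m^2, roughly a 1 mm^2 laser spot (hypothetical)

for r in (1.0, 2.0, 10.0, 100.0):
    print(f"r = {r:6.1f} m | point source: {point_source_intensity(power, r):.3e} W/m^2"
          f" | collimated beam: {collimated_intensity(power, beam_area):.3e} W/m^2")
```

Running it shows the point-source intensity dropping by a factor of $10^4$ between 1 m and 100 m, while the collimated value never changes, which is exactly the geometric point being made.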