[Physics] EM waves: How do they travel for billions of km without damping

astronomy, electromagnetic-radiation, stars, visible-light

If a star is 1 billion light years away, it means that the light we see from it was emitted a billion years ago.

How does this light not undergo a frequency change or get damped in spite of the colossal distance travelled? I am asking because the Doppler effect is used to determine stellar distances and is one of the bases for the Big Bang theory.

Also, it travels in a vacuum. According to Maxwell/Gauss, a charged particle is responsible for an electric field whose change creates a magnetic field. In turn, the change of that magnetic field creates an electric field, a process that repeats indefinitely. Even if we assume that the energy produced by a charged particle elsewhere is what sustains the propagation through the vacuum, do the waves not damp significantly?

For the radio waves transmitted for terrestrial communications, the power of the source is directly proportional to the distance travelled by the wave. Does this not contradict the fact that starlight reaches us across such humongous distances?

Best Answer

The light from distant galaxies does undergo a frequency change. It is redshifted, and the amount of redshift is used to work out how fast the distant galaxy is receding and therefore how far away it is. However, this is not a damping effect. The light redshifts because the spacetime between us and the distant galaxy is expanding, so although the light's energy is conserved it is spread over a larger distance.
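For reference, the standard relation (a sketch using the usual symbols, which are not defined in this answer: $z$ for the redshift and $a(t)$ for the cosmological scale factor) is

$$1 + z = \frac{\lambda_{\mathrm{obs}}}{\lambda_{\mathrm{emit}}} = \frac{a(t_{\mathrm{obs}})}{a(t_{\mathrm{emit}})}$$

and for relatively nearby galaxies the recession velocity inferred from $z$ gives the distance through Hubble's law, $v \approx H_0 d$.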

You ask why light isn't damped, but why should it be? Since energy is conserved the light wave can't lose energy unless there is some mechanism to carry the energy away. For a light wave travelling in vacuum there is simply no mechanism by which it can lose energy, so it doesn't.
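One standard way to see this (a textbook result, not derived in this answer): in vacuum Maxwell's equations reduce to the lossless wave equation

$$\nabla^2 \vec{E} = \mu_0 \varepsilon_0 \, \frac{\partial^2 \vec{E}}{\partial t^2},$$

whose plane-wave solutions $\vec{E} = \vec{E}_0 \cos(kz - \omega t)$ keep a constant amplitude $\vec{E}_0$ forever. A damping term only appears when the wave travels through a medium, e.g. the conduction term $\mu_0 \sigma \, \partial\vec{E}/\partial t$ in a conductor; in vacuum $\sigma = 0$ and there is nothing to absorb the energy.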

In your last paragraph I think you're getting mixed up with a different effect. When a terrestrial radio station broadcasts, it sends the radio waves out as a hemisphere (the half above the ground). As you get further away from the transmitter, the intensity (power per unit area) of the radio waves decreases as the inverse square of the distance, because the energy of the transmitted wave is spread over a larger area; the field amplitude itself falls off as the inverse of the distance. However, the total energy of the wave is conserved. The same thing happens with distant galaxies, which is why the more distant a galaxy is the fainter it appears. This isn't damping; it's just the inverse-square dependence of the intensity.
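To make the geometry explicit (a quick sketch, assuming an idealized source radiating a total power $P$ uniformly): at distance $r$ the power is spread over a hemisphere of area $2\pi r^2$ for a ground-based transmitter, or a full sphere of area $4\pi r^2$ for a star, so the intensity is

$$I(r) = \frac{P}{2\pi r^2} \ \text{(hemisphere)}, \qquad I(r) = \frac{P}{4\pi r^2} \ \text{(full sphere)}.$$

Since $I \propto E^2$, the field amplitude falls off as $E \propto 1/r$. The total power passing through the whole sphere is still $P$ at every $r$: nothing is lost, it is only diluted.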