[Physics] Attenuation of radio frequency transmissions in space

astrophysics, electromagnetic-radiation, radio, space

In a science fiction movie, some aliens come to Earth and, when asked why, explain that they received our radio transmission and came to investigate, the transmission in question being the first television broadcast (the Berlin Olympiad in 1936). The idea is that it took decades for the signal to reach them.

I am wondering how attenuated such a signal would be after traveling for 50 years in space, assuming the transmitter power was about 100,000 watts.

Best Answer

What you are looking for is called free-space path loss.

Let's assume the signal is broadcast isotropically (i.e. it isn't sent with a directional antenna). Neglecting all other sources of loss (e.g. diffraction, reflection, absorption), the transmitted power simply spreads out over a sphere, so the power flux density at a distance $r$ falls off as $\frac{P_t}{4\pi r^2}$.
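As a rough worked example (the 100 kW transmitter and the 50-year travel time from the question are the assumed inputs; 50 light-years is about $4.7\times10^{17}$ m), the flux density at that distance would be roughly

$$ S = \frac{P_t}{4\pi r^2} \approx \frac{10^5\ \text{W}}{4\pi\,(4.7\times10^{17}\ \text{m})^2} \approx 3.6\times10^{-32}\ \text{W/m}^2. $$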

The signal must also be received by an antenna, so we have to take into account the receiver's effective aperture (effective area). This describes the ability of an antenna to collect power from a passing wave; for an isotropic antenna it depends on wavelength and is given by $A_{eff} = \frac{\lambda^2}{4\pi}$. A wonderful derivation can be found elsewhere on this site.
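For a sense of scale, if we assume a carrier wavelength of about 6 m (roughly 50 MHz, a round-number stand-in for an early VHF television carrier), an isotropic receiving antenna would have

$$ A_{eff} = \frac{\lambda^2}{4\pi} \approx \frac{(6\ \text{m})^2}{4\pi} \approx 2.9\ \text{m}^2. $$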

Putting these together, the received power is $P_r = \frac{P_t}{4\pi r^2}\,A_{eff} = P_t\left(\frac{\lambda}{4\pi r}\right)^2$, so the ratio of transmitted to received power is the free-space path loss (FSPL):

$$ FSPL = \left(\dfrac{4\pi r}{\lambda}\right)^2 $$

where $r$ is the distance between transmitter and receiver and $\lambda$ is the wavelength of the signal.
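Plugging in the same assumed numbers ($r \approx 4.7\times10^{17}$ m, $\lambda \approx 6$ m), the path loss is

$$ FSPL = \left(\frac{4\pi \cdot 4.7\times10^{17}\ \text{m}}{6\ \text{m}}\right)^2 \approx 10^{36}, $$

so a 100 kW isotropic transmission would deliver only about $P_r = P_t/FSPL \approx 10^{-31}$ W to an isotropic receiver, consistent with multiplying the flux density above by the effective aperture, and far below the thermal noise floor of any realistic receiver.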

Useful resource: http://www.radio-electronics.com/info/propagation/path-loss/free-space-formula-equation.php
