[Physics] Distance of electromagnetic waves based on frequency

electromagnetic-radiation, electromagnetism, microwaves, waves

Based on the relation between frequency and wavelength, the higher the frequency, the shorter the wavelength, and therefore the shorter the distance travelled. Assume two frequencies from the ISM band: 2.45 GHz and 5.8 GHz. With the same amplitude and antenna gain, the 2.45 GHz signal will travel further than the 5.8 GHz signal based on their wavelengths.
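(One way to make this premise concrete — the equation below is the standard Friis free-space formula, not part of the original question — is that, for fixed antenna gains, the received power falls with wavelength squared:

$$P_r = P_t\, G_t\, G_r \left(\frac{\lambda}{4\pi d}\right)^{2}, \qquad \lambda = \frac{c}{f},$$

so doubling the frequency costs a factor of four, about 6 dB, in received power at the same distance.)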

However, in this link under the subtopic of frequency, it is mentioned that

In general, the higher the frequency, the greater is the range. RF radiation has more energy at the higher frequencies.

In this case, the range depends on both the transmitter power and the operating frequency. Does this suggest that higher frequencies naturally have higher amplitudes, which may cause them to travel further, or that higher frequencies need to be supplied with sufficient transmit power to improve their range relative to lower frequencies?

Best Answer

In general, the higher the frequency, the greater is the range. RF radiation has more energy at the higher frequencies.

That's not correct as stated. Yes, higher-frequency photons have more energy than lower-frequency photons, but your transmitter isn't counting photons. You can have a $1\,\mathrm{W}$ transmitter at low or high frequencies; the high-frequency transmitter will produce fewer photons per second but radiate the same total energy.
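As a quick sanity check (numbers added here for illustration, using $E = hf$ with $h \approx 6.63\times10^{-34}\,\mathrm{J\,s}$), a $1\,\mathrm{W}$ transmitter emits $P/(hf)$ photons per second:

$$\frac{P}{hf} \approx \begin{cases} 6.2\times10^{23}\ \mathrm{s^{-1}} & \text{at } 2.45\ \mathrm{GHz},\\ 2.6\times10^{23}\ \mathrm{s^{-1}} & \text{at } 5.8\ \mathrm{GHz}, \end{cases}$$

fewer photons at the higher frequency, each carrying proportionally more energy, so the radiated power is identical.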

The range of radio signals depends on many things, including the equipment (antenna, filters, etc.) and the environment (scattering, absorption). While the statement that higher frequencies have greater range may be true in the context of commonly available RFID systems, it is not true in general.
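To make "not true in general" concrete, here is a minimal sketch (the function name and the 100 m distance are illustrative assumptions, not from the answer) comparing free-space path loss at the two ISM frequencies using the Friis formula:

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss (Friis) in dB between isotropic antennas."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

# Compare the two ISM frequencies from the question at a fixed 100 m.
for f_hz in (2.45e9, 5.8e9):
    print(f"{f_hz / 1e9:.2f} GHz: {fspl_db(100.0, f_hz):.1f} dB")

# Output:
# 2.45 GHz: 80.2 dB
# 5.80 GHz: 87.7 dB   (~7.5 dB more loss, all else being equal)
```

At any fixed distance the 5.8 GHz link loses roughly 7.5 dB more than the 2.45 GHz link; whether that translates into shorter real-world range depends on the antenna gains, transmit power, and receiver sensitivity the system pairs with it.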
