[Physics] Loss of Power at high frequencies

Tags: power, signal-processing

One of my work colleagues told me that a cable he is sending a signal through is losing power at high frequencies, so he recommends the signal be amplified before being sent. The explanation given for the power loss is that higher-frequency signals are lossier.

As a newcomer to signal processing, I'd like to understand more about why or how this effect occurs. How is the frequency of the signal causing the power loss to occur?

From Wikipedia's article on Coaxial cable I found this, which seems promising:

> If an ordinary wire is used to carry high frequency currents, the wire acts as an antenna, and the high frequency currents radiate off the wire as radio waves, causing power losses.

Is understanding how antennae work key to understanding why the high frequency results in power loss?

Best Answer

Losses in coaxial cable are mainly resistive. At low frequencies the current uses the full cross-section of the conductors, so the resistance is low. As the frequency increases, the signal is unable to penetrate as deeply into the conductor, and the current is confined to a thin layer at the surface. This is called the skin effect.
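To make this quantitative: for a good conductor, the skin depth (the depth at which the current density falls to $1/e$ of its surface value) is

$$\delta = \sqrt{\frac{2\rho}{\omega\mu}} = \sqrt{\frac{\rho}{\pi f \mu}}$$

where $\rho$ is the conductor's resistivity, $\mu$ its permeability, and $f$ the frequency. For copper ($\rho \approx 1.7\times10^{-8}\,\Omega\cdot\mathrm{m}$), $\delta$ is roughly 65 µm at 1 MHz but only about 6.5 µm at 100 MHz.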

So as the frequency increases, the effective cross-section of metal carrying the signal shrinks. The result is a higher series resistance and hence higher losses: the skin depth scales as $1/\sqrt{f}$, so the resistive loss grows roughly as $\sqrt{f}$.
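A small numerical sketch of this scaling, assuming a copper conductor and the thin-shell approximation for a round wire (the function names and the 0.5 mm radius are illustrative choices, not from the original post):

```python
import math

def skin_depth(freq_hz, resistivity=1.68e-8, mu_r=1.0):
    """Skin depth delta = sqrt(rho / (pi * f * mu)) in metres.

    Defaults assume copper (rho ~ 1.68e-8 ohm*m) and a
    non-magnetic conductor (mu_r = 1).
    """
    mu = mu_r * 4e-7 * math.pi  # permeability mu = mu_r * mu_0
    return math.sqrt(resistivity / (math.pi * freq_hz * mu))

def ac_resistance_per_metre(freq_hz, radius_m, resistivity=1.68e-8):
    """Approximate AC resistance of a round wire when delta << radius:
    the current flows in a surface shell of thickness delta, so the
    effective cross-section is roughly 2*pi*radius*delta.
    """
    delta = skin_depth(freq_hz, resistivity)
    return resistivity / (2 * math.pi * radius_m * delta)

# Quadrupling the frequency halves the skin depth, so the
# resistance doubles: R grows like sqrt(f).
for f in (1e6, 4e6, 100e6):
    print(f"{f/1e6:>6.0f} MHz: delta = {skin_depth(f)*1e6:6.2f} um, "
          f"R = {ac_resistance_per_metre(f, 0.5e-3)*1000:7.2f} mOhm/m")
```

Running this shows the skin depth falling and the per-metre resistance rising as the frequency goes up, which is exactly the loss your colleague is describing.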
