[Physics] need to adjust TV antenna on the roof for better video quality

radio frequency

I find it annoying and a pain to climb up to the roof and adjust the TV antenna so that I can watch my favorite TV program without distortion. I am not using a satellite service, which requires a dish to capture the signal. I know the antenna is designed to receive radio-wave signals from the nearest broadcast tower and that my TV is tuned to a specific frequency. My question is: why does the orientation of the TV antenna in situ affect the picture quality?

stock photo of a UHF antenna

Image taken from Wirelesshack; note that mine doesn't look exactly like this, but it's close.

Best Answer

Picture the radio waves from the TV transmitter as flat, horizontally polarized sine waves. You want the horizontal bars (the elements) of the antenna to lie parallel to that wave so they pick up its full strength, and because a multi-element antenna like this is directional, you also need to point it as directly as possible towards the TV transmitter.

Dipole animation from Wikipedia.
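As a rough, back-of-the-envelope illustration (my own addition, not part of the original answer): for an ideal linear element, the received power falls off as the square of the cosine of the angle between the element and the wave's electric field. A short Python sketch of that mismatch loss:

```python
import math

def mismatch_loss_db(angle_deg: float) -> float:
    """Polarization mismatch loss for a linear element rotated
    angle_deg away from the incoming wave's electric field.
    Received power scales as cos^2(angle)."""
    c = math.cos(math.radians(angle_deg))
    if abs(c) < 1e-12:
        return float("inf")  # element perpendicular to the field: ideally no signal at all
    return -10.0 * math.log10(c * c)

# Small misalignments cost very little; the loss blows up near 90 degrees.
for angle in (0, 15, 30, 45, 60, 80, 90):
    print(f"{angle:>2} deg off-axis -> {mismatch_loss_db(angle):6.2f} dB loss")
```

Pointing the boom at the transmitter is a separate effect (the antenna's directional gain pattern), but the qualitative behaviour is loosely similar: small aiming errors cost little, large ones cost a lot.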

Cell phones send the signal as a vertically polarized wave (apologies for the oversimplification). This doesn't let you pick up a clear signal from as far away, but it does mean you can simply hold the antenna vertically and receive a signal from any direction. Cell phones also contain a lot of modern signal-processing electronics to handle a weak signal, which weren't available when TV was designed.
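To make that range-versus-orientation trade-off concrete (again my own sketch, using the common rule of thumb that directivity is roughly 41253 divided by the product of the two half-power beamwidths in degrees, and purely hypothetical beamwidth numbers): a narrow-beam rooftop antenna has far more gain than one that must cover every direction, which is exactly why it has to be aimed.

```python
import math

def approx_gain_dbi(az_beamwidth_deg: float, el_beamwidth_deg: float) -> float:
    """Kraus rule of thumb: directivity ~= 41253 / (HPBW_az * HPBW_el),
    with beamwidths in degrees. Returned in dBi."""
    return 10.0 * math.log10(41253.0 / (az_beamwidth_deg * el_beamwidth_deg))

# Hypothetical numbers for illustration only:
# a rooftop directional antenna with a ~40x40 degree beam vs.
# a hand-held vertical whip covering all 360 degrees of azimuth.
print(f"directional antenna: {approx_gain_dbi(40, 40):.1f} dBi")
print(f"vertical whip:       {approx_gain_dbi(360, 80):.1f} dBi")
```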

P.S. You presumably only need to adjust it if you want to get a signal from a different transmitter, or perhaps if the signal is really weak and atmospheric conditions slightly change the direction the strongest signal comes from.