[Physics] How do electromagnetic waves travel away from an antenna

antennas · electromagnetic-radiation · electromagnetism

I have heard that in an antenna the electrons move back and forth and create an electric field which varies with time. This varying electric field in turn, creates a varying magnetic field. These two varying fields together create an electromagnetic wave.

How do these waves travel out from the antenna?

If the electrons are moving back and forth vertically, wouldn't there only be an electromagnetic wave near the antenna? What makes them spread out or expand?

Best Answer

At its simplest, an antenna is a conducting rod (here taken horizontal) with an alternating current applied to it. For simplicity, let the current vary harmonically: $I \sim I_0 e^{i\omega t}$.

From Ampère's law we know that this current generates a harmonically varying magnetic field, $B \sim B_0 e^{i\omega t}$. By Faraday's law, $\nabla \times E = -\frac{\partial B}{\partial t}$, the varying magnetic field generates an alternating electric field $E \sim E_0 e^{i\omega t}$, which in turn sustains the magnetic field through the Ampère–Maxwell law (in vacuum): $\nabla \times B = \frac{1}{c^2}\frac{\partial E}{\partial t}$.

From Maxwell's equations: $$ \nabla \times ( \nabla \times E) + \frac{1}{c^2}\frac{\partial^2 E}{\partial t^2} = 0 $$
This is a wave equation; its solutions tell you how the EM waves propagate away from the antenna. When your distance from the antenna is much smaller than the antenna's length, the wavefronts are roughly cylindrical: the wave propagates outward perpendicular to the antenna. But as you move farther and farther away, the antenna looks more and more like a point source, and the wavefronts become radially symmetric (spherical), which is exactly the "spreading out" the question asks about.
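The far-field spreading can be illustrated numerically. The snippet below uses the standard textbook result for a short oscillating dipole, whose radiation-zone field amplitude falls off as $\sin\theta / r$; the overall constants are absorbed into `E0`, and the function name is just for this illustration:

```python
import math

def dipole_far_field(r, theta, E0=1.0):
    """Relative far-field amplitude of a short dipole along z,
    at distance r and polar angle theta (standard sin(theta)/r law)."""
    return E0 * math.sin(theta) / r

# Broadside to the antenna (theta = pi/2), radiation is strongest.
a_near = dipole_far_field(10.0, math.pi / 2)
a_far = dipole_far_field(20.0, math.pi / 2)

# Doubling the distance halves the amplitude: spherical spreading.
print(a_near / a_far)

# Along the antenna's own axis (theta = 0) a dipole does not radiate.
print(dipole_far_field(10.0, 0.0))
```

The $1/r$ amplitude falloff (hence $1/r^2$ in intensity) is what spreads the wave's energy over ever-larger spheres far from the antenna, and the $\sin\theta$ factor is why a vertical dipole radiates sideways rather than straight up.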
