Electromagnetism – How Does a Falling Electron Behave?

Tags: electromagnetism, gravity

Suppose there are only two objects in the universe: the Earth, with a gravitational acceleration of g = 9.8 m/s², and a typical electron.

The electron is dropped from a certain height, say 1000 m above the Earth's surface.

The initial energy of the electron is purely potential energy, $mgh = m_e g \times 1000$, where $m_e$ is the mass of the electron.

As the electron falls towards the Earth, it will be accelerated and thus will radiate energy. Will this cause the electron to slow down, so that it takes longer to hit the ground than the time predicted by $s = \frac{1}{2}at^2$, due to energy lost through radiation?

If so, what acceleration will the electron actually fall at? How long will it take to hit the ground?

Best Answer

Assuming non-relativistic velocities, the power radiated by a charge accelerating at constant acceleration $a$ is given by the Larmor formula:

$$P = \frac{e^2 a^2}{6\pi \epsilon_0 c^3} $$
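For a sense of scale, the Larmor power for a charge accelerating at $a = g$ can be evaluated numerically. A minimal sketch in Python (the CODATA constant values are assumptions supplied here, not given in the text):

```python
import math

# Physical constants in SI units (CODATA values, assumed)
e = 1.602176634e-19      # elementary charge, C
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
c = 2.99792458e8         # speed of light, m/s
g = 9.8                  # gravitational acceleration, m/s^2

def larmor_power(a):
    """Power radiated by a point charge e with acceleration a (non-relativistic)."""
    return e**2 * a**2 / (6 * math.pi * eps0 * c**3)

P = larmor_power(g)
print(f"P = {P:.2e} W")  # ~5.5e-52 W
```

Even before comparing energies, a radiated power of order $10^{-52}$ W suggests the effect will be tiny.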

Doing the calculation properly is surprisingly complicated, but it's easy to show that the effect of the radiation on the electron's fall is negligible. If the electron falls a distance $h$, the time it takes is given by:

$$ h = \frac{1}{2}gt^2 $$

so:

$$ t = \sqrt{\frac{2h}{g}} $$

If we assume the electron accelerates at a constant rate $g$, the total energy radiated is just power times time, or:

$$ E_{rad} = \frac{e^2 g^2}{6\pi \epsilon_0 c^3} \sqrt{\frac{2h}{g}} $$

In your question $h$ is 1000 m, so:

$$ E_{rad} = 7.83 \times 10^{-51} J $$

The potential energy change is, as you say, just $mgh$:

$$ E_{pot} = m_e g h = 8.94 \times 10^{-27} J $$

So the ratio of the radiated energy to the potential energy is about $10^{-24}$, and therefore the effect of the radiation on the electron's fall is entirely negligible.
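The whole comparison can be reproduced in a few lines of Python (a sketch; the CODATA constant values are assumptions not stated in the text):

```python
import math

# Physical constants in SI units (CODATA values, assumed)
e = 1.602176634e-19      # elementary charge, C
m_e = 9.1093837015e-31   # electron mass, kg
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
c = 2.99792458e8         # speed of light, m/s
g = 9.8                  # gravitational acceleration, m/s^2
h = 1000.0               # drop height, m

t = math.sqrt(2 * h / g)                       # fall time, ~14.3 s
P = e**2 * g**2 / (6 * math.pi * eps0 * c**3)  # Larmor power at a = g
E_rad = P * t                                  # total energy radiated
E_pot = m_e * g * h                            # potential energy change

print(f"E_rad = {E_rad:.2e} J")         # ~7.8e-51 J
print(f"E_pot = {E_pot:.2e} J")         # ~8.9e-27 J
print(f"ratio = {E_rad / E_pot:.1e}")   # ~9e-25
```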

Response to comment:

The power radiated by the electron produces a force that opposes the acceleration due to gravity. Assuming we can ignore deviations from the constant acceleration $g$, in a small time $dt$ the energy radiated is $P\,dt$. Energy is force times distance ($dx$), so to get the force we divide by the distance:

$$ F = P\frac{dt}{dx} = \frac{P}{v} = \frac{P}{\sqrt{2gh}} $$

using $v^2 = 2gh$ for the speed after falling a distance $h$. The acceleration produced by this force is just $F/m_e$, so the net acceleration of the electron is:

$$ a_{net} = g - \frac{P}{m_e \sqrt{2gh}} $$

So the electron does accelerate slightly more slowly than $g$, but the difference between the acceleration and $g$ is inversely proportional to the square root of the distance fallen, so it becomes increasingly negligible the further the electron falls.

You've probably spotted that the above equation says the force should be infinite at the moment you release the particle. That's because, as you approach the moment of release, it's no longer safe to ignore the deviation of the acceleration from $g$ caused by the radiation.
