[Physics] Effect of Change of Potential Difference applied to an X-Ray tube

electromagnetic-radiation, intensity, potential, wavelength, x-rays

I am a high school student, so I know only the basics of X-rays: continuous X-rays, the cutoff wavelength, and the threshold wavelength.
If I increase the potential difference applied to the X-ray tube, I am certain that the minimum wavelength of the emitted radiation decreases, but I am unsure about the intensity. How does that change?

Best Answer

An X-ray tube makes X-rays by Bremsstrahlung (rapid deceleration of the electrons), but also by exciting the atoms of the anode. Just as an electric discharge makes a red glow in neon gas, the spectral lines of the anode metal are prominent in the X-rays produced.

So, to make higher-energy X-rays, an effective approach is to change the anode material to one with a higher atomic number: the more charge in the nucleus, the more tightly bound the inner electrons, and thus the higher the possible fluorescence energy.
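
As a rough illustration of that scaling, Moseley's law approximates the K-alpha line energy as $\tfrac{3}{4}\times 13.6\ \mathrm{eV}\times(Z-1)^2$. The sketch below is mine, not part of the original answer, and the approximation degrades for heavy elements where relativistic corrections matter:

```python
# Moseley's law: E(K-alpha) ~ (3/4) * 13.6 eV * (Z - 1)^2  (rough approximation)
RYDBERG_EV = 13.6  # hydrogen ground-state binding energy, eV

def kalpha_energy_kev(z: int) -> float:
    """Approximate K-alpha fluorescence energy for atomic number z, in keV."""
    # factor 3/4 comes from the n=2 -> n=1 transition
    return 0.75 * RYDBERG_EV * (z - 1) ** 2 / 1000.0

for name, z in [("Cu", 29), ("Mo", 42), ("W", 74)]:
    print(f"{name} (Z={z}): ~{kalpha_energy_kev(z):.1f} keV")
# Cu (~8.0 keV) and Mo (~17.1 keV) agree well with measured values;
# W (~54 keV vs. a measured ~59 keV) shows the approximation degrading.
```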

To stand up to high currents, an anode is usually a good heat conductor (copper) or a metal resistant to high temperature (molybdenum, tungsten). Copper's highest-energy fluorescence is at about 8.98 keV, molybdenum's at about 20 keV, and tungsten's at about 69 keV.

So if you want intense fluorescence, you need to apply a voltage higher (in kV) than the fluorescence energy (in keV) to excite that intense X-ray emission, or apply very high voltages and currents and hope the Bremsstrahlung (continuous spectrum) is intense enough (and the anode doesn't melt). Raising the tube voltage at fixed current also raises the total Bremsstrahlung output, roughly in proportion to the square of the voltage (Kramers' rule), so a higher potential does mean a more intense beam.
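
A minimal sketch of both effects, assuming the threshold energies quoted above and the rough Kramers scaling (integrated continuum intensity proportional to current times atomic number times voltage squared, with the overall constant dropped); the function names are my own illustrative choices:

```python
# Which anode K lines a given tube voltage can excite, plus the rough
# Kramers scaling of total Bremsstrahlung output (relative units).
K_THRESHOLD_KEV = {"Cu": 8.98, "Mo": 20.0, "W": 69.0}  # energies from the text
Z = {"Cu": 29, "Mo": 42, "W": 74}

def can_excite_k_lines(anode: str, voltage_kv: float) -> bool:
    """True if each electron carries enough energy to excite K fluorescence."""
    return voltage_kv > K_THRESHOLD_KEV[anode]

def relative_bremsstrahlung(anode: str, voltage_kv: float, current_ma: float) -> float:
    """Kramers' rule: integrated continuum intensity ~ i * Z * V^2 (arbitrary units)."""
    return current_ma * Z[anode] * voltage_kv ** 2

for v in (30.0, 100.0):
    for anode in ("Cu", "Mo", "W"):
        print(f"{anode} at {v:.0f} kV: K lines excited = {can_excite_k_lines(anode, v)}, "
              f"continuum ~ {relative_bremsstrahlung(anode, v, current_ma=10.0):.2e}")
```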

Because the Bremsstrahlung includes a lot of low-energy radiation, it is usual for the X-ray tube window to act as a filter that removes the unwanted low-energy radiation. Neither Bremsstrahlung nor fluorescence photons can carry more energy than a single electron gains from the accelerating voltage.
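
To see why a window preferentially removes the soft end of the spectrum, one can combine the Beer-Lambert attenuation law with the rough photoelectric scaling $\mu \propto E^{-3}$. The numbers below are purely illustrative placeholders, not data for any real window material:

```python
import math

# Beer-Lambert transmission through a filter window: T = exp(-mu(E) * t).
# Photoelectric absorption falls steeply with photon energy, roughly
# mu(E) ~ E^-3, so low-energy photons are absorbed far more strongly.
MU_REF = 5.0       # illustrative attenuation coefficient at E_REF, 1/mm (not real data)
E_REF_KEV = 10.0
THICKNESS_MM = 1.0

def transmission(energy_kev: float) -> float:
    mu = MU_REF * (E_REF_KEV / energy_kev) ** 3
    return math.exp(-mu * THICKNESS_MM)

for e in (5.0, 10.0, 20.0, 50.0):
    print(f"{e:>5.1f} keV: transmission = {transmission(e):.3f}")
# Photons below ~10 keV are almost entirely absorbed, while 50 keV photons
# pass nearly untouched: the window "hardens" the beam.
```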

To convert from energy $E$ (in keV) to frequency $F$ (in Hz), use Planck's constant $h \approx 4.1\times 10^{-18}\ \mathrm{keV\,s}$: $$F = E/h$$
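
A quick sketch of that conversion, which also gives the corresponding wavelength via $\lambda = c/F$ and so reproduces the cutoff-wavelength behaviour from the question (raising the voltage lowers the minimum wavelength):

```python
# Convert X-ray photon energy (keV) to frequency (Hz) and wavelength (m).
H_KEV_S = 4.136e-18   # Planck's constant in keV*s
C_M_S = 2.998e8       # speed of light in m/s

def frequency_hz(energy_kev: float) -> float:
    return energy_kev / H_KEV_S               # F = E / h

def wavelength_m(energy_kev: float) -> float:
    return C_M_S / frequency_hz(energy_kev)   # lambda = c / F = h c / E

for e in (8.98, 20.0, 69.0):  # the fluorescence energies quoted above
    print(f"{e:5.2f} keV -> F = {frequency_hz(e):.2e} Hz, "
          f"lambda = {wavelength_m(e) * 1e12:.1f} pm")
# A tube run at V kilovolts emits no photon above V keV, so the shortest
# emitted wavelength is h*c/(e*V): increasing V decreases it.
```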