[Physics] Refractive index variation with wavelength of light

electromagnetic-radiation, optics, refraction, wavelength

In my book it is written that

$$\text{refractive index} = \frac{c}{v} = \sqrt{\text{relative permeability} \times \text{relative permittivity}}$$

The dielectric constant of a medium depends on the frequency of the field. For example, for DC fields the dielectric constant of water is 80.4, while in the frequency range of visible light it is only about 1.8. This is because in rapidly oscillating fields the dipoles fail to keep up with the oscillating field.
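Plugging those numbers in (and taking water to be non-magnetic, so the relative permeability is about 1) makes the contrast explicit:

$$n_{\text{dc}} \approx \sqrt{80.4} \approx 9.0, \qquad n_{\text{visible}} \approx \sqrt{1.8} \approx 1.34,$$

and the second value is close to the measured refractive index of water, about 1.33.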

According to this, the refractive index should increase with increasing wavelength.

But also

$$\text{refractive index} = a + \frac{b}{\lambda^2},$$

which decreases with increasing wavelength. How is this possible?

Best Answer

$a + b / \lambda^2$ is just an approximation. It’s typically used in a limited range, e.g. across visible light frequencies.
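As a rough numerical illustration, here is a minimal sketch of that formula across the visible range; the coefficients $a \approx 1.5046$ and $b \approx 4.2\times10^{3}\ \mathrm{nm^2}$ are assumed values, roughly typical of a crown glass, not taken from the question:

```python
# Evaluate Cauchy-type dispersion n(lambda) = a + b / lambda^2 across the visible range.
# Coefficients are illustrative values for a generic crown glass (assumption).
a = 1.5046   # dimensionless
b = 4.2e3    # nm^2

for wavelength_nm in (400, 500, 600, 700):
    n = a + b / wavelength_nm**2
    print(f"lambda = {wavelength_nm} nm  ->  n = {n:.4f}")
```

With these numbers $n$ falls from about 1.53 at 400 nm to about 1.51 at 700 nm, the usual normal dispersion across the visible.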

The approximation breaks down outside that range: X-rays don’t refract strongly in glass.

Physically, the $1/\lambda^2$ term, perhaps better seen as an $\omega^2$ term, is due to the dipoles “falling behind” the incident wave. At higher frequencies, that lag matters more. At high enough frequencies, the dipoles start to behave in other ways, and the approximation breaks down.

More formally, the response of a material has poles and zeros, and $a + b/\lambda^2$ is an approximation to the behaviour as you approach one of those poles from the low-frequency side.
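To connect the two pictures, here is a minimal single-resonance Lorentz-oscillator sketch (all parameters are invented for illustration, not fitted to any real material): below the resonance the dipole lag makes $n$ rise with frequency, which is exactly the Cauchy regime; near the pole the simple formula stops applying.

```python
# Single-resonance Lorentz-oscillator sketch:
#   n(omega)^2 = 1 + wp^2 / (w0^2 - omega^2 - i*gamma*omega)
# Below the resonance w0, n rises with frequency (normal dispersion, Cauchy regime);
# near and above w0 the a + b/lambda^2 form no longer applies.
import numpy as np

w0 = 2.0e16      # resonance angular frequency, rad/s (assumed, UV-ish)
wp = 1.5e16      # oscillator-strength parameter, rad/s (assumed)
gamma = 1.0e14   # damping rate, rad/s (assumed)

for w in np.linspace(2.5e15, 5.0e15, 4):       # roughly visible-range angular frequencies
    eps = 1 + wp**2 / (w0**2 - w**2 - 1j * gamma * w)
    n = np.sqrt(eps).real                      # real part of the complex refractive index
    lam_nm = 2 * np.pi * 3.0e8 / w * 1e9       # corresponding vacuum wavelength in nm
    print(f"lambda ~ {lam_nm:6.0f} nm  ->  n ~ {n:.4f}")
```

In this toy model $n$ grows as the frequency climbs toward the pole at $\omega_0$ (i.e. as the wavelength shrinks), which is what the $b/\lambda^2$ term captures in its limited range of validity.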