[Physics] Laser beam divergence

laser optics

How can I calculate the divergence of a laser beam due to diffraction? I want to derive the following relation:

$$
\Delta\Omega = \frac{\lambda^2}{A},
$$

where $\lambda$ is the wavelength and $A$ is the beam area.
(Figure: the laser in a cavity.)

Best Answer

I will assume that you are asking about laser beams in the fundamental, diffraction-limited Gaussian mode. The standard expression for the far-field divergence half-angle of a Gaussian beam is (see the Wikipedia page on Gaussian beams)
$$
\theta = \frac{\lambda}{\pi\omega_0},
$$
where $\omega_0$ is the waist radius of the Gaussian beam. From here you can calculate the solid angle subtended by the beam, which in the small-$\theta$ limit is
$$
\Delta\Omega \simeq \pi\theta^2 = \frac{\lambda^2}{\pi\omega_0^2} = \frac{\lambda^2}{A},
$$
where $A = \pi\omega_0^2$ is the area of the beam's waist.
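
As a quick numerical sanity check, here is a minimal Python sketch. The wavelength and waist radius are illustrative values I have assumed (roughly HeNe-like), not numbers from the question; it simply evaluates $\theta = \lambda/(\pi\omega_0)$ and confirms that $\pi\theta^2$ equals $\lambda^2/A$:

```python
import math

# Assumed illustrative parameters (not from the question): HeNe-like beam.
wavelength = 633e-9   # wavelength lambda in metres
w0 = 0.5e-3           # waist radius omega_0 in metres

# Far-field divergence half-angle of the fundamental Gaussian mode.
theta = wavelength / (math.pi * w0)

# Solid angle in the small-angle limit, and the waist area A = pi * w0^2.
solid_angle = math.pi * theta**2
area = math.pi * w0**2

print(f"theta          = {theta:.3e} rad")
print(f"pi * theta^2   = {solid_angle:.3e} sr")
print(f"lambda^2 / A   = {wavelength**2 / area:.3e} sr")
```

Running this prints the same value (about $5\times10^{-7}$ sr) for both $\pi\theta^2$ and $\lambda^2/A$, which is the relation you were after.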

If you are looking for a derivation that starts at a more fundamental level than that, then you should pick up any textbook on lasers: essentially any of them covers the derivation of the Gaussian modes of a laser beam from the wave (Helmholtz) equation. You can also look at section 2.1 of my thesis, where I sketch out the derivation from the Helmholtz equation, although I stop slightly short of deriving the divergence angle.
