[Physics] How well can a laser be focused over interstellar range

laser, optics

I've tried to read up on diffraction limits and Gaussian beams, but the treatment always ends with the beam being characterized by an angle: the edge of a cross-section of a cone. That says nothing about the physical limits on how I can choose that angle. It just describes the beam's shape (once far away from the narrowest point) as a cone: the width is proportional to the distance from the narrowest spot.

So what's to prevent you from specifying a laser that's arbitrarily tight at some arbitrarily large distance, and concluding that the required angle is very close to 0?

What, in principle, prevents this from happening?

In Robert L. Forward's "hard science fiction" novel, he based the lightsail description on actual research but did not go into detail in the novel. As a plot point, the senders had to enlarge a focusing device (probably a zone plate) to send the beam used for braking: the larger sending aperture was necessary for the longer distance. They didn't finish in time due to funding, but saved the mission by doubling the light frequency instead. So that sounds like a diffraction effect.

I suppose the relationship governing what is possible simply scales with wavelength, and once you divide that out there is some relationship between the possible size of the emitter, the size of the target, and the separation between them? Why does making the emitter larger allow the target to be smaller?

To use some concrete numbers, suppose the target is a lightsail 1 light year away, and it is 1 Mm in diameter. The wavelength in Forward's story was green light, and if higher frequency allows better focus, then the best beam would use the highest frequency that doesn't start causing issues by breaking bonds in the material reflecting it, so just past visible, where UV begins. What size emitter (final focusing device) would be needed?

Best Answer

So what's to prevent you from specifying a laser that's arbitrarily tight at some arbitrarily large distance, and concluding that the required angle is very close to 0?

The half angle of divergence is given by

$$\theta = \frac{\lambda}{\pi w_0}$$

where $w_0$ is the beam radius at its narrowest point (the waist, or focal point), and $\lambda$ is the optical wavelength.
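As a quick sanity check on the formula (a minimal sketch; the 500 nm wavelength and 1 mm waist are illustrative values, not from the question):

```python
import math

def divergence_half_angle(wavelength_m: float, waist_radius_m: float) -> float:
    """Far-field half-angle of an ideal Gaussian beam: theta = lambda / (pi * w0)."""
    return wavelength_m / (math.pi * waist_radius_m)

# Illustrative values: a 500 nm (green) beam with a 1 mm waist radius,
# roughly laser-pointer scale.
theta = divergence_half_angle(500e-9, 1e-3)
print(f"half-angle: {theta:.2e} rad")  # ~1.59e-4 rad, i.e. ~0.16 mrad
```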

Typically with a laser the waist point is at the output aperture of the laser cavity, and the beam diverges from there. If you built your laser with a converging output, you'd push the waist point out along the z direction (the direction of propagation), but you'd also reduce the waist size, so you'd ultimately increase the divergence angle.
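For reference, this tradeoff comes straight from the standard Gaussian-beam width relation (with $z_R$ the Rayleigh range, a symbol not used elsewhere in this answer):

$$ w(z) = w_0 \sqrt{1 + \left(\frac{z}{z_R}\right)^2}, \qquad z_R = \frac{\pi w_0^2}{\lambda} $$

Far from the waist ($z \gg z_R$) this reduces to the cone $w(z) \approx \theta z$ with $\theta = \lambda / \pi w_0$: shrinking the waist shrinks the Rayleigh range quadratically, so the beam starts spreading sooner and spreads faster.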

So you can't choose to produce an arbitrarily small divergence angle unless you're prepared to build a laser with an arbitrarily large output aperture.

To use some concrete numbers, suppose the target is a lightsail 1 light year away, and it is 1 Mm in diameter.

1 light year is about $10^{16}$ meters. So you need a divergence angle on the order of $10^6 / 10^{16}$, or $10^{-10}$ radians. You need a beam waist of

$$ w_0 > \frac{\lambda}{\pi \theta} $$

If your wavelength is 500 nm, this means a waist radius of at least 1600 m, i.e. an output aperture over 3 km across. In practice I expect there would be "unique engineering challenges" in designing optics close enough to ideal to achieve this kind of divergence. I've never heard of beam divergence being measured in units smaller than milliradians, but I don't know what's been achieved in hero experiments.
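Putting the whole estimate into one runnable sketch (following the order-of-magnitude convention above of divergence ≈ sail diameter / distance; the 400 nm case is the asker's "just past visible" near-UV suggestion, not something computed in the answer itself):

```python
import math

LIGHT_YEAR_M = 9.46e15   # one light year in meters (rounded to 1e16 above)
SAIL_DIAMETER_M = 1e6    # 1 Mm lightsail

def required_waist_radius(wavelength_m: float, divergence_rad: float) -> float:
    """Invert theta = lambda / (pi * w0) to get the waist radius w0."""
    return wavelength_m / (math.pi * divergence_rad)

# Divergence needed so the far-field cone is about the sail's size at 1 ly.
theta = SAIL_DIAMETER_M / LIGHT_YEAR_M   # ~1e-10 rad

for wavelength_m in (500e-9, 400e-9):    # green light vs. near-UV
    w0 = required_waist_radius(wavelength_m, theta)
    print(f"{wavelength_m * 1e9:.0f} nm -> waist radius ~{w0:,.0f} m")

# 500 nm -> ~1,506 m (the "at least 1600 m" above, with 1 ly rounded to 1e16 m)
# 400 nm -> ~1,204 m, so the near-UV shortcut shrinks the emitter by 20%
```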