Quantum Mechanics – How the Single Slit Experiment Demonstrates Heisenberg’s Uncertainty Principle

Tags: diffraction, heisenberg-uncertainty-principle, quantum-mechanics, wave-particle-duality

I saw some videos where a person points a laser through a slit. As they reduce the width of the slit, the diffracted image spreads out, like this:

[Image: single-slit diffraction pattern on the screen, spreading out as the slit is narrowed]

Can this pattern be viewed as a consequence of Heisenberg's uncertainty principle, applied to photons?

Best Answer

Yes, light diffraction can be viewed both as a classical wave phenomenon and as a quantum mechanical consequence of Heisenberg's uncertainty principle applied to photons. However, since both explanations work equally well, the single-slit experiment by itself doesn't provide any direct evidence for quantum mechanics.

Let me explain why the two explanations are equivalent. I'll do the classical uncertainty bound first.

You may have noticed that in far-field diffraction, the product of the width of the aperture and the size of the pattern on the screen is a constant: a slit half as wide makes a pattern twice as big. In fact, plugging in the formulas for any kind of diffraction whatsoever will give something like $$\sigma_{\text{slit}} \sigma_{\text{screen}} \gtrsim D \lambda$$ where $D$ is the distance to the screen. This is the "uncertainty principle" for classical diffraction.
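
To see this relation come out of the mathematics directly, here is a minimal numerical sketch (my own illustration, not part of the original answer): it computes the Fraunhofer pattern of a slit with an FFT and measures the full width at half maximum of the central lobe. The function name `pattern_fwhm`, the grid size, and the particular slit widths are arbitrary choices; the point is only that halving the slit doubles the pattern, so the product of the two widths stays roughly constant.

```python
# Minimal sketch (assumptions: FFT-based Fraunhofer model, arbitrary grid and
# slit widths). The far-field amplitude is the Fourier transform of the aperture,
# and the screen coordinate is roughly D * lambda * (spatial frequency).
import numpy as np

def pattern_fwhm(slit_width, n=2**16, x_max=0.05):
    """Full width at half maximum (in spatial frequency, 1/m) of the central lobe."""
    x = np.linspace(-x_max, x_max, n)                       # aperture-plane coordinate (m)
    aperture = (np.abs(x) < slit_width / 2).astype(float)   # transmission function of the slit
    intensity = np.abs(np.fft.fftshift(np.fft.fft(aperture))) ** 2
    freq = np.fft.fftshift(np.fft.fftfreq(n, d=x[1] - x[0]))
    above_half = intensity >= intensity.max() / 2           # only the central lobe exceeds half max
    return freq[above_half].max() - freq[above_half].min()

for a in (50e-6, 100e-6, 200e-6):                           # slit widths in meters
    print(f"slit {a*1e6:.0f} um: slit_width * pattern_width = {a * pattern_fwhm(a):.3f}")
# The product is roughly constant (about 0.89, up to discretization error).
# Multiplying by D * lambda converts the frequency width into a width on the
# screen, which is the sigma_slit * sigma_screen >~ D * lambda statement above.
```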

Now let's view this on the quantum level. The position uncertainty is simply $$\sigma_x = \sigma_{\text{slit}}.$$ The momentum is given by the de Broglie relation $$p = \frac{h}{\lambda}$$ but we want the uncertainty in the $x$-component of the momentum, $p_x = p \sin \theta \approx p\theta$. Now, the uncertainty in the angle $\theta$ is just $\sigma_{\text{screen}}/D$ by trigonometry, so we have $$\sigma_{p_x} = p \sigma_{\theta} = \frac{h}{\lambda} \frac{\sigma_{\text{screen}}}{D}.$$ Putting this all together, $$\sigma_x \sigma_{p_x} = \frac{h}{\lambda D} \sigma_{\text{slit}} \sigma_{\text{screen}} \gtrsim h.$$ Up to a constant, this is the usual Heisenberg uncertainty principle; the two pictures are equivalent.
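
For a rough sense of scale (illustrative numbers of my own choosing, not from the question): a red laser with $\lambda \approx 650\text{ nm}$, a slit of width $\sigma_{\text{slit}} = 50\ \mu\text{m}$, and a screen at $D = 1\text{ m}$ give $$\sigma_{\text{screen}} \gtrsim \frac{D\lambda}{\sigma_{\text{slit}}} = \frac{(1\text{ m})(650\text{ nm})}{50\ \mu\text{m}} \approx 13\text{ mm},$$ so narrowing the slit to $25\ \mu\text{m}$ pushes the spread to about $26\text{ mm}$. In momentum language this is just $\sigma_{p_x} \gtrsim h/\sigma_{\text{slit}}$: the same statement read off the other side of the uncertainty relation.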


From a mathematical perspective, both uncertainty principles above are special cases of a more general fact: the product of the width of a function $f$ and the width of its Fourier transform is bounded below by a constant.

In the quantum case, the Fourier pair is position and momentum. In the classical far-field (Fraunhofer) case, the pair is the aperture and the pattern on the screen, since the far-field amplitude is the Fourier transform of the aperture function. This result also has applications in signal processing.
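
To connect this to the signal-processing application, here is a small numerical check (again my own sketch, with assumed conventions: RMS widths computed from $|f|^2$ and $|\hat f|^2$, and an ordinary-frequency Fourier transform). A Gaussian pulse saturates the bound $\sigma_t \sigma_f \ge 1/(4\pi)$, which is the familiar time-bandwidth product:

```python
# Sketch (assumed conventions: RMS widths of |f|^2 and |F|^2, ordinary frequency).
# A Gaussian saturates the Fourier width bound sigma_t * sigma_f >= 1/(4*pi).
import numpy as np

def rms_width(axis, density):
    """Standard deviation of a (non-negative) density sampled on `axis`."""
    w = density / density.sum()
    mean = np.sum(axis * w)
    return np.sqrt(np.sum((axis - mean) ** 2 * w))

n, t_max = 2**14, 50.0
t = np.linspace(-t_max, t_max, n)
for sigma in (0.5, 1.0, 2.0):                          # widths of the Gaussian pulse
    pulse = np.exp(-t**2 / (2 * sigma**2))
    spectrum = np.abs(np.fft.fftshift(np.fft.fft(pulse))) ** 2
    f = np.fft.fftshift(np.fft.fftfreq(n, d=t[1] - t[0]))
    product = rms_width(t, pulse**2) * rms_width(f, spectrum)
    print(f"sigma = {sigma}: width product = {product:.4f}   (1/(4*pi) = {1/(4*np.pi):.4f})")
```

Any other pulse shape gives a strictly larger product, which is exactly the general width bound stated above.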
