[Physics] Relationship between slit size and wavelength in diffraction

Tags: diffraction, waves

In almost every physics book we can find a statement like "diffraction gets stronger when the size of the slit is comparable to the wavelength". Say we have a wall in a bathtub with a slit in it. When water waves reach the slit, the books usually invoke Huygens' principle to explain that the points on the wavefront near the edges interfere in some fancy way so that the water waves spread out radially. However, I do not see any connection between the size of the slit and the wavelength.

I do understand that if we have a really tiny slit, there are very few points on the wavefront that produce radial waves, and if the slit gets bigger, there are more points producing the same secondary waves, which eventually interfere in an interesting way. But what does this have to do with the wavelength? If we have a tiny slit and an even tinier wavelength, how does this change the game?

So for example, if we have a big wavelength, the diffraction would be:

[Figure: wave with a large wavelength spreading out radially behind the slit]

But if the wavelength gets smaller while the slit remains the same, I'd expect the very same diffraction with the only difference that the resultant wave will have a shorter wavelength:

[Figure: expected pattern for a shorter wavelength – the same radial spreading, only with a shorter resultant wavelength]

Best Answer

If you agree on the definition of a "score" $S$ that tells you how strong and important diffraction is, it is clear that the score is only a function of the ratio of the slit size $h$ and the wavelength $\lambda$, isn't it? $$ S = S(h/\lambda) $$ This is because the basic laws of wave propagation are scale-invariant – you may increase the size of everything 150 times and nothing changes qualitatively.
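The scale invariance is easy to check numerically. A minimal sketch (not from the answer, using the standard Fraunhofer single-slit intensity $I(\theta) = \mathrm{sinc}^2(\pi h \sin\theta / \lambda)$ as the "score"): rescaling both $h$ and $\lambda$ by the same factor leaves the pattern unchanged.

```python
import numpy as np

def intensity(theta, h, wavelength):
    """Normalized Fraunhofer single-slit intensity.

    I(theta) = (sin(beta)/beta)^2 with beta = pi * h * sin(theta) / lambda;
    it depends only on the ratio h/lambda, not on h and lambda separately.
    """
    beta = np.pi * h * np.sin(theta) / wavelength
    return np.sinc(beta / np.pi) ** 2  # np.sinc(x) = sin(pi x)/(pi x)

theta = np.linspace(-0.5, 0.5, 1001)  # angles in radians

# Rescale slit and wavelength by the same factor (150x, as in the answer):
pattern_a = intensity(theta, h=5e-6, wavelength=1e-6)
pattern_b = intensity(theta, h=150 * 5e-6, wavelength=150 * 1e-6)
print(np.allclose(pattern_a, pattern_b))  # True: same h/lambda, same pattern
```

Changing only $h$ or only $\lambda$, by contrast, changes the ratio and therefore the pattern.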

On the other hand, if you only increase (or decrease) the slit size; or you only increase (or decrease) the wavelength, $S$ will change.

For example, if the slit size (or distance between two slits) is much smaller than the wavelength, it effectively acts like a single point and the waves propagate radially in all directions. If the slit size is much larger than the wavelength, it's like "no slit" – free propagation – and rays may propagate through the slit (almost) without disturbance.
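The two limits can be made concrete with the first-minimum condition $\sin\theta = \lambda/h$ (a sketch with assumed example numbers, not from the answer): for $h \gg \lambda$ the central beam is extremely narrow ("no slit"), while for $h \le \lambda$ there is no minimum at all and the wave spreads radially.

```python
import numpy as np

def first_minimum_angle(h, wavelength):
    """Angle (degrees) of the first single-slit diffraction minimum,
    from sin(theta) = lambda / h.

    Returns None when lambda >= h: the equation has no solution,
    i.e. the slit acts like a point source radiating in all directions.
    """
    ratio = wavelength / h
    if ratio >= 1:
        return None
    return np.degrees(np.arcsin(ratio))

# h >> lambda: central beam ~0.06 degrees wide -> almost free propagation
print(first_minimum_angle(h=1000e-6, wavelength=1e-6))
# h comparable to lambda: beam spread ~30 degrees -> strong diffraction
print(first_minimum_angle(h=2e-6, wavelength=1e-6))
# h < lambda: no minimum -> radial (point-source-like) propagation
print(first_minimum_angle(h=0.5e-6, wavelength=1e-6))  # None
```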

Strong diffraction or interference is observed if the wave at some point on the screen is a combination of comparably strong waves whose relative phase shift is sufficiently large (not just 0.0001 radians) but also sufficiently small (not 500 radians). If it were 500 radians, the phase shift modulo $2\pi$ – and this is the only thing that matters – would effectively be a random number, and the wave phenomena would be invisible, too.

It means that the strongest wave phenomena occur if the relevant phase shifts are comparable to 1 radian or 90 degrees or something of this order. That occurs if and only if the slit size or the distance between the slits is comparable to the wavelength, because the phase shift is $2\pi\,\Delta s/\lambda$, where $\Delta s$ is the difference between the lengths of the interfering rays' trajectories.
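The three regimes above can be sketched numerically (example path-length differences are my own, chosen to illustrate the point): a tiny $\Delta s$ gives a negligible phase shift, $\Delta s \sim \lambda$ gives a shift of order 1 radian, and $\Delta s \gg \lambda$ gives a shift whose value modulo $2\pi$ is effectively arbitrary.

```python
import numpy as np

def phase_shift(delta_s, wavelength):
    """Relative phase shift (radians) for path-length difference delta_s."""
    return 2 * np.pi * delta_s / wavelength

lam = 1.0  # work in units of the wavelength
for ds in [1e-4 * lam, 0.25 * lam, 80.3 * lam]:
    phi = phase_shift(ds, lam)
    # Only phi mod 2*pi matters for the interference pattern.
    print(f"delta_s = {ds:g} lambda -> "
          f"{phi:.4f} rad (mod 2pi: {phi % (2 * np.pi):.4f})")
```

Only the middle case ($\Delta s$ comparable to $\lambda$, phase shift $\pi/2$) sits in the window where diffraction is clearly visible.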