[Physics] Earth curvature refraction for dummies

curvature, refraction

I keep being presented with 'earth curvature experiment' videos recently, by flat/concave earth advocates. It seems to be their favorite "evidence" that Earth is not spherical. Debunking this gets into math about refraction, which I stink at. I would like to be sure I have considered all the factors. And I keep looking for a simple rule of thumb, a 'for each mile, the line of sight drops this amount due to refraction' value.

So for example, this '20 mile laser' experiment – https://www.youtube.com/watch?v=O8auPtlRTyM proposes to defy Earth curvature. It is a sloppy experiment, but assuming they are not cheating, and their horizontal line of sight is 65 feet above sea level as indicated, does it make scientific sense to be able to see the laser from 20 miles, when the geographic distance to sea level is 10 miles? And how to explain why it does, in simplest terms?
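For reference, the geometric (no refraction) distance to the sea-level horizon from a viewpoint of height h can be checked with the standard small-height approximation d ≈ √(2Rh). A minimal sketch, assuming the mean Earth radius of 6371 km:

```python
import math

R_EARTH_M = 6_371_000        # assumed mean Earth radius, meters
FT_TO_M = 0.3048
M_TO_MI = 1 / 1609.344

def horizon_distance_miles(eye_height_ft):
    """Geometric distance to the sea-level horizon, ignoring refraction.

    Uses the small-height approximation d = sqrt(2 * R * h).
    """
    h = eye_height_ft * FT_TO_M
    return math.sqrt(2 * R_EARTH_M * h) * M_TO_MI

print(round(horizon_distance_miles(65), 1))   # ~9.9 miles for a 65 ft viewpoint
```

That reproduces the "geographic distance to sea level is 10 miles" figure in the question, so any visibility beyond that has to come from refraction.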

I am also wondering if the laser could be "skipping" over the surface, kinda like a skipping rock, but in a few, long arcs. So the laser grazes the water near the sea level peak say around 10 miles, then is reflected back upward, and arcs over the 'hump' due to refraction. Thus it covers even more distance than if it were traveling over level ground that does not reflect like water.

Sites explaining refraction calculation are too mathy for me. And I seem to be reading conflicting rules and getting different end results. I understand there are several variables, but this video indicates that surveyors use a standard 7% rule: https://www.youtube.com/watch?v=4LPcNxYwhQo But that does not seem to account for the 20 mile laser example. And I understand that the closer to the water, and the greater the distance, the more the laser will refract, or 'super refract.'

The Wikipedia entry on "Horizon" (effects of refraction) says something about a 4/3 ratio, and seeing 15% beyond the geometrical horizon. And 'standard' atmospheric refraction is 8%, although that is not the 'super' refraction I assume would apply to the laser example. But this does not match the 7% rule, and I don't get how all these values combine into a definitive refraction value.
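Part of the confusion may be that these figures come from the same "effective Earth radius" trick: refraction is modeled by pretending the Earth is k times bigger, and since horizon distance scales as the square root of the radius, the horizon extension is √k − 1. A minimal sketch comparing two commonly quoted k values (this is my assumption about where the 8% and 15% figures come from, not a definitive reconciliation):

```python
import math

def horizon_extension_pct(k):
    """Percent increase in horizon distance if the Earth is treated as
    having an effective radius k times its real one.

    Horizon distance scales as sqrt(radius), so the extension is sqrt(k) - 1.
    """
    return (math.sqrt(k) - 1) * 100

print(round(horizon_extension_pct(7 / 6), 1))   # ~8.0% -> "standard" refraction
print(round(horizon_extension_pct(4 / 3), 1))   # ~15.5% -> the 4/3 ratio
```

So the 8% and 15% figures are not contradictory rules; they are the same formula under different assumed refraction strengths.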

I see a simple 'distance to horizon calculator' (this forum won't allow me to post the link, will try to post it in another comment), but it does not account for refraction.

And another one is supposed to calculate the refraction 'lapse rate', but no matter what I enter, the lapse rate always comes out "0." It doesn't seem to work.

Any clarification is much appreciated.

Best Answer

It seems your question comes down to "Why does light at least somewhat follow the curvature of the earth?".

The answer is indeed refraction. Light travels at different speeds in different transparent substances, always slower than in vacuum. From this differing speed, you can show that a light beam bends at the boundary between substances with different indices of refraction (the index of refraction is the ratio of how much light slows down in the substance compared to vacuum). Camera lenses, eyeglasses, etc., harness this principle deliberately.
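The bending at such a boundary is given by Snell's law, n₁ sin θ₁ = n₂ sin θ₂. A minimal sketch, assuming n ≈ 1.5 for ordinary glass:

```python
import math

def snell_refracted_angle(theta_in_deg, n1, n2):
    """Snell's law: n1 * sin(theta1) = n2 * sin(theta2).

    Angles are measured from the normal to the boundary, in degrees.
    """
    s = n1 * math.sin(math.radians(theta_in_deg)) / n2
    return math.degrees(math.asin(s))

# A ray entering glass from air at 30 degrees bends toward the normal:
print(round(snell_refracted_angle(30, 1.0, 1.5), 1))   # ~19.5 degrees
```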

The speed of light in air is close to that in vacuum, but not exactly the same. Put another way, the index of refraction of air is almost 1, but not quite. Furthermore, this index of refraction varies with the density of the air. To convince yourself of this, imagine the limiting case where you measure the index of refraction of air as the pressure is gradually lowered. When the pressure reaches 0, the index of refraction must be 1 by definition. The index of refraction of air therefore varies smoothly as a function of pressure.
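A minimal sketch of that proportionality, assuming the commonly quoted sea-level value n − 1 ≈ 0.000293 and a simple linear scaling with density (roughly the Gladstone–Dale relation; real air also varies with temperature, humidity, and wavelength):

```python
# Assumed sea-level value of (n - 1) for air at standard conditions.
N_MINUS_1_SEA_LEVEL = 0.000293

def air_index(density_ratio):
    """Approximate refractive index of air at a given density,
    expressed as a fraction of sea-level density.

    Sketch only: (n - 1) is taken as linearly proportional to density.
    """
    return 1 + N_MINUS_1_SEA_LEVEL * density_ratio

print(air_index(1.0))   # ~1.000293 at sea level
print(air_index(0.0))   # exactly 1.0 in vacuum, as required by definition
```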

Now think of the air envelope around the earth. Obviously there is a pressure gradient with altitude. When you get high enough, the atmosphere is gone and you have only the (almost) vacuum of space. In this case there isn't a sharp boundary like there is when light enters a glass lens. However, the gradient still bends light, in this case smoothly over some distance, as opposed to abruptly at the air/glass boundary in the lens example. This vertical pressure gradient, and therefore index of refraction gradient, causes light to bend a little when shot horizontally through the atmosphere.
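To put a rough number on this, a horizontal ray in a vertical index gradient follows a circle of curvature approximately |dn/dh| (since n ≈ 1). Assuming a standard-atmosphere gradient of about −2.3 × 10⁻⁸ per meter, a commonly quoted figure, a quick sketch:

```python
R_EARTH_M = 6_371_000   # assumed mean Earth radius, meters

# Assumed standard-atmosphere vertical gradient of the refractive index
# near sea level (a commonly quoted rough figure, not a measured value).
DN_DH = -2.3e-8         # per meter

# A horizontal ray bends with curvature ~ |dn/dh|, i.e. it follows
# a circle of radius 1 / |dn/dh|.
ray_radius = 1 / abs(DN_DH)

# Refraction coefficient: what fraction of Earth's curvature the ray follows.
k = R_EARTH_M / ray_radius

print(round(ray_radius / R_EARTH_M, 1))   # ray circle ~6.8x Earth radius
print(round(k, 2))                        # ~0.15
```

So under average conditions the ray curves the same way as the Earth but about seven times more gently, which is why refraction only partially hides the curvature rather than cancelling it.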

However, there is more to it than this general effect. The atmosphere is not uniform at any one altitude. As you know, there is wind, pockets of hot and cold air, rising thermals, cold downdrafts, and lots of phenomena that are much more significant locally than the general decrease in pressure vertically. The air can have different layers at different temperatures, and the interface between layers can be much more abrupt than the general trend of decreased pressure with altitude.

Shooting a light beam in the right atmospheric conditions can exhibit much more bending than the general average case. A mirage is a good example of this. Light from the horizon is refracted by the relatively sharp boundary at the top of a thin hot layer of air warmed by the ground. From far enough away that the light hits at a very glancing angle, you "see" sky light reflected off of what looks like the ground. This gives the visual impression of a lake, since a lake would similarly reflect sky light in normal cases even when there are no special atmospheric effects.

In the case of a mirage, light is actually bent upward. Light can just as well be bent downward using similar boundaries of layers in the atmosphere. It depends on the position of the emitter and receiver relative to the index of refraction gradients in the atmosphere.

If you were to carry out these experiments on the moon, which has no practical atmosphere for this purpose, light beams would indeed travel "straight". You won't see mirages on the moon, for example.
