[Physics] How to calculate the apparent brightness of a mirror reflecting the sun at a given distance

reflection visible-light

I've stumbled around trying to figure this out, but it just isn't my area and I haven't gotten far. This is for a piece of writing about a field of mirrors on the moon, placed on the terminator when the moon is at first quarter and aimed so that the sunlight is reflected towards the Earth. I am trying to determine how big that field of mirrors would need to be for it to be easily visible to the naked eye of a person on Earth within the area the reflection covers.

On the moon the sun has an angular diameter of 0.5°, and the Earth 2°. According to Wikipedia's entry on Sunlight, direct sunlight has an intensity at mid-day of 1050 W/m², a luminous efficacy of 93 lumens per watt, and an illuminance of 98,000 lux.

Based on the angular sizes of the two bodies, the reflection of the sun on the surface of the Earth would be a circle with a radius one quarter that of the Earth, right? That means a circle with an area very close to 8,000,000 km². After this point I get confused.
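As a check on that geometry, here is a minimal Python sketch. It only uses the 0.5° and 2° angular sizes quoted above plus an assumed Earth radius of about 6371 km (a number not stated in the question): the reflected beam spreads with the sun's 0.5° angular size, so the spot's diameter is (0.5/2) of the Earth's diameter.

```python
import math

sun_angular_deg   = 0.5      # angular diameter of the sun as seen from the moon
earth_angular_deg = 2.0      # angular diameter of the Earth as seen from the moon
earth_radius_km   = 6371.0   # assumed mean Earth radius (not given in the question)

# The reflected beam diverges with the sun's 0.5 deg angular size, so the spot
# diameter is (0.5 / 2) of the Earth's diameter, i.e. the spot radius is a
# quarter of the Earth's radius.
spot_radius_km = earth_radius_km * sun_angular_deg / earth_angular_deg
spot_area_km2  = math.pi * spot_radius_km**2

print(f"spot radius ~ {spot_radius_km:.0f} km")    # ~1590 km
print(f"spot area   ~ {spot_area_km2:.2e} km^2")   # ~8.0e6 km^2
```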

Each square meter of mirror reflects an image of the whole sun onto that 8,000,000 km² spot. So each square meter is spreading 98,000 lumens over that area?
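Putting a number on that step (a sketch that simply takes the 98,000 lux of direct sunlight and the ~8,000,000 km² spot at face value and assumes a perfect mirror):

```python
spot_area_m2         = 8.0e6 * 1e6   # 8,000,000 km^2 expressed in m^2
lumens_per_m2_mirror = 98_000.0      # 98,000 lux falling on 1 m^2 of mirror = 98,000 lumens

# Illuminance on Earth contributed by each square meter of (perfect) mirror,
# assuming its light is spread evenly over the whole spot.
lux_per_m2_mirror = lumens_per_m2_mirror / spot_area_m2
print(f"{lux_per_m2_mirror:.2e} lux per m^2 of mirror")   # ~1.2e-8 lux
```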

From here I found a reference giving the formula to convert lux to apparent magnitude as $m = -2.5 \log_{10} I - 14.2$, where $I$ is the illuminance in lux (lumens per m²). So if I want to size the mirror field so that the magnitude is 0, I need $-2.5 \log_{10} I$ to be 14.2, making $I$ about 0.75.
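In case it is useful, here is that conversion as a small Python helper - just the formula above, nothing more. As a sanity check, feeding it the full 98,000 lux of direct sunlight gives roughly the sun's apparent magnitude of about -26.7.

```python
import math

def lux_to_apparent_magnitude(lux: float) -> float:
    """Convert an illuminance in lux to an apparent magnitude,
    using m = -2.5 * log10(I) - 14.2 as quoted above."""
    return -2.5 * math.log10(lux) - 14.2

# Sanity check: direct sunlight at 98,000 lux should come out near the
# sun's apparent magnitude of roughly -26.7.
print(lux_to_apparent_magnitude(98_000))   # ~ -26.7
```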

If I am doing this right, I get a result that a mirror field with an area of 61 km² would be necessary. Is that right?

Best Answer

Assuming you want to notice that there is a mirror on the moon, the question comes down to the resolving power of the human eye.

Let's assume a healthy person with 20/20 vision. When looking at the moon, their pupil is dilated - say 6 mm diameter. The angular resolution of such a pupil is given by

$$\alpha = 1.22\,\frac{\lambda}{d} = 1.22 \cdot \frac{500\cdot 10^{-9}\ \text{m}}{0.006\ \text{m}} \approx 10^{-4}\ \text{rad}$$
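For what it's worth, a couple of lines of Python reproduce that number, assuming the same 500 nm light and 6 mm pupil:

```python
wavelength_m = 500e-9   # green light, ~500 nm
pupil_d_m    = 6e-3     # dark-adapted pupil, ~6 mm

alpha_rad = 1.22 * wavelength_m / pupil_d_m   # Rayleigh criterion
print(f"{alpha_rad:.1e} rad")                 # ~1.0e-4 rad
```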

When you look at the sun in a mirror on the moon, you are basically looking at a very small bit of the sun. The question becomes: how much does the resolution of the eye smear out this bright spot? (Remember, you can see stars at night - they have roughly the surface brightness of the sun but are far smaller than we can resolve, and that's not a problem against a black background.) Because we don't have a "black" background but a "moon" background, we need the ratio of the brightness of the sun to that of the moon.

According to Wolfram Alpha, the sun is about 3.23 orders of magnitude brighter than the moon - in other words, sunlight is about 1700x brighter than moonlight.

Let's assume that the eye needs a 2x difference in brightness to see the mirror on the moon - then the mirror field can have an area that is (1700/2)x smaller than the smallest area the eye can resolve.

The resolved disk is $0.0001 \times 390{,}000\ \text{km} = 39\ \text{km}$ across; the disk whose diameter is $\sqrt{1700/2} \approx 29$ times smaller is about 1.3 km across.
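Chaining those numbers together in a short Python sketch, using the inputs above (a $10^{-4}$ rad eye resolution, roughly 390,000 km to the moon, the ~1700x brightness ratio, and the assumed 2x contrast threshold):

```python
import math

alpha_rad        = 1.0e-4    # eye's angular resolution (from above)
moon_distance_km = 390_000   # Earth-moon distance used above
brightness_ratio = 1700      # sun vs. moon brightness ratio quoted above
contrast_needed  = 2         # assumed minimum detectable brightness contrast

# Smallest patch on the moon the eye can resolve.
resolved_disk_km = alpha_rad * moon_distance_km   # ~39 km across

# The mirror field can be (brightness_ratio / contrast_needed) times smaller
# in area, i.e. the square root of that factor smaller in diameter.
mirror_diameter_km = resolved_disk_km / math.sqrt(brightness_ratio / contrast_needed)

print(f"resolved disk   ~ {resolved_disk_km:.0f} km")     # ~39 km
print(f"mirror diameter ~ {mirror_diameter_km:.1f} km")   # ~1.3 km
```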

This assumes that your array of mirrors does a good job reflecting the sunlight; if the reflectance is less than 100%, you obviously need to adjust the above. Also, at full moon the moon is a good deal brighter than in other phases - there is a funny "spike" in albedo when the sun, Earth and moon are roughly aligned (apparently because small glass spheres on the lunar surface act like micro-lenses) - so the answer will be a function of the phase of the moon. And if the array is closer to the limb it needs to be bigger, since the above calculation uses the "apparent" size, i.e. as seen from Earth.