As in User58220's answer, if the source is a point or small (i.e. diverging-beam) source, the intensity varies as that of a real source placed at the virtual source's position - almost. There is a further increase in intensity owing to the Doppler shift alone.
So if your source is a collimated beam at right angles to the mirror's surface, all the frequency components in the beam are blueshifted by the square $\frac{c+v}{c-v}$ of the more familiar Doppler frequency scale factor $\sqrt{\frac{c+v}{c-v}}$ that holds for a real source (i.e. not a virtual one made by a mirror) moving towards you at speed $v$.
At the same time, the light's energy - per photon, and hence for any given pulse of the beam - is scaled by this same factor $\frac{c+v}{c-v}$. You can think of this scaling as arising from conservation of photon number: the reflected photons are more energetic because the mirror does work on the beam. The result can also be derived classically using Lorentz transformations of the relevant electromagnetic boundary value problem, and you can see details of this calculation in my answer to the Physics SE question "Can Planck's constant be derived from Maxwell's equations?". Indeed, one can use this result (equality of the Doppler and energy scales) to motivate the idea that a photon's energy must be proportional to its frequency, so you can in a sense derive the quantum reasoning from the classical, although this reasoning won't tell you the value (i.e. $h$) of the scaling constant.
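For concreteness, here is a minimal Python sketch of the two scale factors (the mirror speed is purely illustrative):

```python
# Frequency and energy scaling for light reflected at normal incidence
# off a mirror moving towards the observer at speed v.
c = 299_792_458.0        # speed of light, m/s
v = 0.01 * c             # illustrative mirror speed, 1% of c

source_factor = ((c + v) / (c - v)) ** 0.5   # Doppler factor for a real approaching source
mirror_factor = (c + v) / (c - v)            # its square: reflection off the moving mirror

print(f"real-source Doppler factor: {source_factor:.4f}")
print(f"moving-mirror factor      : {mirror_factor:.4f}")
# The energy of each reflected photon (and so of any reflected pulse) is
# scaled by the same mirror factor, with photon number conserved.
```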
The Earth receives approximately $6.8\text{ mW/m}^2$ of reflected sunlight from the moon (see below for details of how I calculated that).
However, the sunlight is also absorbed by the moon, and this raises the surface temperature. So the moon also emits thermal radiation towards the Earth (assuming the highest daytime temperature of 400 K; see the assumptions below, where $\epsilon_{\text{moon}}$ and $A$ are defined): $\epsilon_{\text{moon}}(1-A)\sigma (400\,\text{K})^4 = 89\text{ mW/m}^2$.
So the total power received from the moon (reflected + thermal) is 10,438 times weaker than sunlight, i.e.
$$
\frac{6.8\text{ mW/m}^2 + 89\text{ mW/m}^2}{1000\text{ W/m}^2} = \frac{1}{10438}
$$
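As a back-of-the-envelope check, here is the thermal term and the ratio in a few lines of Python, using only the numbers from the assumptions listed further down (solid angle, albedo and the 400 K surface temperature):

```python
# Thermal flux from the moon at the Earth and the sun/moon power ratio,
# under the rough assumptions stated in "Assumptions and method" below.
sigma = 5.670e-8          # Stefan-Boltzmann constant, W m^-2 K^-4
omega_moon = 6.8e-5       # solid angle of the moon, sr
A = 0.10                  # lunar albedo
T = 400.0                 # assumed peak daytime surface temperature, K

thermal = omega_moon * (1 - A) * sigma * T**4   # ~0.089 W/m^2
reflected = 6.8e-3                              # W/m^2, reflected sunlight (derived below)
ratio = 1000.0 / (thermal + reflected)

print(f"thermal flux : {thermal*1e3:.0f} mW/m^2")   # ~89 mW/m^2
print(f"sun / moon   : {ratio:.0f}")   # ~10,460 here; 10,438 if thermal is rounded to 89 mW/m^2 first
```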
To answer your question about how much that heats the Earth, let's assume that the average daytime temperature of the Earth is 20$^\circ$C and the average nighttime temperature is 10$^\circ$C (these estimates could be improved, but it doesn't really change the answer significantly).
Therefore the incident solar energy causes a temperature difference of $\Delta T=10^\circ$C between night and day. So we know that 1000 $\text{W/m}^2$ (the solar irradiance at the Earth's surface) causes a temperature increase of around $10^\circ$C. Let's assume that moonlight also causes a temperature difference, scaled down in proportion to its intensity. Since moonlight (reflected plus thermal) is 10,438 times weaker than sunlight, the change in the Earth's temperature from absorbing moonlight is
$$
\frac{10\,^\circ\text{C}}{10{,}438} \approx 958\,\mu\text{K}
$$
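The same arithmetic as a short sanity check (the 10 $^\circ$C swing and the 10,438 ratio are the assumed inputs from above):

```python
delta_T_sun = 10.0             # K, assumed day/night temperature swing due to sunlight
sun_to_moon_ratio = 10_438     # sunlight / moonlight power ratio from above
delta_T_moon = delta_T_sun / sun_to_moon_ratio
print(f"warming due to moonlight: {delta_T_moon*1e6:.0f} microkelvin")   # ~958 uK
```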
Good luck measuring that...
Assumptions and method
1. Solar irradiance is 1000 $\text{W/m}^2$ at the surface of the moon and the earth.
2. The reflectivity (albedo) of the moon is about $A=10\%$.
3. The solid angle subtended by the moon in the sky is the same as that subtended by the sun, $\epsilon_{\text{moon}} = 6.8\times10^{-5}\text{ sr}$. I say this because during an eclipse they appear to be the same size, so it's probably quite a good assumption.
From 1 and 2 we know that $100\text{ W/m}^2$ is reflected at the surface of the moon. From 3, let's multiply that by the solid angle subtended by the moon as viewed from the Earth, as this gives the amount of the reflected energy that hits the Earth. So, $100\text{ W/m}^2 \times 6.8\times10^{-5} = 6.8\text{ mW/m}^2$.
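If you want to reproduce that figure, here is a short Python sketch of steps 1-3 (only the numbers stated above go in):

```python
# Reflected moonlight reaching the Earth, under assumptions 1-3 above.
S = 1000.0              # W/m^2, solar irradiance at the moon (and the Earth)
A = 0.10                # lunar albedo (assumption 2)
omega_moon = 6.8e-5     # sr, solid angle of the moon seen from Earth (assumption 3)

reflected_at_moon = A * S                            # 100 W/m^2 leaves the lunar surface
reflected_at_earth = reflected_at_moon * omega_moon
print(f"reflected moonlight at Earth: {reflected_at_earth*1e3:.1f} mW/m^2")   # 6.8 mW/m^2
```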
Best Answer
No, because of the sizes of their surfaces. Let's make some simplifying assumptions.
Then the total amount of sunlight received by the sunlit hemisphere of Earth is proportional to the square of the Earth's radius, while the total amount of sunlight received by the sunlit hemisphere of the Moon is proportional to the square of the Moon's radius. Since the Moon is ≈1/3.67 the radius of Earth, it receives ~1/13.5 the total amount of sunlight.
Certainly, even a perfectly reflective Moon can't reflect more sunlight than it receives, so even if all of the light bouncing off of the Moon reached the Earth, it would illuminate it at no more than roughly 1/13.5 the Sun's intensity, a brightness comparable to a cloudy day.
Of course, owing to the geometry, most of the light bouncing off of the Moon doesn't land on Earth; it goes off into space in directions that miss the Earth completely. Making another simplifying assumption, I think we can say that the fraction of it that reaches Earth is proportional to the fraction of the Moon's sky taken up by the Earth. The Earth has an apparent size of about 2 degrees as seen from the Moon, so the solid angle it subtends is $2\pi\left(1 - \cos\frac{2^\circ}{2}\right) \approx 0.00096$ steradians. A hemisphere is $2\pi$ steradians, so the Earth occupies about 0.00015 of a hemisphere (about 0.015% of the Moon's sky). Putting this together, a perfectly reflective Moon should illuminate the Earth at about $\frac{0.00015}{13.5} \approx \frac{1}{90,000}$ the intensity of the Sun.
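Here is the same geometric estimate as a short Python sketch (the 2 degree apparent size and the 1/3.67 radius ratio are the approximate values used above):

```python
import math

# Fraction of a mirror-Moon's reflected light that reaches Earth, times the
# Moon/Earth size ratio, gives the illumination relative to direct sunlight.
earth_apparent_diameter = math.radians(2.0)                               # ~2 degrees from the Moon
omega_earth = 2 * math.pi * (1 - math.cos(earth_apparent_diameter / 2))   # ~0.00096 sr
fraction_of_sky = omega_earth / (2 * math.pi)                             # ~0.00015 of a hemisphere

radius_ratio = 1 / 3.67                 # Moon radius / Earth radius
sunlight_ratio = radius_ratio ** 2      # ~1/13.5 of Earth's intercepted sunlight

moon_vs_sun = fraction_of_sky * sunlight_ratio
print(f"perfectly reflective Moon vs Sun: 1/{1/moon_vs_sun:,.0f}")   # roughly 1/90,000
```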
In real life, the light from a full Moon is about 1/480,000 the brightness of the noon Sun. Given the Moon's albedo is somewhere between 0.1 and 0.2 depending on the angle of incidence, and given the huge simplifications made in the above math, I think this indicates that we're in the right ballpark.