[Physics] Why is there a photon every millimeter on earth coming from a star millions of light years away

electromagnetic-radiation electromagnetism photons

I've been puzzling over this for a long time now; it would be amazing if someone could answer it.

Imagine this asterisk: * . As you can see, lines radiate outward from the center point, just as a star emits rays of light in all directions.

BUT, theoretically, the star should emit a "finite" number of photons, which means that the farther you are from it, say 1 million light years, the less likely your eyes are to catch the photons emitted by the source, right?

So doesn't this undermine the photon picture? We can see a star millions of light years away no matter where we stand on this planet. That means the star must deliver at least one photon per millimeter (or less) at a distance of millions of light years, which would seem to require an "impossible" density of photons emitted at the star's surface, in every direction of the cosmos.

How's this possible?

Best Answer

If you observe a star of radius $R$ from a distance $L$, you see it as a small disk subtending an angle $2R/L$, so the solid angle covered by the star's image scales like $(R/L)^2$. That is the fraction of the retina receiving photons: the solid angle measures the "percentage of directions" from which the star's photons arrive. The number of photons from the star that hit your eyeball also scales like $(R/L)^2$ (the $1/L^2$ dependence is what matters here), because the photons are spread over a sphere of area $4\pi L^2$. Dividing these two expressions, you can easily see that the number of photons per unit area of the retina is actually independent of $L$. The star looks smaller as it gets farther away, but the number of photons per unit time that hit a small area of the retina is $L$-independent.
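The cancellation of the two $1/L^2$ factors can be checked numerically. The sketch below uses illustrative numbers (a solar-sized star, a millimeter-scale pupil); the function name and all parameter values are assumptions for the demonstration, not measured data.

```python
import math

def photons_per_retina_area(L, R=7e8, N_emitted=4e44, pupil_radius=2e-3):
    """Photons per second, per unit solid angle of the star's image on the retina.

    L: distance to the star (m); R: stellar radius (m);
    N_emitted: photons emitted per second; pupil_radius: pupil radius (m).
    All defaults are illustrative assumptions.
    """
    # Photons per second entering the pupil -- diluted over the sphere 4*pi*L^2:
    photons_into_pupil = N_emitted * (math.pi * pupil_radius**2) / (4 * math.pi * L**2)
    # Solid angle of the star's disk, proportional to its image area on the retina:
    image_solid_angle = math.pi * (R / L) ** 2
    # Both quantities scale as 1/L^2, so the ratio is L-independent:
    return photons_into_pupil / image_solid_angle

near = photons_per_retina_area(L=1.5e11)   # 1 AU
far = photons_per_retina_area(L=1.5e13)    # 100 AU
print(near, far)  # the two values agree: surface brightness does not fade with distance
```

This is the same statement as "surface brightness is distance-independent": a star's disk gets smaller with distance, but not dimmer per unit image area.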

If you're really worried that the star doesn't emit enough photons to satisfy your eyes, note that the Sun emits roughly $4\times 10^{44}$ photons each second. The Earth-Sun distance is roughly 150 million km, which is about 15 trillion times the radius of the eyeball. Square it and you still get that the number of eye-sized areas on the surface of a sphere of radius 1 AU is only of order $10^{26}$, leaving of order $10^{18}$ photons for each eyeball per second.
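The order-of-magnitude arithmetic above can be sketched directly; the eyeball radius of ~1 cm is an assumption taken from the text's "15 trillion" ratio.

```python
import math

N_sun = 4e44   # photons emitted by the Sun per second (figure from the text)
L = 1.5e11     # Earth-Sun distance in meters (~150 million km)
r_eye = 1e-2   # eyeball radius in meters (~1 cm, assumed)

# Fraction of the 1-AU sphere covered by one eyeball-sized disk:
fraction = (math.pi * r_eye**2) / (4 * math.pi * L**2)
photons_per_eye = N_sun * fraction
print(f"{photons_per_eye:.1e}")  # a few times 1e17, i.e. of order 1e18 per second
```

The extra factor of 4 from the sphere's area versus the disk's area does not change the order-of-magnitude conclusion.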

So if you accept 100 photons per eyeball per second as enough to see the star, you may still allow the star to be $10^{8}$ times farther away than the Sun. The Sun is 8 light minutes away, and multiplying by 100 million gives roughly 1,500 light years. So with this minimal required number of photons per eyeball (100 per second), you could see stars up to some 1,500 light years away (I can't). Of course, telescopes collect starlight over a much larger area than the eyeball (and they may also patiently collect the photons for a much longer time), so they can see stars much farther than that.
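The distance estimate follows from the inverse-square law: going from $10^{18}$ photons/s down to a threshold of 100 photons/s allows the flux to drop by $10^{16}$, i.e. the distance to grow by $10^{8}$. A sketch of the conversion, using the figures from the text:

```python
photons_at_1_AU = 1e18   # photons per eyeball per second at the Sun's distance
threshold = 100          # assumed minimum photons per second needed to see the star

# Flux falls as 1/L^2, so the allowed distance ratio is the square root:
ratio = (photons_at_1_AU / threshold) ** 0.5   # 1e8

# Scale 1 AU (~8 light minutes) by this ratio and convert to light years:
minutes_per_year = 365.25 * 24 * 60
distance_ly = 8 * ratio / minutes_per_year
print(round(distance_ly))  # roughly 1,500 light years
```

Note that 8 light-minutes times $10^8$ is $8\times 10^8$ light-minutes, and a year has about $5.3\times 10^5$ minutes, which is where the ~1,500 light years comes from.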
