[Physics] Why isn’t the calculation showing that we should be able to see the sun from well beyond the observable universe valid?

electromagnetic-radiation, optics, photons, visible-light

I recently read an interesting article stating that a human can perceive a flash of as few as 5 or so photons, and that the eye itself can respond to even a single photon, although the brain normally filters such a signal out.

I wanted to calculate how far from our sun you would have to stand for not a single one of its photons to hit your pupil in a given second.

The first thing I did was assume that the sun emits $10^{45}$ photons per second, because, well, that's the only number I could find through internet research.

The next step is to assume that the angle between neighboring photons emitted from the sun is essentially the same for every pair, and is equal to $3.6 \times 10^{-43}$ degrees.
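This number appears to come from spreading the $10^{45}$ photons evenly around a full circle: $$\frac{360^\circ}{10^{45}} = 3.6\times10^{-43}\ \text{degrees between neighboring photons.}$$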

The next step is to assume that the average human pupil diameter is 0.005 meters, and then draw a triangle like so:

[Figure: a long, thin triangle with the photon-separation angle at the sun's vertex, the pupil diameter as the opposite side, and a white line through the center representing the distance]

The length of the white line through the center of the triangle is the distance at which two neighboring photons from the sun are exactly as far apart as your pupil is wide; beyond that distance, not even one photon should hit your eye.

I broke the triangle into two pieces and solved for the white line by using the law of sines, and my final result is ridiculous.

$3.97887 \times 10^{41}$ meters is the length of the white line. For reference, that's over $10^{14}$ times the diameter of the observable universe.
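As a sanity check on the arithmetic, here is a minimal sketch using only the numbers above. At angles this small the law of sines reduces to a simple ratio; the quoted figure is reproduced when the pupil's half-width is set against the full separation angle (using the half-angle instead gives twice this distance, but the order of magnitude is unchanged).

```python
import math

N = 1e45                              # assumed photons emitted by the sun per second
angle_rad = math.radians(360.0 / N)   # assumed angle between neighboring photons
pupil_diameter = 0.005                # meters

# Distance at which two rays separated by angle_rad open up to one pupil width,
# taking the pupil's half-width against the full separation angle.
distance = (pupil_diameter / 2) / math.tan(angle_rad)
print(f"{distance:.5e} m")            # ~3.97887e+41 m
```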

My conclusion is that no matter how far you get from the sun within the observable universe, not only should some of its photons be hitting your pupil, but there should be more than enough of them for you to perceive visually.

But if I were right, I would see far more stars, including very distant ones, every night when I looked up at the sky. Why is my calculation inconsistent with what I see?

Best Answer

The problem with your derivation is that you distributed the photons over a 360° circle, so the photons only spread out in a two-dimensional circle. This means that the intensity of light drops off at a rate proportional to $1/r$ instead of $1/r^2$ (where $r$ is the distance from the center of the sun) like it does in a three-dimensional universe.
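To make the contrast explicit (this comparison is implied by the paragraph above): spreading $N$ photons per second around a circle of circumference $2\pi r$ gives a flux per unit length that falls off like $1/r$, whereas spreading them over a sphere of area $4\pi r^2$ gives a flux per unit area that obeys the inverse-square law: $$I_\text{2D} = \frac{N}{2\pi r} \qquad \text{versus} \qquad I_\text{3D} = \frac{N}{4\pi r^2}.$$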

So, starting with $N$ photons emitted per second, the intensity of photons at a distance $r$ from the sun is given by $$I = \frac{N}{4\pi r^2}.$$ This comes from spreading out the photons over the surface of a sphere surrounding the sun.

The number of photons seen by your eye per second is just the intensity multiplied by the area of your pupil: $$n = IA_\text{eye} = \frac{N}{4\pi r^2}A_\text{eye}.$$ You are looking for the distance beyond which you would see less than one photon per second: $$n = \frac{N}{4\pi r^2}A_\text{eye} \lt 1.$$ Solving for $r$ gives $$r > \sqrt\frac{NA_\text{eye}}{4\pi}.$$ Plugging in your numbers gives $$r > \sqrt{\frac{(10^{45})\pi(0.005\,\textrm{m}/2)^2}{4\pi}} = 4\cdot10^{19} \,\textrm{m} \approx 4000\,\textrm{light-years}.$$ This distance is still well within our own galaxy.
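A quick numerical check of this result, using only the numbers given above (the conversion factor of roughly $9.46\times10^{15}$ m per light-year is the only added constant):

```python
import math

N = 1e45                                       # assumed photons emitted per second
pupil_diameter = 0.005                         # meters
A_eye = math.pi * (pupil_diameter / 2) ** 2    # pupil area, ~2e-5 m^2

# n = N * A_eye / (4 * pi * r^2) < 1  =>  r > sqrt(N * A_eye / (4 * pi))
r = math.sqrt(N * A_eye / (4 * math.pi))

METERS_PER_LIGHT_YEAR = 9.46e15
print(f"r > {r:.1e} m (~{r / METERS_PER_LIGHT_YEAR:.0f} light-years)")
# prints roughly 4e19 m, about 4000 light-years
```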