[Physics] point sources vs extended source irradiance and radiometry question

optics, radiometry

How can the irradiance of a point source be proportional to $1/R^2$, where $R$ is the range to the source, while the irradiance of an extended source is independent of range? It doesn't make any sense to me.

Consider two formulas I found online for the irradiance of a point source vs an extended source:

point source: Irradiance $E = d\Psi/dA = I/R^2$, where $I$ is the intensity (W/sr), $d\Psi$ is the power, and $dA$ is an area element.

extended source: Irradiance $E = d\Psi/dA = \frac{\pi}{4} L (d/f)^2$, where $d$ is the diameter of a lens, $f$ its focal length, and $L$ the radiance of the object in W/(m²·sr).

Take an object of spatially uniform radiance and approximate it with a point source. As the object gets closer, the estimate based on the point-source approximation increases quickly, whereas the irradiance calculated from the extended-source formula doesn't change at all. Now, I know the point source is a simplification, but looking at these formulas it sure looks like a bad one! What am I missing? I know it is used a lot, and looking at it now, I'm not sure why.
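To see where the point-source approximation holds and where it breaks down, here is a minimal numerical sketch. It assumes a uniform Lambertian disk of radius `a` and radiance `L` (both hypothetical values), for which the exact on-axis irradiance at range `R` is $E = \pi L \sin^2\theta$ with $\sin^2\theta = a^2/(a^2+R^2)$; the point-source approximation takes $I = L\,\pi a^2$ and $E = I/R^2$:

```python
import math

def disk_irradiance_exact(L, a, R):
    # Exact on-axis irradiance from a uniform Lambertian disk of
    # radius a and radiance L at range R: E = pi * L * sin^2(theta),
    # where sin^2(theta) = a^2 / (a^2 + R^2).
    return math.pi * L * a**2 / (a**2 + R**2)

def disk_irradiance_point_approx(L, a, R):
    # Point-source approximation: intensity I = L * pi * a^2, E = I / R^2.
    I = L * math.pi * a**2
    return I / R**2

L = 100.0   # radiance, W/(m^2 sr) -- hypothetical value
a = 0.5     # disk radius, m      -- hypothetical value

for R in (100.0, 10.0, 1.0, 0.5):
    E_exact = disk_irradiance_exact(L, a, R)
    E_point = disk_irradiance_point_approx(L, a, R)
    print(f"R = {R:6.1f} m   exact = {E_exact:12.6f}   point = {E_point:12.6f}")
```

At `R = 100` m (range much larger than the source) the two agree to several digits; at `R = 0.5` m the point-source estimate is already a factor of two too large, and it diverges as `R → 0` while the exact irradiance saturates at $\pi L$.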

I have a heavy mathematical background but very little physics. So, go ahead and lay it on me with the math in your answer but give context to the physics.

Best Answer

If a source overfills the angular field of view of a telescope or other lens, then it doesn't matter how far away you are from it, as long as it has uniform radiance.

If you move further from the object, you see more of it: the visible area grows in proportion to $R^2$. This growth exactly balances the inverse-square falloff of irradiance with distance.
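The cancellation above can be checked numerically. This is a small-angle sketch with hypothetical values: for a fixed field-of-view half-angle, the visible patch area grows as $R^2$, while the solid angle the lens aperture subtends from the source falls as $1/R^2$, so the collected power is independent of range:

```python
import math

def power_collected(L, R, theta_half, A_lens):
    # Source area visible within a fixed field of view of half-angle
    # theta_half at range R: grows as R^2.
    A_visible = math.pi * (R * math.tan(theta_half))**2
    # Solid angle the lens aperture subtends as seen from the source:
    # falls as 1/R^2.
    omega_lens = A_lens / R**2
    # Collected power: the R^2 factors cancel.
    return L * A_visible * omega_lens

L = 100.0                        # radiance, W/(m^2 sr) -- hypothetical value
theta_half = math.radians(1.0)   # field-of-view half-angle -- hypothetical
A_lens = math.pi * 0.05**2       # 10 cm diameter aperture -- hypothetical

for R in (10.0, 100.0, 1000.0):
    P = power_collected(L, R, theta_half, A_lens)
    print(f"R = {R:7.1f} m   collected power = {P:.6e} W")
```

Every range prints the same power, which is the content of the answer: the extended-source irradiance is range-independent precisely because the $R^2$ in the visible area cancels the $1/R^2$ in the per-patch falloff.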
