Remote Sensing – Associating Panchromatic Image with High Spatial Resolution

digital-image-processing · image · imagery · remote-sensing

How and why is a panchromatic satellite image associated with high spatial resolution? I googled and found that it is a single-band image, but then why is it called panchromatic ("all colors")? Does it mean it covers the entire visible region?

Best Answer

Panchromatic images are created when the imaging sensor is sensitive to a wide range of wavelengths of light, typically spanning a large part of the visible portion of the spectrum. The key point is that every imaging sensor needs a certain minimum amount of light energy before it can detect a difference in brightness. If the sensor is only sensitive (or is only directed) to light from a very specific part of the spectrum, say the blue wavelengths, then there is a limited amount of energy available to it compared with a sensor that samples across a wider range of wavelengths.

To compensate for this limited energy, multispectral sensors (the kind that create red, green, blue, and near-infrared images) will typically sample over a larger spatial extent to collect enough energy to 'fill' the imaging detector. Multispectral bands therefore usually have a coarser spatial resolution than a panchromatic band. In other words, there is a trade-off between spectral resolution (the range of wavelengths sampled by a detector) and spatial resolution. This is why commercial satellites like IKONOS and GeoEye commonly provide three or more relatively coarse-resolution multispectral bands along with a finer-resolution panchromatic band.

Importantly, you can combine the fine spatial resolution of a panchromatic image with the higher spectral resolution of the multispectral bands. This is known as panchromatic sharpening (pan-sharpening), and it is commonly used to work around the spectral/spatial trade-off in satellite imaging.
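To make the pan-sharpening idea concrete, here is a minimal NumPy sketch of one common approach, a Brovey-style ratio transform. It is only an illustration of the general technique, not the method any particular satellite product uses; the function name, the synthetic arrays, and the assumption that the multispectral bands have already been resampled to the panchromatic grid are all mine.

```python
import numpy as np

def brovey_pansharpen(pan, ms, eps=1e-6):
    """Brovey-style pan-sharpening sketch.

    pan : 2-D array, high-resolution panchromatic band.
    ms  : 3-D array (bands, rows, cols) of multispectral bands already
          resampled to the panchromatic grid (e.g. by bilinear resampling).
    Returns a sharpened multispectral stack at panchromatic resolution.
    """
    pan = pan.astype(np.float64)
    ms = ms.astype(np.float64)
    # Estimate intensity from the multispectral bands.
    intensity = ms.mean(axis=0)
    # Scale every band by the ratio of pan to intensity: the spatial detail
    # of the pan band is injected while the relative band ratios (the
    # spectral information) are approximately preserved.
    ratio = pan / (intensity + eps)
    return ms * ratio

if __name__ == "__main__":
    # Synthetic example: a 4-band multispectral cube upsampled to a
    # 400 x 400 panchromatic grid.
    rng = np.random.default_rng(0)
    pan = rng.random((400, 400))
    ms = rng.random((4, 400, 400))
    print(brovey_pansharpen(pan, ms).shape)  # (4, 400, 400)
```

Many other pan-sharpening methods exist (IHS substitution, PCA-based, wavelet-based); the ratio approach above is just one of the simplest to show the spatial/spectral combination.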

Incidentally, this is also why multispectral bands taken at longer wavelengths, e.g. short-wave infrared (SWIR), tend to be sampled over much wider wavelength ranges than the visible bands. The amount of reflected and emitted electromagnetic energy available is not evenly distributed across the spectrum, and solar emission peaks in the visible part. Once you get into the short-wave infrared there is far less energy to sample than at the shorter visible wavelengths, so the detectors have to be sensitive to a wider range. If you look at Landsat 8, for example, the SWIR2 band (band 7) actually samples a wider range of wavelengths than its panchromatic band.
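As a quick back-of-the-envelope check of that bandwidth comparison, the snippet below prints approximate band widths for a few Landsat 8 OLI bands. The wavelength bounds are the commonly published USGS figures quoted from memory, so treat them as illustrative rather than authoritative.

```python
# Approximate Landsat 8 OLI band bounds in micrometres (illustrative values).
bands_um = {
    "blue (band 2)":  (0.452, 0.512),
    "red (band 4)":   (0.636, 0.673),
    "pan (band 8)":   (0.503, 0.676),
    "SWIR2 (band 7)": (2.107, 2.294),
}

for name, (lo, hi) in bands_um.items():
    print(f"{name:16s} width ~ {hi - lo:.3f} um")
# The visible bands are only a few hundredths of a micrometre wide, while
# the panchromatic and SWIR2 bands each span roughly 0.17-0.19 um.
```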
