[Physics] What do the colors in false color images represent

astronomy, astrophotography, visible-light

Every kid who first looks into a telescope is shocked to see that everything's black and white. The pretty colors, like those in this picture of the Sleeping Beauty Galaxy (M64), are missing:

[Image: the Sleeping Beauty Galaxy, M64]

The person running the telescope will explain that the colors in pictures like that aren't real. They're called "false color images", and the colors usually represent light outside the visible portion of the electromagnetic spectrum.

Often you see images where red is used for infrared light and purple for ultraviolet. Is this also the convention in false color astronomy images? What colors are used for other parts of the spectrum? Is there a standard, or does it vary with the telescope that took the image or some other factor?

Best Answer

Part of why you don't see colors in astronomical objects through a telescope is that your eye isn't sensitive to colors when what you are looking at is faint. Your eyes have two types of photoreceptors: rods and cones. Cones detect color, but rods are more sensitive. So, when seeing something faint, you mostly use your rods, and you don't get much color. Try looking at a color photograph in a dimly lit room.

As Geoff Gaherty points out, if the objects were much brighter, you would indeed see them in color.

However, they still wouldn't necessarily be the same colors you see in the images, because most images are indeed false color. What the false color means really depends on the data in question. The wavelengths an image represents depend on what filter (if any) was used when the image was taken and on the sensitivity of the detector (e.g. a CCD). So different images of the same object may look very different. For example, compare this image of the Lagoon Nebula (M8) to this one.

Few astronomers use filter sets designed to match the human eye; filter sets are more commonly chosen for scientific reasons. Even the general-purpose sets in common use do not match it: compare the transmission curves of the Johnson-Cousins UBVRI filters and the SDSS filters to the sensitivity curves of human cone cells. So a set of images of an object from a given telescope may cover several wavelength bands, but those bands will probably not correspond exactly to what the eye sees as red, green, and blue. Still, the easiest way for humans to visualise such data is to map the images onto the red, green, and blue channels of a display image, basically pretending that they do, as sketched below.
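For concreteness, here is a minimal sketch of that channel mapping in Python. The function name, the placeholder arrays, and the percentile stretch are illustrative choices of mine, not a standard pipeline; the only point is that any three filter images can be assigned to the R, G, and B channels regardless of the physical wavelengths they cover.

```python
import numpy as np

def simple_rgb(img_r, img_g, img_b):
    """Assign three filter images to the R, G, B channels of a display image."""
    channels = []
    for img in (img_r, img_g, img_b):
        # Stretch each filter image independently to the 0-1 range,
        # clipping the faintest and brightest 1% of pixels.
        lo, hi = np.percentile(img, [1, 99])
        channels.append(np.clip((img - lo) / (hi - lo), 0.0, 1.0))
    # Stack into an (ny, nx, 3) array that plotting libraries treat as RGB.
    return np.dstack(channels)

# The channel assignment is purely conventional, e.g. an infrared-filter
# image to red, a visible-band image to green, an ultraviolet image to blue:
# rgb = simple_rgb(ir_image, visible_image, uv_image)
```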

In addition to simply mapping images through different filters to the RGB channels of an image, more complex approaches are sometimes used. See, for example, this paper (2004PASP..116..133L).
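That paper (Lupton et al. 2004) describes an asinh stretch that keeps faint structure visible while preventing bright regions from saturating to white. astropy.visualization ships an implementation of this scheme as make_lupton_rgb; the sketch below uses random arrays purely as stand-ins for three aligned filter images.

```python
import numpy as np
from astropy.visualization import make_lupton_rgb

# Placeholder data: three aligned "filter images" drawn at random.
rng = np.random.default_rng(0)
r_img = rng.exponential(1.0, size=(256, 256))
g_img = rng.exponential(1.0, size=(256, 256))
b_img = rng.exponential(1.0, size=(256, 256))

# The asinh scaling is roughly linear for faint pixels and logarithmic for
# bright ones; Q and stretch control where that transition happens.
rgb = make_lupton_rgb(r_img, g_img, b_img, Q=8, stretch=0.5)
# rgb is a (256, 256, 3) uint8 array ready for display or saving.
```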

So, ultimately, what the colors in a false color image actually mean depends both on what data happened to be used to make the image and on the mapping method preferred by whoever constructed it.
