Telescopes capture images in black and white. How do these monochrome images transform into the vividly colored pictures we see, and how true are these colors to reality?

Images captured by telescopes, including the most famous and sophisticated among them, are always obtained in black and white. The captivating, vividly colored images published online have all undergone a process of colorization. However, this does not mean that their colors aren't real.

Telescopes capture images using light sensors known as charge-coupled devices (CCDs). These sensors accurately measure the intensity of incoming light by counting the photons - the individual particles of light - that strike them.

The fundamental principles of photography with a telescope are essentially no different from photography with any digital camera. During the photographic process, the sensor is exposed to light for a certain period of time, after which it generates a picture as a grid of pixels (cells), the brightness of each determined by the number of photons it absorbed. The main difference between digital cameras, like the ones in cell phones, and telescope cameras is the exposure time. Cell phones usually expose their light sensors for less than a second, with the longest exposures lasting a few seconds at most. In contrast, a telescope can observe a single point in the sky for many hours, or even many days.
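To get a feel for why exposure time matters so much, here is a minimal sketch assuming a simple Poisson model of photon arrival; the photon rate and exposure times are made-up illustrative numbers, not values from any real instrument:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Illustrative (made-up) photon arrival rate from a faint source,
# in photons per second per pixel.
rate = 0.5

for exposure in (1, 60, 3600):              # 1 second, 1 minute, 1 hour
    expected = rate * exposure              # mean photon count over the exposure
    counts = rng.poisson(expected, 10_000)  # many simulated pixel readings
    snr = counts.mean() / counts.std()      # signal-to-noise ratio
    print(f"{exposure:>5} s: mean count = {counts.mean():8.1f}, SNR ~ {snr:.1f}")

# The SNR grows roughly as the square root of the exposure time, which is
# why telescopes stare at the same patch of sky for hours or even days.
```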

 

So, why can the sensors in cell phones capture color while telescopes can't? The answer is that neither of them can. The difference between one color and another is determined by the wavelength of the light, that is, the distance between two successive peaks of the wave, and a CCD cannot distinguish between a red photon and a blue one, for example. To create color images in digital cameras, three adjacent sensor cells are used for each point in the image, each equipped with a filter that allows only a specific color to pass through it. Filters for red, green and blue (RGB) colors - the primary colors of light - are usually used. By doing so, the amount of red, green or blue light reaching each pixel in the image can be determined, and the final image is constructed by combining these three primary colors.
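As a toy illustration of this combination (not any camera's actual processing pipeline), here is a short Python sketch, with made-up pixel values, that stacks three filtered monochrome readings into one RGB image:

```python
import numpy as np

# Made-up 2x2 monochrome readings through red, green and blue filters,
# each value a normalized light intensity between 0 and 1.
red   = np.array([[0.9, 0.1], [0.2, 0.0]])
green = np.array([[0.8, 0.1], [0.7, 0.0]])
blue  = np.array([[0.1, 0.9], [0.2, 0.0]])

# Stacking the three monochrome channels along a third axis produces a
# standard height x width x 3 RGB image: the top-left pixel (0.9, 0.8, 0.1)
# renders as yellow, its neighbor (0.1, 0.1, 0.9) as blue, and so on.
rgb = np.dstack([red, green, blue])
print(rgb.shape)  # (2, 2, 3)
```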


Photographing the “Pillars of Creation” in the Eagle Nebula with three different filters | Photo: NASA, Hubble Space Telescope

 

However, this method has one notable drawback: filtering white light, which contains all colors, down to light of a single color necessarily means losing photons, and therefore losing information. If a blue photon reaches a pixel equipped with a red filter, it does not pass through the filter and goes undetected. In simple home cameras this isn't very significant, since they typically operate in well-lit environments abundant with photons. Telescopes, however, have no such luxury. Astrophysicists and astronomers often like to say that “every photon is precious”. A telescope spends hours collecting light from distant stars and faraway galaxies, and astronomers can't afford to waste a single photon. Using three filters on the same pixel significantly reduces the telescope's sensitivity to faint sources of light.
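The cost of filtering can be made concrete with a small simulation; the numbers below are invented for illustration, assuming roughly a third of the incoming photons fall in each color band:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Invented example: a faint source delivers on average 300 photons to a
# pixel over a long exposure, spread roughly evenly across the colors.
total_photons = 300

unfiltered = rng.poisson(total_photons, 10_000)      # no filter: every photon counts
filtered   = rng.poisson(total_photons / 3, 10_000)  # one RGB filter: ~2/3 rejected

print("SNR without a filter:", round(unfiltered.mean() / unfiltered.std(), 1))
print("SNR with one filter: ", round(filtered.mean() / filtered.std(), 1))
# Each filter discards about two thirds of the light, cutting the
# signal-to-noise ratio by roughly a factor of sqrt(3).
```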

Reconstructed Colors

So how do telescopes still produce color images? When astronomers want to create color photos, they capture separate exposures through filters for the desired colors and then combine the resulting images. The vibrant images of nebulae and other distant cosmic structures are all the result of combining several black-and-white photos into one color image. For instance, the Hubble Space Telescope can capture images through red, green and blue filters and transmit them to Earth. These individual images are then combined during post-processing on the ground to create one spectacular composite image.
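In practice, such composites are often assembled from FITS files, the standard format for astronomical images. A minimal sketch of the idea using astropy and numpy might look like the following; the file names are hypothetical placeholders, and real Hubble processing involves far more careful calibration and alignment:

```python
import numpy as np
from astropy.io import fits

def normalize(image):
    """Scale a raw exposure to the [0, 1] range for display."""
    image = image.astype(float)
    return (image - image.min()) / (image.max() - image.min())

# Hypothetical file names: three black-and-white exposures of the same
# field, taken through red, green and blue filters.
filenames = ["field_red.fits", "field_green.fits", "field_blue.fits"]
channels = [normalize(fits.getdata(name)) for name in filenames]

# Assign each filtered exposure to the matching channel of one color image.
rgb = np.dstack(channels)
```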

 

The planet Saturn, captured in representative colors | Photo: NASA, Hubble Space Telescope

Are these the same colors we would see with our own eyes, if the human eye could achieve the same level of magnification and exposure as a telescope? The answer is complex. Some pictures are indeed displayed in colors as close to natural as possible, but for others this is either impossible or would yield a far less captivating picture.

The reason is that conventional cameras capture images only in visible light, since they are meant to reproduce what our eyes see. With telescopes, in contrast, scientists aim to capture the broadest possible range of electromagnetic radiation, of which visible light constitutes only a small part. The Hubble Space Telescope, for example, detects radiation from the infrared to the ultraviolet, and is accordingly equipped with filters for both ultraviolet and infrared radiation.

The challenge is that it is impossible to display an image in ultraviolet or infrared "color", since the human eye cannot perceive these wavelengths. Therefore, when space agencies wish to display images captured at wavelengths beyond the eye's detection range, they color them in representative colors. Some of the most famous space images are rendered in representative colors, in which radiation we are naturally incapable of seeing is represented by colors that we can see.
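As a schematic example of representative coloring (with random stand-in data, not a mapping used by any real mission), invisible wavelengths can simply be assigned to visible channels, preserving their order from longest wavelength to shortest:

```python
import numpy as np

rng = np.random.default_rng(seed=2)

# Stand-in monochrome exposures of the same field taken through an
# infrared, a visible-light and an ultraviolet filter (random demo data).
infrared    = rng.random((4, 4))
visible     = rng.random((4, 4))
ultraviolet = rng.random((4, 4))

# Representative colors: wavelengths the eye cannot see are assigned to
# colors it can, keeping their order (longest wavelength -> red channel).
rgb = np.dstack([infrared, visible, ultraviolet])
```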

 

A photograph in natural colors: the spiral galaxy ESO 510-G13, situated at a distance of approximately 150 million light-years from Earth | Photo: NASA, Hubble Space Telescope
 

Telescopes often capture images at wavelengths that are visible to our eyes, yet combining them in a “true-to-original” manner would result in a flat, reddish or yellowish photo. In such cases, it is common practice to artificially exaggerate the differences between the captured wavelengths, creating a photo that still represents the actual wavelengths recorded, though not as the human eye would perceive them if displayed naturally.
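One common way to exaggerate such subtle differences is to stretch each filtered exposure over the full brightness range before combining the channels. Here is a toy sketch with invented values, not data from any actual observation:

```python
import numpy as np

def stretch(channel):
    """Rescale one exposure to span the full [0, 1] range."""
    channel = channel.astype(float)
    return (channel - channel.min()) / (channel.max() - channel.min())

# Invented narrowband exposures whose wavelengths are all close together,
# so a faithful composite would look almost uniformly reddish.
a = np.array([[0.80, 0.86], [0.82, 0.88]])
b = np.array([[0.78, 0.80], [0.84, 0.79]])
c = np.array([[0.75, 0.76], [0.81, 0.77]])

# Stretching each exposure independently and assigning the results to the
# red, green and blue channels turns tiny differences into vivid colors.
rgb = np.dstack([stretch(a), stretch(b), stretch(c)])
```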

To summarize, color photos of space are the result of combining several black-and-white photos taken with different filters. Some photos display natural colors, others present natural colors that have been artificially enhanced, and still others use representative colors intended to let us see types of radiation that we would otherwise be unable to perceive.

Roi Rahin is currently a postdoctoral fellow at the NASA Goddard Space Flight Center, U.S. At the time of writing, he was a PhD student in astrophysics at the Technion - Israel Institute of Technology, and held a scholarship from the Ramon Foundation.