Hey Science Girl, any comments on this story? Sure, my comment is: don't believe it - it was retracted, chiefly because the rocks in the photos are actually on the inside of a crater on a 20-30 degree slope. Scientists can make mistakes. But it does bring up a good opportunity to talk about false-color images!
Many of the rover pictures, and other planetary photos, appear in "false color." We're used to looking with our eyes and seeing all the colors at once. But you've done the experiment where you pass light through a prism and make a rainbow, right? That shows you that the light we see, whether in a source like the sun or reflected off objects, is made of light of many wavelengths. Something that looks green to your eye is reflecting a lot of 510-nanometer light back to you (green), and not so much 650-nanometer light (red). The MER camera - Pancam - only sees one color at a time. Basically, it has filters that only let a specific wavelength of light through, so it takes a grayscale photo showing the intensity of the scene in one color - or filter - at a time. The Pancam has 13 different filters, which means it can see the scene in 13 different wavelengths. Some of the wavelengths are familiar to us, like blue and red, and others are beyond human vision, like infrared.
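If you like to tinker, here's a tiny Python sketch of what "one grayscale frame per wavelength" means. The wavelengths echo the ones above, but the reflectance numbers are made up for illustration and the frames are toy arrays, not real Pancam data.

import numpy as np

# Toy sketch: each filter produces one grayscale frame -- just the brightness
# of the scene at that single wavelength. A "green" patch reflects a lot of
# ~510 nm light and little ~650 nm light, so it shows up bright in the green
# frame and dark in the red one. (Reflectance values are hypothetical.)
wavelengths_nm = [432, 510, 650]        # blue, green, red (illustrative)
reflectance    = [0.25, 0.85, 0.15]     # made-up values for a green rock

frames = {wl: np.full((4, 4), refl) for wl, refl in zip(wavelengths_nm, reflectance)}

for wl in wavelengths_nm:
    print(f"{wl} nm frame: brightness {frames[wl].mean():.2f}")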
When we humans get the data, each photo is a grayscale image taken at a different wavelength. If we had an infinite number of filters covering the whole visible spectrum, we could combine them into exactly what our eyes would see. Instead, we have to be a little more clever and combine a few filters to approximate what our eyes would see. These are called "approximate true color" or "true color" images. When you look at Mars in true color, you see that it's pretty much red. The rocks are red, the soil is red, the dust is red, even the blueberries are red. It's pretty hard to make out differences in true color. So we get tricky and make "false-color" images, where we combine filters in ways the eye would never see in order to bring out differences among rocks, soils, and features. For example, many geologic features are distinct in the Pancam filters L2, L5, and L7, which correspond to wavelengths of 753, 535, and 432 nm, or near-infrared, greenish-yellow, and indigo. These get combined so that L2 represents red, L5 represents green, and L7 represents blue, meaning that much of the visible red light that dominates all of Mars is missing from this representation and the blues become more prominent.
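If you want to see how that channel assignment works, here's a minimal Python sketch of an L257-style composite. The three frames here are random stand-ins, not real calibrated Pancam images.

import numpy as np

# Stand-in grayscale frames (values 0-1) for three Pancam filters.
# In practice these would be calibrated images taken through
# L2 (753 nm), L5 (535 nm), and L7 (432 nm).
rng = np.random.default_rng(42)
frame_L2 = rng.random((4, 4))   # 753 nm, near-infrared
frame_L5 = rng.random((4, 4))   # 535 nm, greenish-yellow
frame_L7 = rng.random((4, 4))   # 432 nm, indigo

# False-color composite: L2 -> red channel, L5 -> green, L7 -> blue.
# No true red-wavelength frame goes in, so Mars's overall redness drops out
# and subtle differences between materials show up as color contrasts.
false_color = np.dstack([frame_L2, frame_L5, frame_L7])

print(false_color.shape)   # (4, 4, 3): an ordinary RGB image, ready to display

The only "trick" is deciding which filter frame goes into which display channel; once that's done, the result is an ordinary RGB image you can view like any other picture.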
Now let's look at the photo in the article again, knowing that it's in false color (an L257 composite, actually). The blue in this photo just means the material reflects more 432 nm light than 535 or 753 nm light. We can't tell from this image what color our eyes would actually see. In fact, when we look at the true-color images of the area this picture comes from (Burns Cliff), we can see that the "blue" stuff in the cracks reflects a LOT of red light, appearing reddish brown when we collect images with the red filter. We can also see the slope of the area, which would make it impossible for water to pool.
False-color images are really useful for a couple of reasons. They let us take less data and still have a reasonable sampling of the target at wavelengths that span our available filters. They let us discriminate more readily among the reddish rocks of Mars, including the famous blueberries, which are actually grayish-red hematite. And, when combined in less intuitive ways, they can make some really spectacular and colorful views of another planet.
Tuesday, June 12, 2007
1 comment:
Hey Science Girl,
Thanks, I didn't believe it for a minute!