This is the most incredible picture I have ever seen. In May, the best picture of Pluto was a couple of blown-up pixels. Now, it looks like a render from a Pixar movie.
This shows a time slider, where you can see the different images, along with the time and distance from Pluto. They have been pretty quick to update it with new images too, so I look forward to the final image(s?) being added.
The "sepia" is actually true color. New Horizon's high resolution camera takes black and white photos, but NASA sometimes releases them supplemented with color information from other sensors.
Argh, I was just preparing to write a rant about this. No, it is absolutely not true-color -- although this is something which NASA insists on claiming, to its detriment. It annoyed me when they used the same technique to show a "true color" view of the surface of Titan[1], and it's annoying me again now.
What this image shows is the average color of Pluto. The color information comes from a much lower-resolution image, which was unable to resolve color differences between regions. In truth, Pluto could be wildly multi-colored -- but these kinds of images, tagged with the "true color" label, lead you to think that Pluto (and Titan) are a boring uniform sepia.
To understand just how wrong it is to label these images as "true color", try the following experiment (a rough code sketch follows below):
1: Find a nice colorful picture of, say, a rose garden.
2: On a satellite image, find the color-value of that rose garden at, say, a spatial resolution of 50km per pixel.
3: Take the color-value of that pixel -- which, on Earth, will range from a muddy greyish-brown to a muddy greyish-blue -- and apply that hue to the entire image. You will now have a very sorry-looking rose garden.
4: Release the image accompanied by a press release claiming that this is a "true-color" view of said rose-garden.
See the problem with this?
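If you want to actually run the experiment, here's roughly what steps 2-4 amount to -- a toy sketch assuming Pillow and NumPy, with placeholder file paths and a function name I made up, not anything resembling NASA's real pipeline:

    from PIL import Image
    import numpy as np

    # Toy version of steps 2-4: compute the single average color of a
    # low-resolution color image and tint a high-resolution greyscale
    # image with that one hue. File paths are placeholders.
    def average_hue_colorize(color_lowres_path, grey_highres_path):
        avg = np.asarray(Image.open(color_lowres_path).convert("RGB"),
                         dtype=float).mean(axis=(0, 1))        # one RGB triple
        grey = np.asarray(Image.open(grey_highres_path).convert("L"),
                          dtype=float) / 255.0                  # brightness 0..1
        # every pixel gets the same hue, scaled only by its brightness
        tinted = (grey[..., None] * avg).clip(0, 255).astype(np.uint8)
        return Image.fromarray(tinted)

Run that on your rose garden and you get exactly the kind of uniformly tinted result I'm complaining about.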
Honestly, the folks at NASA are geniuses, but they really need to stop doing this kind of thing. One thing they don't understand very well is human perception and psychology. If you show people a greyscale image, they will assume that it represents something which is actually more colorful, and their imagination can take it from there. On the other hand, if you show people an image which is colorised with a uniform color, they will assume that is the actual color -- especially if you've then gone on to tell them that it's a true-color image. In truth, it's no more "true-color" than the final image in the rose-garden experiment above. This, unfortunately, convinces people that they're looking at something far more drab than it actually is.
Tl;dr: Where NASA lacks actual color imagery, they really shouldn't colorise pictures with a uniform hue and then call them "true color". It's bordering on false to do so, and it results in images which are much more drab than a simple black-and-white image would be, and possibly much more drab than reality.
I think there's a case to be made even for "average hue applied to the whole image": it's very easy to look at a black and white planet photo and assume that the whole thing is made of grey rock (since our own familiar Moon is pretty close to exactly that). I made that mistake myself for quite a long time. So using the average hue is one step better than that, or at least no worse.
But my impression has been that they often do something more sophisticated: they have a high resolution greyscale image and a lower resolution color image (maybe just a few pixels, or maybe as much as half or a quarter of the greyscale resolution). Colorizing the higher resolution image using the lower one in that case seems entirely acceptable. (In fact, doesn't JPEG generally do something very similar for color images almost all the time?)
Do I have that wrong? Or is it just that NASA isn't good about reporting the relative resolution of the color component of their "true color" images? (That would be really nice, come to think of it: "This image of Enceladus has detail resolution 150x150 and color resolution 50x50.")
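To make "colorizing the higher resolution image using the lower one" concrete, here's a rough sketch of the same trick JPEG leans on: keep the sharp image as luminance and borrow only the chroma from an upsampled low-resolution color image. This assumes Pillow and NumPy, the names are mine, and the actual NASA processing is surely far more careful:

    from PIL import Image
    import numpy as np

    # grey_hi: high-resolution greyscale PIL Image
    # color_lo: low-resolution color PIL Image of the same scene
    def colorize_with_lowres_chroma(grey_hi, color_lo):
        w, h = grey_hi.size
        # upsample the color image to full size and keep only its chroma
        ycbcr_lo = np.asarray(color_lo.convert("YCbCr").resize((w, h), Image.BILINEAR))
        y = np.asarray(grey_hi.convert("L"))       # keep the sharp luminance
        merged = np.dstack([y, ycbcr_lo[..., 1], ycbcr_lo[..., 2]])
        return Image.fromarray(merged, mode="YCbCr").convert("RGB")

If the color image has at least some spatial resolution, the result preserves real color differences between regions rather than a single flat tint.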
Agreed. A few years back I spent many hours with a telescope trying to find nebulae in the sky.
Turns out nebulae look like grey clouds, not like psychedelic posters.
I know this stuff makes for some really sexy press, but it doesn't do much to help the general public, who have an interest in science but not much time to figure out what's real and what's not.
By that standard, nothing you literally see is "true color". Subsampled chroma is a property of all "color" information because that's the way your eyes actually work. This is a hobgoblin's argument. Just complain that their chroma sampling is too coarse and put the pitchforks away, please.
The problem is that people reasonably assume (and NASA's wording suggests) that the chroma sampling is as high resolution as the greyscale sampling. Hence, it's misleading.
Actually, I thought the black and white were the real colours until somebody pointed it out.
I really don't see the damage in releasing a sepia version of the picture. Why should I care that it's inaccurate? It's a postcard from the other end of the solar system.
The genius which goes into getting these images is astonishing, and I'm not meaning to disparage that at all! The damaging part is when they colorize an image and then call it "true color", when this is simply untrue.
And here's a "true-color" image of Earth, produced using the New Horizons / Huygens colorization technique of combining a greyscale image with very low-resolution color data: http://imgur.com/IOqOTyw.jpg
What's wrong with this? A.) It flat-out isn't "true-color", and NASA shouldn't be spreading inaccurate information, and B.) It's arguably a more boring image than the greyscale original (http://imgur.com/hwYA6VK.jpg), which leaves one better able to imagine possible color schemes rather than fixating on a single, inaccurate, and very boring color scheme.
Thanks, you make a good distinction, and the pics really drive home your point. I guess the problem is the label 'true color', which has stuck for a long time as a way to refer to statistically determined hues. Maybe we should label actual true colors as 'real color' in order to make the distinction. I suppose 'true color' started out as a way to distinguish these images from artistically rendered/painted ones.
Well by that standard have we ever seen anything in true color? Could you not make the same argument for our own eyes based on the differences in color down at the atomic level?
Yes, you are. NASA is not picking one color here and applying a filter to the entire image, as you seem to be implying. They are doing their best with the data available from multiple instruments to determine what the color of each pixel should be. It is a lot more intricate a process than you make out.
There are some bad color filter stories from NASA (such as making Mars surface photos look more like an Earth desert with blue skies than you would actually think if you were there). But this is not a case of that.
Astro-noob here. Why can't we get better pictures from our telescopes, given that we can take what seem to be high-quality pictures of things much further away? Is it because of how "small" Pluto is, versus, say, a large galaxy or nebula?
It is exactly that: Pluto is a very tiny astronomical object. Galaxies, nebulae, and such are much further away than Pluto, but that is more than made up for by how mind-bogglingly large they are. For example, the Andromeda Galaxy is about six full moons wide as viewed from Earth[1].
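A quick back-of-the-envelope with the small-angle approximation (rough round numbers, nothing precise):

    import math

    ARCSEC_PER_RAD = 180 / math.pi * 3600   # ~206,265 arcseconds per radian

    def angular_size_arcsec(diameter_km, distance_km):
        # small-angle approximation: apparent size = physical size / distance
        return diameter_km / distance_km * ARCSEC_PER_RAD

    print(angular_size_arcsec(2_377, 5.0e9))    # Pluto: ~0.1 arcsec from Earth
    print(angular_size_arcsec(3_474, 3.84e5))   # the Moon: ~1,900 arcsec (~0.5 deg)

Hubble resolves roughly 0.05 arcsec, so Pluto spans only a couple of resolution elements -- hence those famous few-pixel images -- while the Andromeda Galaxy stretches across a few degrees of sky despite being vastly farther away.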
It's kind of a shame humanity evolved in one of the spiral arms of the Milky Way.
If we'd evolved in one of the Magellanic Clouds, we'd see an astounding squashed spiral view of this galaxy with direct line of sight of the bright nucleus.
Meanwhile if there are life forms in Magellanic Clouds they are probably looking at the spiral arms thinking that it is a shame that they did not evolve in a galaxy proper...
I'm still amazed we knew about its existence at all before this; the best picture we had in May was just 9 pixels wide. There's more visual information in Google's favicon.
Those represent our previous best information about the surface of Pluto, but they're not individual photos -- they're composite reconstructions from a number of different images taken over a long period of time, as the planet rotated.
> The Hubble images are a few pixels wide. But through a technique called dithering, multiple, slightly offset pictures can be combined through computer-image processing to synthesize a higher-resolution view than could be seen in a single exposure. "This has taken four years and 20 computers operating continuously and simultaneously to accomplish," says Buie, who developed special algorithms to sharpen the Hubble data.
https://twitter.com/tothur/status/620601134651166720/photo/1
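The general idea behind that kind of dithering is "shift and add": each slightly offset exposure samples the scene at a different sub-pixel position, so depositing the frames onto a finer grid at their known offsets and averaging recovers detail no single frame contains. A toy NumPy sketch of that idea, with made-up names -- not Buie's actual algorithm, which is far more sophisticated:

    import numpy as np

    # frames: list of 2D arrays (the low-res exposures)
    # offsets: list of (dy, dx) sub-pixel shifts, assumed to lie in [0, 1) low-res pixels
    # scale: how much finer the output grid is
    def shift_and_add(frames, offsets, scale=4):
        h, w = frames[0].shape
        acc = np.zeros((h * scale, w * scale))
        hits = np.zeros_like(acc)
        for frame, (dy, dx) in zip(frames, offsets):
            # where this frame's samples land on the fine grid
            ys = np.arange(h) * scale + int(dy * scale)
            xs = np.arange(w) * scale + int(dx * scale)
            acc[np.ix_(ys, xs)] += frame
            hits[np.ix_(ys, xs)] += 1
        # average wherever at least one exposure contributed
        return acc / np.maximum(hits, 1)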