
The "sepia" is actually true color. New Horizons' high-resolution camera takes black-and-white photos, but NASA sometimes releases them supplemented with color information from other sensors.


Argh, I was just preparing to write a rant about this. No, it is absolutely not true-color -- although this is something which NASA insists on claiming, to its detriment. It annoyed me when they used the same technique to show a "true color" view of the surface of Titan[1], and it's annoying me again now.

What this image shows is the average color of Pluto. The colour information comes from a much lower-resolution image, which was unable to resolve color differences between regions. In truth, Pluto could be wildly multi-colored -- but these kinds of images, tagged with the "true color" label, lead you to think that Pluto (and Titan) are a boring uniform sepia.

To understand just how wrong it is to label these images as "true color", try the following experiment:

1: Find a nice colorful picture of, say, a rose garden.

2: On a satellite image, find the color-value of that rose garden at, say, a spatial resolution of 50km per pixel.

3: Take the color-value of that pixel -- which, on Earth, will range from a muddy greyish-brown to a muddy greyish-blue -- and apply that hue to the entire image. You will now have a very sorry-looking rose garden.

4: Release the image accompanied by a press release claiming that this is a "true-color" view of said rose-garden.

See the problem with this?
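The experiment above can be sketched in a few lines of numpy (a toy example, not NASA's actual pipeline): average a color image down to a single RGB value, then tint the greyscale version with that one hue.

```python
import numpy as np

def colorize_with_average_hue(gray, color_lowres):
    """Tint a greyscale image with the single average color of a
    low-resolution color image -- the technique objected to above.
    gray: (H, W) floats in [0, 1]; color_lowres: (h, w, 3) floats."""
    avg = color_lowres.reshape(-1, 3).mean(axis=0)  # one RGB value for the whole scene
    # Every pixel gets the same chroma, scaled only by local brightness.
    return gray[..., None] * avg[None, None, :]

# A toy "rose garden": red flowers on a green background.
garden = np.zeros((4, 4, 3))
garden[..., 1] = 0.5                  # green background
garden[::2, ::2] = [0.9, 0.1, 0.1]    # red roses
gray = garden.mean(axis=2)            # the high-resolution greyscale view

tinted = colorize_with_average_hue(gray, garden)
# The red/green contrast is gone: every pixel now shares one muddy hue.
```

Running this, the red-versus-green structure of the garden survives only as brightness variation; the hue is uniform everywhere, which is the complaint.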

Honestly, the folks at NASA are geniuses, but they really need to stop doing this kind of thing. One thing they don't understand very well is human perception and psychology. If you show people a greyscale image, they will assume that it represents something which is actually more colorful, and their imagination can take it from there. On the other hand, if you show people an image which is colorised with a uniform color, they will assume that is the actual color -- especially if you've then gone on to tell them that it's a true-color image. In truth, it's no more "true-color" than the final image in the rose-garden experiment above. This, unfortunately, convinces people that they're looking at something far more drab than they actually are.

Tl;dr: Where NASA lacks actual color imagery, they really shouldn't colorise pictures with a uniform hue and then call them "true color". It's bordering on false to do so, and results in images which are much more drab than a simple black-and-white image would be, and possibly much more drab than reality.

https://en.wikipedia.org/wiki/Titan_(moon)#/media/File:Huyge...


I think there's a case to be made even for "average hue applied to the whole image": it's very easy to look at a black and white planet photo and assume that the whole thing is made of grey rock (since our own familiar Moon is pretty close to exactly that). I made that mistake myself for quite a long time. So using the average hue is one step better than that, or at least no worse.

But my impression has been that they often do something more sophisticated: they have a high resolution greyscale image and a lower resolution color image (maybe just a few pixels, or maybe as much as half or a quarter of the greyscale resolution). Colorizing the higher resolution image using the lower one in that case seems entirely acceptable. (In fact, doesn't JPEG generally do something very similar for color images almost all the time?)

Do I have that wrong? Or is it just that NASA isn't good about reporting the relative resolution of the color component of their "true color" images? (That would be really nice, come to think of it: "This image of Enceladus has detail resolution 150x150 and color resolution 50x50.")
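That more sophisticated approach -- full-resolution luma recolored from lower-resolution chroma, much as JPEG's 4:2:0 subsampling stores color at reduced resolution -- can be sketched as follows. This is a minimal illustration assuming numpy, with nearest-neighbor upsampling standing in for whatever interpolation a real pipeline would use:

```python
import numpy as np

def colorize_from_lowres(gray, color_lowres):
    """Colorize a full-resolution greyscale image from a lower-resolution
    color image: upsample the color by nearest neighbor, normalize it to
    unit brightness, then re-light it with the high-resolution luma."""
    H, W = gray.shape
    h, w, _ = color_lowres.shape
    rows = np.arange(H) * h // H  # nearest-neighbor row indices
    cols = np.arange(W) * w // W  # nearest-neighbor column indices
    chroma = color_lowres[rows[:, None], cols[None, :]]
    luma = chroma.mean(axis=2, keepdims=True)
    hue = np.divide(chroma, luma, out=np.ones_like(chroma), where=luma > 0)
    return np.clip(gray[..., None] * hue, 0.0, 1.0)

# An 8x8 greyscale image colorized from a 2x2 color image: each quadrant
# of the output keeps its own hue, unlike a single uniform tint.
gray = np.full((8, 8), 0.5)
color = np.array([[[1, 0, 0], [0, 1, 0]],
                  [[0, 0, 1], [1, 1, 0]]], dtype=float)
out = colorize_from_lowres(gray, color)
```

The key difference from the uniform-tint case: regional color differences survive, down to whatever the chroma resolution can resolve.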


Agreed. A few years back I spent many hours with a telescope trying to find nebulae in the sky.

Turns out nebulae look like grey clouds, not like psychedelic posters.

I know this stuff makes for some really sexy press, but it doesn't help much for members of the general public who have an interest in science but not much time to figure out what's real and what's not.


The psychedelic colours represent different forms of light, such as UV, infra-red, and others, mapped into the visible range.


Yes.

For those readers who are interested, they assign the primary colors to various important parts of the spectrum representing various elements.

Watched a wonderful set of lectures from The Great Courses that explains this. Highly recommended for laymen who are interested in astronomy.

http://www.thegreatcourses.com/courses/understanding-the-uni...
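One common version of that mapping is the so-called "Hubble palette" (SII to red, H-alpha to green, OIII to blue). A toy sketch of the idea, assuming numpy and normalized narrowband exposures (this is illustrative, not necessarily what the lectures describe):

```python
import numpy as np

def narrowband_to_rgb(s2, h_alpha, o3):
    """Assign each narrowband exposure to a primary color channel, in the
    spirit of the "Hubble palette": SII -> red, H-alpha -> green,
    OIII -> blue. Inputs are (H, W) arrays of normalized intensities."""
    return np.clip(np.stack([s2, h_alpha, o3], axis=-1), 0.0, 1.0)

# Toy 2x2 exposures: a region bright in H-alpha shows up green,
# a region bright in OIII shows up blue.
s2 = np.array([[0.1, 0.0], [0.0, 0.0]])
h_alpha = np.array([[0.2, 0.9], [0.0, 0.1]])
o3 = np.array([[0.0, 0.1], [0.8, 0.0]])
rgb = narrowband_to_rgb(s2, h_alpha, o3)
```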


By that standard, literally nothing you see is "true color". Subsampled chroma is a property of all "color" information, because that's the way your eyes actually work. This is a hobgoblin's argument. Just complain that their chroma sampling is too coarse and put the pitchforks away, please.


The problem is that people reasonably assume (and NASA's wording suggests) that the chroma sampling is as high resolution as the greyscale sampling. Hence, it's misleading.


Actually, I thought the black and white were the real colours until somebody pointed it out.

I really don't see the damage in releasing a sepia version of the picture. Why should I care that it's inaccurate? It's a postcard from the other end of the solar system.


The genius which goes into getting these images is astonishing, and I'm not meaning to disparage that at all! The damaging part is when they colorize an image and then call it "true color", when this is simply untrue.

As an example, here's an actual true-color image of the earth: http://imgur.com/EEt635V.jpg

And here's a "true-color" image of Earth, produced using the New Horizons / Huygens colorization technique of combining a greyscale image with very low-resolution color data: http://imgur.com/IOqOTyw.jpg

What's wrong with this? A.) It flat-out isn't "true-color", and NASA shouldn't be spreading inaccurate information, and B.) It's arguably a more boring image than the greyscale original (http://imgur.com/hwYA6VK.jpg), which leaves one better able to imagine possible color schemes, rather than fixating upon a single, inaccurate, and very boring one.


Thanks, you make a good distinction, and the pics really drive home your point. I guess the problem is the label 'true color', which has stuck for a long time as a way to refer to statistically determined hues. Maybe we should label actual true colors as 'real color' in order to make the distinction. I suppose 'true color' started out as a way to distinguish such images from artistically rendered/painted ones.


If you don't care about accuracy why take a photo at all? Just draw something, it'll probably be more interesting.


What is the source of the full color Jupiter images from 2007 then?

http://www.theatlantic.com/technology/archive/2015/07/the-ca...


Well, by that standard, have we ever seen anything in true color? Couldn't you make the same argument for our own eyes, based on differences in color down at the atomic level?


Also, I'm kind of amazed that this is attracting loads of downvotes. Do people think I'm trolling?


Yes, you are. NASA is not picking one color here and applying a filter to the entire image, as you seem to be implying. They are doing their best, with the data available from multiple instruments, to determine what the color of each pixel should be. It is a much more intricate process than you make out.

There are some bad color filter stories from NASA (such as making Mars surface photos look more like an Earth desert with blue skies than you would actually think if you were there). But this is not a case of that.


Maybe they're just using the average value of true?


Thanks. It turns out the coloured pictures are taken by Ralph [1], one of the three cameras on New Horizons.

[1] http://www.theatlantic.com/technology/archive/2015/07/the-ca...



