Hacker News

The vast majority of consumers want their camera to take pictures of people that “look good” to the human eye; the other uses are niche.

But that said, I’m actually surprised that astrophotographers are so interested in calibrating stars to the human eye. The article shows through a number of examples (IR, hydrogen emission line) that the human eye is a very poor instrument for viewing the “true” color of stars. Most astronomical photographs use false colors (check the captions on the NASA archives) to show more than what the eye can see, to great effect.



I suspect it's because when conditions are right to actually see color in deep-sky objects, it's confounding that it doesn't look the same as the pictures. Especially if seeing the colors with your own eyes feels like a transcendent experience.

I've only experienced dramatic color from deep sky objects a few times (the blue of the Orion Nebula vastly outshines all the other colors, for instance), and it's always sort of frustrating that the pictures show something so wildly different from what my own eyes see.


There's a good chance the real problem there is limited gamut on the screen, and with the right viewing method the RAW photo could look much much better.
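To make the gamut point concrete, here's a rough Python sketch. The XYZ tristimulus numbers are my ballpark approximations for a monochromatic hydrogen-alpha source (~656 nm), not exact CIE table values; the matrix is the standard XYZ-to-linear-sRGB conversion. The point is that a saturated nebula red converts to a negative green channel, i.e. it sits outside the sRGB triangle and no ordinary screen pixel can reproduce it:

```python
# Sketch: why a monochromatic nebula emission line can fall outside sRGB.
# X, Y, Z below are rough approximations for hydrogen-alpha (~656 nm),
# on an arbitrary unit scale -- only the chromaticity matters here.
X, Y, Z = 0.21, 0.078, 0.0

# Standard XYZ -> linear sRGB matrix (D65 white point)
r =  3.2406 * X - 1.5372 * Y - 0.4986 * Z
g = -0.9689 * X + 1.8758 * Y + 0.0415 * Z
b =  0.0557 * X - 0.2040 * Y + 1.0570 * Z

print(f"linear sRGB: r={r:.3f} g={g:.3f} b={b:.3f}")

# A negative channel means the color lies outside the sRGB gamut:
# the display must clip or desaturate it, so the rendered red ends up
# less saturated than what the eye (or sensor) actually received.
print("out of sRGB gamut:", min(r, g, b) < 0)
```

So even a perfectly captured RAW can look wrong once it's squeezed into a standard monitor's gamut; a wide-gamut display or print process can get noticeably closer.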


If you get a big enough telescope, it will gather enough light that you'll see things in proper color. I've seen the Orion nebula with a 10 inch reflector in a good location, and the rich pinks, blues and reds were impossible to miss. Those are the actual photons emitted from that object hitting your retina, so it's about as "true color" as you can get.

I think when astrophotographers are trying to render an image it makes sense that they would want the colors to match what your eyes would see looking through a good scope.



