Gamma error in picture scaling (2007) (4p8.com)
113 points by bpierre on Aug 21, 2014 | 32 comments


I wrote the Lanczos3 scaler that Picasa uses (about 14 years ago!), and it doesn't (by default) correct for gamma.

For most images & kernels, you can get by with doing math in about 31 bits total precision (a bit for underflow clamping), so that's the magic speed improvement from ignoring gamma on old 32-bit architectures.

If you gamma-correct, your sources will need to be 10-12 bpc after transformation, and you'll need >32 bits of integer precision for large kernels. Even though x86 has a 32x32 -> 32bit high word multiply, this all gets a lot better on x64.

So basically: today on 64-bit machines (or wide SIMD) you could do gamma-correct resampling a whole lot cheaper than you could do 10 years ago.

[edit] In the end you notice the difference with line art, but for most photographs you want to spend your CPU budget on sharpness (by using wider kernels).
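
For the curious, here is a rough sketch of the linear-light pipeline in NumPy (not Picasa's code, just the general idea, with a 2x box filter standing in for Lanczos3 and a plain 2.2 power curve instead of the exact piecewise sRGB transfer function):

    import numpy as np

    def downscale_2x_linear(img_u8, gamma=2.2):
        # Decode gamma-encoded 0..255 values to linear light in 0..1.
        linear = (img_u8.astype(np.float64) / 255.0) ** gamma
        # 2x2 box filter, a stand-in for a real Lanczos3 kernel.
        h, w = img_u8.shape
        linear = linear[: h // 2 * 2, : w // 2 * 2]
        small = (linear[0::2, 0::2] + linear[0::2, 1::2] +
                 linear[1::2, 0::2] + linear[1::2, 1::2]) / 4.0
        # Re-encode linear light back to gamma-encoded 0..255.
        out = (small ** (1.0 / gamma)) * 255.0
        return np.clip(np.rint(out), 0, 255).astype(np.uint8)

    # 1-pixel checkerboard: averaging the stored values would give 128,
    # averaging in linear light gives ~186, closer to what the eye sees.
    checker = ((np.indices((4, 4)).sum(axis=0) % 2) * 255).astype(np.uint8)
    print(downscale_2x_linear(checker))

The float version is the easy way out; doing the same thing in fixed point is where the >32-bit intermediate precision mentioned above comes in.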


> I wrote the Lanczos3 scaler that Picasa uses (about 14 years ago!),

What are you doing now? Picasa hasn't changed much since Google bought it and ignored it (except for adding G+ integration).


It has been popping up wanting to download an update about every week for years and years now.


Is it the same update?


Nah, sometimes they ship with a blank or missing dock icon, but IIRC they always include Windows .exe files buried within the Mac app bundle?!

Curiously, even after all those annoying update popups, this photography viewer app still does not support Retina displays.


Is someone going to put Emergent Orange down to improper scaling?

http://krazydad.com/blog/2013/12/05/emergent-orange/

(Emergent Orange made it to the front page here on HN yesterday. However, this gamma article says the movie industry gets gamma scaling right, and KrazyDad's day job is at Disney, IIRC.)


Anyone who designs algorithms dealing with image manipulation should have at least a basic understanding of colorimetry and color perception. It's absolutely not hard to understand, but it is not trivial either. I can only suggest reading [1]. It's a rather thick book, but you only need to read the first few chapters to avoid such mistakes.

[1] Wyszecki, G. & Stiles, W. S.: Color Science: Concepts and Methods, Quantitative Data and Formulae


That seems a minor complaint about GIMP. It should be easy to fix after the major changes get done, by which I mean full integration of GEGL and migration to GTK3. IMHO if these don't get done soon, GIMP will get replaced. It's been years since they "inadvertently ported the core graphics code to GEGL".

You might say my priorities are wrong and GIMP is short on developers and I care about infrastructure more than something like gamma. But I'm moving to Wayland soon, and if GIMP has to run with GTK2 via XWayland I'm just not going to install it. OTOH I'm not a major user of it.


This article is already a few years old; GIMP uses GEGL for a lot more now than it did at the time this was written.


Replaced by what? I'm not aware of any candidates. A VirtualBox with Photoshop, I guess.


Krita [1], probably. They focus on painting nowadays because they don't want to be competing directly with GIMP, but it's a very capable photo editing tool. I use Krita for a lot of my photo stuff because it supports more than 8 bits per channel. Still, I think GIMP has enough developer attention, and GEGL is powerful and compelling enough, that GIMP won't get replaced any time soon.

[1] https://krita.org/


This is a serious pain in anti-aliasing also. The math is not cheap in compositing.


Steve Mann covered this in some mathematical detail in his paper "Comparametric Equations with Practical Applications in Quantigraphic Image Processing", IEEE Transactions on Image Processing, Vol. 9, No. 8, August 2000.

Link: http://www.eyetap.org/papers/docs/comparametric.pdf

The section, starting on PDF page 4, is called "On the Value of Doing the Exact Opposite of What Stockham Advocated", and describes exactly this problem and its solution.


Firefox's SVG handling for feDiffuseLighting has the same problem: it calculates the Phong model in sRGB space, but it should be calculated in a linear colorspace.

Reference rendering and SVG: http://www.w3.org/Graphics/SVG/Test/20061213/htmlEmbedHarnes...

You can see that in Firefox the SVG has much darker colors.
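
For reference, the decode/encode pair is just the standard piecewise sRGB transfer function. A minimal sketch (the comment at the end shows how I would wire it into the diffuse term, not Firefox's actual code):

    def srgb_to_linear(c):
        # Decode one sRGB channel value in 0..1 to linear light.
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    def linear_to_srgb(c):
        # Encode linear light in 0..1 back to an sRGB channel value.
        return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

    # The Phong diffuse term should be computed on linear values, e.g.
    #   diffuse = kd * max(0.0, dot(N, L)) * srgb_to_linear(surface_channel)
    # and only the final result re-encoded with linear_to_srgb().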


ImageMagick works fine:

  $ convert lama.jpg -gamma .45455 -resize '50%' -gamma 2.2 slama.jpg


That is clearly and directly covered in the article. Also, forcing the user to include manual gamma correction for a format that already has a known gamma curve is not ideal.


> Also, forcing the user to include manual gamma correction for a format that already has a known gamma curve is not ideal.

To this point, ImageMagick has built in understanding of colorspaces:

    convert lama.jpg -colorspace RGB -resize '50%' -colorspace sRGB slama.jpg
In English: Read lama.jpg, convert it to a linear RGB colorspace, rescale to 50%, convert it to sRGB, write to slama.jpg

This is apparently also covered in the article, more as an aside, but it should be the first and foremost way to do it with ImageMagick.


Neat, thanks, I did not know about the -colorspace option.


Thanks, I didn't notice it; it was way down near the end. Also, I'm glad it does not force it, since unfortunately I have most of my old JPEGs using 0.55 and 1.8 instead.


This has been on Hacker News before, and I gave some thoughtful comments on it. Anyone have the link?


A photographer I know wrote a Photoshop script that tried to correct for this when he resized images.


OS X's Preview.app gets it right.


(2007)


Thanks; added.


This is not really a gamma correction issue (although gamma shouldn't be ignored for other reasons). It is the result of attenuation of high-frequency information close to the Nyquist rate (1-pixel-wide details) when downsampling. The applied gamma adds emphasis to these high-frequency details so that they survive the anti-aliasing filter and the downsampling process.


No, the gamma compression distorts high-frequency components that have close frequencies f1 and f2 into products such as the low-frequency f1-f2. The reason you don't see this low frequency on your screen is that the inverse gamma curve on your display undoes this distortion exactly. But if you remove the high frequencies f1 and f2 by downsampling before displaying, the only remaining signal is f1-f2, and because of the lack of high frequencies it isn't removed by the inverse gamma curve. So you're left with a visible low frequency signal.

And f1 and f2 don't need to be close to Nyquist, as long as they're significantly attenuated by the downsampling filter.


You didn't understand the article. Read his cannonballs analogy. The problem is that the formula used to average colors is wrong in most software: it averages the numerical values of the RGB components when it should be averaging the light output intensity, which is not linear in those numerical values.


More specifically, any addition of color values generates the problem, not just averaging. If you add two linear values and gamma-correct the result, you get a different answer than if you added the gamma-corrected values directly. Multiplication doesn't have the same problem, i.e. a^x + b^x != (a+b)^x, while a^x * b^x == (ab)^x. P.S. It has nothing to do with Nyquist limits.
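
A quick numeric check with a plain 2.2 power curve (just an illustration, not from the article), taking the extreme case of averaging a black pixel and a white pixel:

    gamma = 2.2
    a, b = 0.0, 255.0                    # black and white, as stored (gamma-encoded)

    # Averaging the stored values, as most software does:
    naive = (a + b) / 2                  # 127.5, a mid gray that looks too dark

    # Averaging in linear light, then re-encoding:
    lin = ((a / 255) ** gamma + (b / 255) ** gamma) / 2
    correct = 255 * lin ** (1 / gamma)   # ~186, noticeably lighter

    print(naive, correct)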


No, this is a gamma correction issue.

Software which works in full-float linear (Nuke, for example) doesn't have this issue, as the RGB values are linear and thus already proportional to luminance.

The issue only affects software filtering images in lower-precision (and thus gamma-baked) pixel intensity values - i.e. 0-255 for uchar (8-bit) and 0-65535 for ushort (16-bit), where the R, G and B channels are given equal weighting.


I just added the gamma conversion change to my resampling library. It made a definite improvement in accuracy, especially for astronomy images. Before, most details shifted towards being darker; now they appear to retain their luminosity better.


We have to make compromises in everything. Yeah, it'd be nice to have better scaling algorithms in web browsers, but is it worth the added overhead of processing power and energy usage when it mostly only matters for extremely contrived examples like the one employed here? Yes, people with specific needs won't be able to use the common tool, but that's the very nature of compromise: you give up the small potatoes to satisfy the majority. There is still room for precision, it's just in the craftsman market, not in the consumer market.

He's also being a tad disingenuous. Early in the article, he calls out Photoshop as an offender, but Photoshop has multiple image reduction algorithms that it can use. I haven't used the Gimp in a while, but I'm pretty sure it does, too. Yet it's not until nearly the end of the article that he mentions Photoshop is capable of doing a correct job.

>> "On the other hand, there never was a gamma problem within the printing or the movie industry. They have defined tight and scientifically-grounded standards and procedures and they perceive the public software tools as toys."

This line gave me the impression the article was written by a disgruntled analog-era print or movie industry employee, probably someone who wasn't the best in the world but was employed at the "peak", who is upset about their growing lack of relevance and is foisting the blame on "stupid consumers" and "non professionals". You see the same thing with old line-of-business-application Java programmers. The key is it's not the specific technology that is the problem, it's the relative mediocrity of the writer.

If you argue from a craftsman perspective, but come off as a bitter, unemployed greybeard, then you probably weren't ever as good as you seem to think you are. The car didn't kill the horse whip industry, as evidenced by the fact that one can still buy horse whips. It killed the cheap, commodity horse whip industry.

>> "Knowing that difference, even if you wouldn't notice it, do you accept it?"

Yes, because itty, bitty photos of dragonflies on my cellphone don't mean anything of monetary value to me. For all his grandstanding about professionalism, who is using such tiny, crappy images for work?


Not really going to address your rant, but just wanted to point out that we can and do have better scaling algorithms in web browsers. It works pretty well in Safari on OS X and iOS. As far as I can tell, it passes his test, which seems to corroborate that OS X added full support for the sRGB standard as of 10.6.



