The blog makes it sound like that's the target, but the paper has this line:
"Our results are only valid for high-bitrate compression, which is useful for
long-term photo storage."
Do the authors think the size/quality benefits still show up when targeting the lower bitrates/qualities that are more common on the web? Do they intend to try to prove it?
Quality-wise, Guetzli is applicable to about 50% of the JPEG images on the internet. The other half is stored at a quality below 85, and Guetzli declines to compress to those qualities.
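For concreteness, here is a minimal sketch of what that floor looks like from the command line, assuming the guetzli binary is on your PATH (the --quality flag is real; the helper function and the hard-coded 85 default are just for illustration):

```python
import subprocess
from pathlib import Path

def guetzli_recompress(src: Path, dst: Path, quality: int = 85) -> bool:
    """Run the guetzli CLI at the requested quality.

    Guetzli refuses qualities below its floor and exits nonzero,
    so the return value doubles as a 'did it decline?' signal.
    """
    result = subprocess.run(
        ["guetzli", "--quality", str(quality), str(src), str(dst)]
    )
    return result.returncode == 0
```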
Another limitation is speed: Guetzli runs very slowly, so in its current form it cannot be applied to a huge corpus of images. Perhaps that rules out another half of the images on the internet.
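To put the speed limitation in numbers, a quick back-of-envelope (the ~1 CPU-minute per megapixel figure is the ballpark the Guetzli README quotes; the corpus size and average image size are made up for illustration):

```python
# Rough cost of recompressing a large corpus with Guetzli.
CPU_MINUTES_PER_MPIX = 1.0   # approximate figure from the Guetzli README
AVG_MPIX = 2.0               # assumed average image size (illustrative)
CORPUS = 1_000_000           # assumed corpus size (illustrative)

cpu_days = CORPUS * AVG_MPIX * CPU_MINUTES_PER_MPIX / (60 * 24)
print(f"~{cpu_days:,.0f} CPU-days")  # ~1,389 CPU-days for a million images
```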
So, combining the two halves (0.5 × 0.5 = 0.25), let's say that Guetzli is relevant to about 25% of the images on web pages.