
Cool, but neither the article nor the paper (https://arxiv.org/pdf/1703.04416.pdf) mentions just how much slower it is.


It's about 1 MPixel/minute on a typical desktop. The paper does mention that it's extremely slow, but we did indeed forget to give actual numbers there.
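For scale, here's a quick back-of-envelope estimate in Python. The ~1 MPixel/minute throughput is just the rough figure quoted above, not a benchmark, so treat the output as an order-of-magnitude number:

    # Rough encode-time estimate from the ~1 MPixel/minute desktop
    # throughput quoted above (an assumption, not a measured benchmark).
    MPIXELS_PER_MINUTE = 1.0

    def estimate_minutes(width_px: int, height_px: int) -> float:
        """Estimated encode time in minutes for a width x height image."""
        return (width_px * height_px / 1e6) / MPIXELS_PER_MINUTE

    # A ~50 MPixel image (e.g. 8660 x 5773) works out to about 50 minutes,
    # in line with the ~1 hour report for a 50 MPixel photo further down.
    print(f"{estimate_minutes(8660, 5773):.0f} min")  # -> 50 min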


Worth noting that many of the lossy image compression methods currently in use were developed in the mid-to-late 1990s, when a 90 MHz Pentium was an expensive, high-end CPU. Spending CPU time on the one-time compression of a lossless image to a lossy format is not nearly as expensive as it used to be.


To share my experience: I tried it today with a 50 MPixel image. It took a full hour at a constant 10% CPU usage, but the quality was great (even sharper?)!



