It's about 1 MPixel/minute on a typical desktop. The other paper mentions that it's extremely slow, but we did indeed forget to give actual numbers there.
Worth noting that many of the lossy image compression methods currently in use were developed in the mid-to-late 1990s, when a 90 MHz Pentium was an expensive, high-end CPU. Spending CPU time on a one-time lossy compression of a lossless image is not nearly as costly as it used to be.
To share my experience: I tried it today on a 50 MPx image. It took a full hour and held steady at 10% of my CPU, but the quality was great (even sharper?)!
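That anecdote lines up with the ~1 MPixel/minute figure quoted above, and the steady 10% CPU suggests a single thread on a machine with around ten hardware threads. A back-of-the-envelope check in Python, where the throughput value is an assumption taken from the earlier comment rather than a measured number:

    # Hypothetical sanity check: does 50 MPx at ~1 MPx/min match a 1-hour run?
    megapixels = 50    # image size from the anecdote above
    mpx_per_min = 1.0  # assumed throughput, per the earlier comment
    minutes = megapixels / mpx_per_min
    print(f"Estimated time: {minutes:.0f} min (~{minutes / 60:.1f} h)")
    # -> Estimated time: 50 min (~0.8 h), close to the observed 1 hour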