That page ends with "This is the first efficient algorithm for generating random uniform floating-point numbers that can access the entire range of floating point outputs with correct probabilities". I am very skeptical.
The same basic idea is described here [1] (without the pretty pictures, but with a link to working code), and I know others have implemented the same insight [2].
This seems like the kind of thing that gets reinvented every time someone who cares about random numbers learns how IEEE 754 works.
The basic approach: collect the top 60 bits of a 64-bit PRNG (assume the LSBs are corrupt, zero, or otherwise nonrandom). Set the exponent to zero. If the top mantissa bit is zero, shift left and subtract one from the exponent. Repeat. When you run out of bits, collect another word from your generator and resume until you have 56 bits of mantissa. When done, your floating-point output is 2^exponent * mantissa.
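A minimal sketch of that recipe, in the spirit of [1]/[2] but not their code: std::mt19937_64 stands in for the 64-bit PRNG (so all 64 bits are trusted rather than only the top 60), a fresh word is drawn for the significand instead of reusing leftover bits, and it fills a double's 53 significand bits (1 implicit + 52 stored) rather than 56.

```cpp
#include <cmath>
#include <cstdint>
#include <random>

double uniform_double(std::mt19937_64& rng) {
    int exponent = 0;
    std::uint64_t word = rng();
    int remaining = 64;              // unused random bits left in `word`

    // Shift out leading zeros: each one moves the result down one binade
    // (halves it), so decrement the exponent. Refill when the word runs dry.
    for (;;) {
        if (remaining == 0) { word = rng(); remaining = 64; }
        bool top = (word >> 63) & 1;
        word <<= 1;
        --remaining;
        --exponent;                  // now aiming at [2^exponent, 2^(exponent+1))
        if (top) break;              // found the leading 1 bit
        if (exponent < -1074) return 0.0;  // beyond the smallest subnormal; also
                                           // guards against an all-zero stream
    }

    // Significand: implicit leading 1 plus 52 fresh random bits, i.e. a value
    // in [2^52, 2^53), then scale it into the chosen binade.
    std::uint64_t mantissa = (std::uint64_t{1} << 52) | (rng() >> 12);
    return std::ldexp(static_cast<double>(mantissa), exponent - 52);
}

// Usage: std::mt19937_64 rng{std::random_device{}()};
//        double x = uniform_double(rng);
```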
My short explanation: suppose you want a random distance. Multiplying a PR integer by a fixed tile width is the same as selecting a random tile along that distance and measuring to the (far) edge of that tile. This is the multiply method.
But if you want a real-valued distance even for very near distances, you need to scale your random number and ensure you have random bits throughout the mantissa. So reduce the exponent for every leading zero in your PR integer and shift. Add more bits if needed.
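For contrast, a one-line sketch of the multiply method from the analogy (again with std::mt19937_64 as the stand-in generator), showing the gap the shift-and-refill scheme above closes:

```cpp
#include <random>

// Plain multiply method: scale one 64-bit draw into [0, 1); every output is a
// multiple of the "tile" width 2^-64 (modulo rounding, which can even bump the
// largest draws up to exactly 1.0). Outputs below 2^-12 have at least one low
// significand bit forced to zero, and the smaller the output, the more.
double multiply_method(std::mt19937_64& rng) {
    return rng() * 0x1.0p-64;
}
```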
Test: for a large-enough set of samples, each exponent bin should hold roughly half as many hits as the bin above it (the histogram of exponents is linear on a log scale, give or take). Very small floating-point numbers (less than, say, 1/32768) are only about 1/16384 as likely as numbers in [0.5,1).
Now that I've actually looked at the "utl::random" code in the OP, I see that its UniformRealDistribution is a wrapper around std::generate_canonical, so the juicy bits about turning a random integer into a random float are not exposed here at all. But the utl::random code does include a pointer* to an informative C++ working group note.
I like the idea of looking at the histogram of exponents: incrementing (by 1) the exponent doubles the width of the interval so should double the number of hits. Conversely the histogram of the significand (or any subset of its bits) should be flat.
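A rough sketch of both checks; uniform_double is just the name from the sketch above (or whatever generator you want to test), not anything in utl::random:

```cpp
#include <array>
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <random>

double uniform_double(std::mt19937_64& rng);   // the generator under test

void histogram_test(std::mt19937_64& rng) {
    std::array<long, 64> exp_hist{};   // bin e counts hits in [2^-(e+1), 2^-e)
    std::array<long, 2>  bit_hist{};   // tally of one stored significand bit

    for (long i = 0; i < 10'000'000; ++i) {
        double x = uniform_double(rng);
        if (x <= 0.0) continue;
        int bin = -std::ilogb(x) - 1;          // 0 for [0.5,1), 1 for [0.25,0.5), ...
        if (bin >= 0 && bin < 64) ++exp_hist[bin];

        int e;
        double frac = std::frexp(x, &e);       // frac in [0.5, 1)
        auto sig = static_cast<std::uint64_t>(std::ldexp(frac, 53));
        ++bit_hist[(sig >> 20) & 1];           // any stored bit should be ~50/50
    }

    for (int e = 0; e < 16; ++e)               // each bin ~half the one above it
        std::printf("exponent bin %2d: %ld\n", e, exp_hist[e]);
    std::printf("significand bit 20: %ld zeros, %ld ones\n",
                bit_hist[0], bit_hist[1]);
}
```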
"Unfortunately for John, the branches made a pact with Satan and quantum mechanics [...] In exchange for their last remaining bits of entropy, the branches cast evil spells on future generations of processors. Those evil spells had names like “scaling-induced voltage leaks” and “increasing levels of waste heat” [...] the branches, those vanquished foes from long ago, would have the last laugh."
> The Mossad is not intimidated by the fact that you employ https://. If the Mossad wants your data, they’re going to use a drone to replace your cellphone with a piece of uranium that’s shaped like a cellphone, and when you die of tumors filled with tumors, […] they’re going to buy all of your stuff at your estate sale so that they can directly look at the photos of your vacation instead of reading your insipid emails about them.
You know, I didn't really think of this till your comment: this was a vast conspiracy spanning years. Whenever "conspiracies" come up, you always hear things like "it would involve too many people, too many moving parts", as if opsec would be impossible. And then you have the pagers.
Kinda like the old chestnut that rich people are only rich on paper, and then Musk buys Twitter. Not Tesla, not some DBA, Musk himself.
This decade might actually be the season of reveal.
> “Making processors faster is increasingly difficult,” John thought, “but maybe people won’t notice if I give them more processors.” This, of course, was a variant of the notorious Zubotov Gambit, named after the Soviet-era car manufacturer who abandoned its attempts to make its cars not explode, and instead offered customers two Zubotovs for the price of one, under the assumption that having two occasionally combustible items will distract you from the fact that both items are still occasionally combustible.
> Formerly the life of the party, John now resembled the scraggly, one-eyed wizard in a fantasy novel who constantly warns the protagonist about the variety of things that can lead to monocular bescragglement.
And in 2013 the passage below would have been correct, but we live in a very different world now:
> John’s massive parallelism strategy assumed that lay people use their computers to simulate hurricanes, decode monkey genomes, and otherwise multiply vast, unfathomably dimensioned matrices in a desperate attempt to unlock eigenvectors whose desolate grandeur could only be imagined by Edgar Allen Poe. Of course, lay people do not actually spend their time trying to invert massive hash values while rendering nine copies of the Avatar planet in 1080p.
He wasn't too far off about the monkeys, though...
The bit about vast matrices shows some silver lining though; it turns out John’s little brother figured out how to teach those matrices to talk like a person.