What my machine can or cannot do is irrelevant: I might connect a drive from a machine whose clock was faster, or which wrote timestamps in parallel. So it's entirely possible to see timestamps less than X nanoseconds apart even if my machine can't complete more than one cycle every X nanoseconds.
Thanks for correcting me on that! I still think it could be useful to people, though. Do you think it's that wasteful? Machines today are powerful and have plenty of disk space.
That's a good question, and a somewhat philosophical one. It may or may not be wasteful, but it's meaningless, and therefore useless. If they said "we'll store it to the nanosecond because the kernel does", I'd be OK with that if (a very big if) they made it clear that the last few digits are not accurate. They need to state clearly the maximum accuracy you can expect. To then talk about storing it with even finer precision beyond that... just what?
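The precision-versus-accuracy gap above is easy to see in practice. A rough sketch (the file and printed values are illustrative, and the platform-reported resolution is itself only an upper bound on accuracy):

```python
import os
import tempfile
import time

# The platform reports the resolution of its wall clock. Anything finer
# than this in a stored timestamp is precision without accuracy.
res = time.get_clock_info("time").resolution  # in seconds
print(f"reported clock resolution: ~{res * 1e9:.0f} ns")

# st_mtime_ns is stored to the nanosecond, regardless of how accurate
# the clock (or the filesystem that originally wrote it) actually was.
with tempfile.NamedTemporaryFile() as f:
    st = os.stat(f.name)
    print(f"st_mtime_ns: {st.st_mtime_ns}")
```

The stored field always carries nine digits of sub-second detail; how many of those digits mean anything depends on the clock and filesystem that produced them.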
Also, be careful about assuming machines have plenty of everything. Caches aren't huge, and storing an extra byte in a billion places adds up at scale. Nothing comes free: don't scrimp where you don't have to, but don't assume anything is free either.
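To put a number on "adds up at scale" (the record count here is just the billion from the example above, not a measurement):

```python
# Back-of-envelope cost of one extra byte per record at scale.
records = 1_000_000_000  # a billion timestamps, per the example above
extra_bytes_per_record = 1
total = records * extra_bytes_per_record
print(f"{total / 2**30:.2f} GiB of extra storage")  # ≈ 0.93 GiB
```

Roughly a gibibyte of disk is cheap, but the same byte also flows through RAM, caches, and network transfers on every access, which is where the cost is felt.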