Leap seconds are added to UTC to keep it within one second of UT1; the drift from TAI is unbounded.
In TAI, each second is the same length and each day lasts the same number of seconds; UT1 is (conceptually, approximately) the solar time at Greenwich, so the length of its seconds depends on the rotation of the Earth; UTC has seconds of fixed length, like TAI, but occasionally has more or (potentially) fewer seconds in a day, to keep in sync with UT1.
If you used TAI, eventually the sun would not be overhead at noon -- and after many millennia, the sun would be up during the "night" and down during the "day" (after around 70 000 years if leap seconds continue to be added at the same rate, but the whole point is that they aren't -- at the moment they seem to be accelerating).
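The "around 70 000 years" figure can be sanity-checked with some back-of-envelope arithmetic (my own numbers, not from the comment above: 27 positive leap seconds were inserted between 1972 and 2015, and "sun up at night" means roughly 12 hours of drift):

```python
# Rough check of the ~70,000-year claim, assuming leap seconds keep
# accumulating at the historical 1972-2015 rate.

leap_seconds = 27          # positive leap seconds inserted 1972-2015
span_years = 2015 - 1972   # 43 years
rate = leap_seconds / span_years          # ~0.63 s/year

drift_needed = 12 * 3600   # 12 hours of drift, in seconds
years = drift_needed / rate
print(f"{years:.0f}")      # on the order of 70,000 years
```

The exact figure depends on the assumed rate, which (as the comment notes) is not constant, so this is only an order-of-magnitude check.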
Thank you for explaining the various time standards concisely. I think a lot of people don't know about TAI, but would prefer it to UTC if they did. Leap second handling has been responsible for some pretty serious software bugs. It's so bad that Google doesn't use them internally.
I think it makes the most sense for computers to use TAI internally, and have UTC as a time zone on top of that. Whenever there's a leap second, just push out a TZ update. Most applications don't have special code for time zone updates, which means a lower likelihood of bugs.
Using TAI internally would have another advantage: Computers with GPS receivers could just add 19 seconds to GPS time to get a very accurate TAI. (GPS time is defined to be 19 seconds behind TAI. That was UTC in 1980.)
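A minimal sketch of what "TAI internally, UTC as a time zone" could look like, under assumptions: the leap-second table below is truncated to a few real entries for illustration, and a full implementation would carry the complete published list (e.g. from tzdata's leap-seconds file).

```python
# Sketch, not production code: with TAI as the internal timescale,
# GPS time is a fixed offset and UTC is a table lookup.

GPS_MINUS_TAI = -19  # GPS time = TAI - 19 s, fixed by definition

# (date the offset takes effect, TAI - UTC in seconds) -- partial table
TAI_MINUS_UTC = [
    ("1972-01-01", 10),
    ("2015-07-01", 36),
    ("2017-01-01", 37),
]

def tai_from_gps(gps_s: float) -> float:
    """Seconds on the TAI scale from seconds on the GPS scale."""
    return gps_s - GPS_MINUS_TAI  # i.e. gps_s + 19

def utc_from_tai(tai_s: float, tai_minus_utc: int = 37) -> float:
    """UTC from TAI, given the leap-second offset in force at that instant."""
    return tai_s - tai_minus_utc
```

Looking up the right `tai_minus_utc` for a given instant is exactly the "TZ update" the comment above describes: a new table entry per leap second, no special-case code in applications.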
Using TAI internally would be much better than what we have now. But another problem is that leap seconds are only announced 6 months in advance, which is not enough to update all computing devices (including embedded ones, etc.). See Poul-Henning Kamp's article linked elsewhere in this discussion for an idea on how to address that.
Actually the most sensible thing would be to just use UT1 for wall clock time. The rate deviation of a UT1 second from a TAI second (`d/dT (UT1(T) - T)` where `T` is TAI time) is small enough that your average time signal distribution system (NTP, radio time signal, etc.) cannot even resolve it (less than a microsecond). Yes, with GPS time distribution or checking against an atomic clock you can resolve it. But it doesn't really matter (even if you go the Google way and "smear out" a leap second, the deviation is less than a millisecond, which is below the resolution of most operating systems' scheduler tick length).
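The "less than a microsecond" claim can be checked with one division (my numbers: I'm assuming a length-of-day excess of about 2 ms over the nominal 86 400 s, which is in the range of recent IERS values):

```python
# How far does a UT1 second deviate from a TAI second, assuming the
# solar day currently runs ~2 ms longer than 86,400 SI seconds?

lod_excess = 2e-3                       # assumed extra day length, seconds
fractional_rate = lod_excess / 86400    # deviation per second
print(fractional_rate)                  # ~2.3e-8, i.e. tens of ns per second
```

Tens of nanoseconds per second is indeed far below what NTP or a radio time signal can resolve.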
At home I hacked an OpenNTP-based time server to hand out UT1 instead of UTC. Right now it's based on a simple software patch;
I have yet to write a program to fetch the Bulletin B reports and parse them (ftp://hpiers.obspm.fr/iers/bul/bulb_new/bulletinb.dat). I don't know what they're smoking over there, but they give the UT1 deviation relative to UTC instead of TAI. So first you have to get to UTC from TAI (respecting the leap seconds, gah), just so that you can go to UT1. Why not simply publish the value of UT1 - TAI instead?
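The conversion being complained about is just offset arithmetic (illustrative values below, not real Bulletin B data): since the bulletin publishes UT1 - UTC, you also need the current leap-second count to reach UT1 - TAI.

```python
# UT1 - TAI = (UT1 - UTC) - (TAI - UTC)
# Example values only; UT1 - UTC comes from Bulletin B, the leap-second
# count from the usual tables.

ut1_minus_utc = -0.2   # seconds, as Bulletin B would publish it (example)
tai_minus_utc = 37     # leap-second offset in force (TAI - UTC)

ut1_minus_tai = ut1_minus_utc - tai_minus_utc
print(ut1_minus_tai)   # the value the commenter wishes were published directly
```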
Anyway, every time a report comes out, the clock rescaler in the UT1 time server gets adjusted appropriately (manually).
In the long run I plan to make this a small embedded project that uses GPS time to stabilize a high-precision temperature-compensated crystal oscillator^1 (TCXO); maybe I'll add a rubidium clock too, just because. And this is then to act as a stratum 1 time server on my local network.
----
[1] From my work I know that those have way less short-term jitter than "naked" rubidium clocks; most precision frequency standards that come in a nice box with buttons and a display are actually TCXOs that are frequency-stabilized by a rubidium clock.
> after around 70 000 years if leap seconds continue to be added at the same rate
Apparently it's only 3000 years:
"Leap seconds are not a viable long-term solution because the earth's rotation is not constant: tides and internal friction cause the planet to lose momentum and slow down the rotation, leading to a quadratic difference between earth rotation and atomic time. In the next century we will need a leap second every year, often twice every year; and 2,500 years from now we will need a leap second every month.
On the other hand, if we stop plugging leap seconds into our time scale, noon on the clock will be midnight in the sky some 3,000 years from now"
Defining time based on "sun directly overhead at a particular place at noon" is, itself, a holdover from navigation and seafaring. The time difference between when the sun is directly overhead where you are and directly overhead at a particular place is the longitude difference between the two places.
That's the historical reason why we have a time definition that is locked to the rotation of the Earth. Importantly, anyone navigating based on that principle will get skewed answers if when noon is at a particular place changes.
Interesting; do you have a source for that? I'd have thought that it would be most useful for people on land, too, to have noon at a well-defined time.
It's not a question of moving noon around by hours, but moving noon around by minutes or seconds.
It's not sourced; it's what I remember from navigational history. Whether solar noon is at 11:45 or 12:15 makes little difference for your personal life, but observing solar noon at 11:45 vs. 12:15 is a difference of 1/48th of the way around the globe longitudinally.
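The 1/48 figure follows directly from the arithmetic:

```python
# A 30-minute difference in observed solar noon (11:45 vs 12:15) as a
# fraction of one rotation, and the corresponding longitude.

minutes_difference = 30
fraction_of_day = minutes_difference / (24 * 60)
degrees_longitude = fraction_of_day * 360

print(fraction_of_day)    # 1/48
print(degrees_longitude)  # 7.5 degrees of longitude
```

At the equator that's roughly 830 km, which is why a navigator cares about minutes of clock error while a landlubber does not.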
It sounds like the leap seconds all have the same sign. If corrections are always being applied in the same direction, that implies a second is not as close to 1/86400 of a solar day as it could be.
Maybe we should redefine the second. It is currently defined as "9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium 133 atom."
9,192,631,770 * 35s / 40y ~= 255. If we adjust the definition by 255 periods, TAI and UTC would stay in sync much better.
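Checking that arithmetic (illustrative; 35 leap seconds over roughly 40 years is the assumed historical rate):

```python
# How many caesium periods would the second's definition need to change
# by to absorb ~35 leap seconds accumulated over ~40 years?

cs_periods = 9_192_631_770           # periods per SI second (Cs-133)
drift = 35                           # accumulated leap seconds
span = 40 * 365.25 * 86400           # 40 years, in seconds

adjustment = cs_periods * drift / span
print(round(adjustment))             # ~255 periods
```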
The basic problem is that the solar day keeps getting longer. The definition of the second was based on a length-of-day averaged over the 19th century, so now it's too short. We could keep redefining the second every so often, but that sounds even worse than the system we have now. :)
Scientists would hate you if you redefined the second by such a large amount. And since we've delegated the definition of a second to scientists (notably physicists), it's not going to happen.
While all leap seconds so far have been positive, that is not a requirement. Negative leap seconds are a possibility - one accounted for in any valid implementation.
More significant for this particular suggestion, leap seconds do not occur/become necessary on any nice linear basis. They are tragically bound to the rotation of the Earth, which does not care about what would be nice for us and changes in a non-linear fashion.