The docstring for Base.Dates.UTInstant simply says “based on UT time”. Does anyone know which version of UT is being used? On computers it is commonly UTC or UT1.
There’s a footnote in the Dates docs that says they’re UT1.
There have been a few long threads comparing various time representations. Here’s one:
I think there have been more…
Let’s say your computer gets its time from a NIST internet time server (not every second – just once in a while, to keep your internal clock in sync with NIST). Those time servers (except one, and it is unlikely you are using that one) serve UTC time. So the time you see is loosely ganged to UTC. This does not mean your clock is set up to understand or represent leap seconds. If it is of a common variety, the loosely-UTC time observed within your computer behaves much as if it were syncing with a UT1 time server – especially with respect to time deltas. So, as Stephan says, Julia’s time is much closer to UT1 than UTC. For astronomical purposes, and for legal “what happened first” determinations, the timebase wobble I described must be taken into account.
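To illustrate the point about time deltas, here is a small sketch using the standard Dates library: the leap second inserted at the end of 2016 is invisible to DateTime arithmetic, so the computed delta comes out one second shorter than the SI time that actually elapsed on a leap-second-aware clock.

```julia
using Dates

# 2016 ended with a leap second (23:59:60 UTC), which DateTime cannot
# even represent.
before = DateTime(2016, 12, 31, 23, 59, 59)
after  = DateTime(2017, 1, 1, 0, 0, 0)

# DateTime subtraction counts 1000 ms here, although 2 SI seconds
# elapsed between these instants on a true UTC clock.
delta = after - before   # Millisecond(1000)
```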
The answer depends, of course, on where the timestamp was obtained. Julia does not make any adjustments when storing a timestamp as a Base.Dates.UTInstant, apart from conversions of the time unit and epoch. As @JeffreySarnoff has already pointed out, if the clock that generated the timestamp had any synchronization with the outside world, it was almost certainly to UTC, the time standard maintained and distributed by the national standards institutes (NIST in the US).
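As a small sketch of what is actually stored (shown with the current stdlib module name `Dates`; the thread refers to the older `Base.Dates`): a `DateTime` wraps a `UTInstant` holding a plain millisecond count since Julia’s reference epoch, and arithmetic on it is integer arithmetic on that count, with no leap-second or UT1 adjustments.

```julia
using Dates

dt = DateTime(2000, 1, 1)

# The underlying instant is just a millisecond count wrapped in UTInstant.
dt.instant isa Dates.UTInstant{Millisecond}   # true

# Consecutive calendar days always differ by exactly 86_400_000 ms;
# leap seconds are never inserted into the count.
Dates.value(DateTime(2000, 1, 2)) - Dates.value(DateTime(2000, 1, 1))  # 86400000
```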
Google and others label timestamps in email headers etc. as “GMT”. In practice this means “UTC without leap seconds”, not the original GMT, which has been unmaintained for about three decades. Other documentation of IT systems (including Julia’s) claims to use UT1, which, however, is not correct. UT1 is internationally maintained and would require having a reasonably accurate UTC and then applying correction tables published by the IERS. Such corrections are not needed in normal use. They can be important if you need to track celestial objects (stars, satellites, …) from the ground with high precision, and for other exotic purposes.
So, to have well-defined timestamps, the clock needs to be synchronized, for example via the Network Time Protocol (NTP). The most correct label for such timestamps is then UTC, plus a possible time zone offset.
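For example, Julia’s standard Dates library can read the (typically NTP-disciplined) system clock either in local time or directly in UTC; the latter gives a timestamp correctly labeled as UTC with zero offset:

```julia
using Dates

local_ts = now()       # system clock, rendered in the local time zone
utc_ts   = now(UTC)    # same clock, rendered as UTC (no zone offset)

# Both are plain DateTimes; the UTInstant itself carries no zone
# information, only the label you attach to it does.
```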
For people unfamiliar with the distinction, this may help clarify the “play” between timebases.
It is a plot of the difference between UT1 and UTC, measured in fractions of a second.
https://upload.wikimedia.org/wikipedia/commons/f/fb/Leapsecond.ut1-utc.svg