[Datetime-SIG] Calendar vs timespan calculations...

Tim Peters tim.peters at gmail.com
Mon Aug 3 22:50:07 CEST 2015

>> ...
>> I like the idea of using a special tzinfo to reveal the leap seconds for
>> those who really want them. (And we won't have to provide such a tzinfo --
>> it's enough that one could be written, given a table of leap seconds.)

[Łukasz Rekucki]
> But if we don't write one, how do we know it's possible to do?

Because it's a shallow problem:  not difficult, just tedious.  We
_are_ proposing that tzstrict implement timeline arithmetic across DST
transitions.  Timeline arithmetic will work correctly for that if and
only if it will also work correctly for leap seconds.  They're exactly
the same problem, it's just that one works naturally in units of
minutes while the other works in units of seconds.  None of the code
cares about that, since - whatever it _looks_ like from the outside -
all arithmetic in datetime is "really" working with microseconds.
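To make that concrete, here's a minimal sketch of leap-second-aware timeline arithmetic. The table and the helper (`LEAPS`, `si_seconds_between`) are hypothetical names invented for illustration, and the table is deliberately truncated; a real implementation would carry the full IERS leap-second history:

```python
from datetime import datetime, timedelta

# Toy leap-second table (truncated): naive-UTC instants at which
# TAI-UTC increased by one second.
LEAPS = [datetime(2012, 7, 1), datetime(2015, 7, 1)]

def si_seconds_between(a, b):
    """Elapsed SI seconds from naive-UTC instant a to b (a <= b),
    adding one second for each leap second crossed."""
    leap_count = sum(1 for t in LEAPS if a < t <= b)
    return (b - a).total_seconds() + leap_count

# The UTC minute beginning 2015-06-30 23:59 contained 61 SI seconds:
print(si_seconds_between(datetime(2015, 6, 30, 23, 59),
                         datetime(2015, 7, 1)))  # 61.0
```

Structurally this is the same adjustment timeline arithmetic makes at a DST boundary, just with a table of one-second jumps instead of one-hour ones.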

> IMHO, the same approach was taken for tzinfo and DST and it didn't
> work out very well.

Very different.  It was known from the start - and documented from the
start - that datetime supplied no way to distinguish between the
ambiguous times at the end of DST.  It wasn't anticipated that some
people would care so much about this non-problem ;-) that they'd make
heroic efforts to supply hacks to work around it.  That the hacks
_are_ so obviously hacks isn't their fault, because they're trying to
accomplish something the design never intended to support.
Alexander's "first" flag supplies the one bit of support that was
missing from the start, and it's equally applicable to ambiguities due
to leap seconds as to those due to DST transitions.
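(For readers coming to this thread later: the proposal did land, as the `fold` attribute in PEP 495, shipped in Python 3.6.) A minimal illustration of the one-bit disambiguation, using the stdlib `zoneinfo` module (3.9+) and assuming a tz database is available on the system:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo  # Python 3.9+; needs a tz database

tz = ZoneInfo("America/New_York")
# 2015-11-01 01:30 local time occurred twice; fold selects the occurrence.
first = datetime(2015, 11, 1, 1, 30, tzinfo=tz)           # fold=0: still EDT
second = datetime(2015, 11, 1, 1, 30, fold=1, tzinfo=tz)  # fold=1: back on EST
print(first.utcoffset() == timedelta(hours=-4))   # True: pre-transition offset
print(second.utcoffset() == timedelta(hours=-5))  # True: post-transition offset
```

Exactly the same bit would serve to distinguish the repeated second 23:59:60 from 00:00:00 around a leap-second insertion, given a tzinfo that modeled it.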

> For example, I probably could implement a timezone which represents
> TAI (assuming I have always up-to-date leap seconds table), but it is
> not possible because tzinfo.utcoffset() requires me to return an
> integer number of minutes.

Yup.  There's nothing in the code that really gives a rip about what
utcoffset() returns - any number of microseconds would work as well.
Restricting it to a multiple of minutes with magnitude strictly less
than 60*24 was simply intended to be an aid in catching programming
errors.  Relaxing that restriction is backward-compatible (except for
code that's deliberately trying to provoke this exception).
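(And indeed it was later relaxed: since Python 3.7, utcoffset() may return any timedelta strictly between -24 and +24 hours, sub-minute precision included.) So a TAI-style tzinfo is expressible; here is a toy sketch that hard-codes the mid-2015 TAI-UTC offset of 36 seconds rather than consulting a real leap-second table:

```python
from datetime import datetime, timedelta, tzinfo

class TAI(tzinfo):
    """Toy TAI tzinfo: fixed 36 s offset (TAI - UTC as of 2015-07-01).
    A real implementation would consult the full leap-second table."""
    _offset = timedelta(seconds=36)  # sub-minute offsets are legal in 3.7+

    def utcoffset(self, dt):
        return self._offset

    def dst(self, dt):
        return timedelta(0)

    def tzname(self, dt):
        return "TAI"

t = datetime(2015, 8, 3, 22, 50, 7, tzinfo=TAI())
print(t.utcoffset())  # 0:00:36
```

On a 2015-era Python this exact class would raise ValueError the moment the offset was used, which is the restriction being complained about above.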

> Most of the time you need leap seconds, it is to ignore them (as Tim
> described in his stock-trading example). Adding support to a single
> programming language, and fixing it in every application, won't make
> the world better. Instead, you can use UTC-SLS or do what Google did.
> [1]: http://www.cl.cam.ac.uk/~mgk25/time/utc-sls/

The people who clamor for leap seconds are of two kinds:

1. People who want "purity" because that's what they always want,
regardless of whether it makes any practical difference to them.

2. People who legitimately care about exactly how many SI seconds
separate two datetimes.  They ought to consider using TAI instead
(where this is trivial to determine via code that's direct and
obviously correct), but in principle it's possible to get that too
from UTC datetimes (although, if this were my concern, I'd always
worry about whether a transformation so complex was correctly
implemented by the library I was using!  The more critical the
application, the greater the value of "obviously correct" over "not
obviously broken").
