[Datetime-SIG] Calendar vs timespan calculations...

Alexander Belopolsky alexander.belopolsky at gmail.com
Sun Aug 2 01:33:43 CEST 2015


On Sat, Aug 1, 2015 at 1:16 AM, Tim Peters <tim.peters at gmail.com> wrote:
>> A careful application will have to call tz.utcoffset() with both values of the
>> flag and either warn about the default choice or ask the user for an
>> additional input.
>
> As above, how can one programmatically pick a valid default when faced
> with a missing time?

Suppose that at UTC time u = X, local clocks are advanced by d > 0
units.  Then the function L(u) that maps UTC time u to local time can
be written as

L(u) = u + o + d * 1[u >= X]

where o is the UTC offset before the transition and 1[] is the
indicator function (Iverson's bracket notation, popularized by Knuth):
1[u >= X] is 1 when u >= X and 0 otherwise.
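For concreteness, take o = -5 hours and d = 1 hour (a typical US
spring-forward): L(u) = u - 5h for u < X and L(u) = u - 4h for u >= X,
so local times in the interval [X - 5h, X - 4h) are skipped by the
local clock.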

Let L0(u) = u + o and L1(u) = u + o + d.  My proposal is that when t
is between X + o and X + o + d, and therefore t = L(u) has no solution,
we should offer the solutions to t = L0(u) and t = L1(u) instead.
(These solutions are, BTW, u = t - o and u = t - o - d respectively.)
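(With the example numbers above, o = -5h and d = 1h, a local time
t = X - 4h30m in the middle of the gap yields the two candidates
u = t - o = X + 30m and u = t - o - d = X - 30m.)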

With the notation introduced so far, my "extended local-to-global
function" xG(t, first=True) can be written as

def xG(t, first=True):
    if X + o <= t < X + o + d:
        if first:
            return t - o
        else:
            return t - o - d
    ...  # handle other times

Note that xG(t, first=True) > xG(t, first=False) is a deliberate
choice that makes it dead easy to detect that the returned values are
not solutions of t = L(u): for a genuinely ambiguous local time t,
first=True would pick the earlier of the two candidate UTC times, so
the reversed ordering signals a gap.
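
To make this concrete, here is a minimal, self-contained sketch with
made-up numbers (times are integer minutes; X, o and d are assumed
example values, and the elided "handle other times" branch is filled
in by inverting L0 before the gap and L1 at or after it, which
suffices for a single spring-forward transition):

# Assumed example: clocks jump forward by d at UTC time X.
X = 7 * 60     # transition instant (minutes, UTC)
o = -5 * 60    # UTC offset before the transition
d = 60         # size of the jump, d > 0

def L(u):
    "Local time for UTC time u: L(u) = u + o + d * 1[u >= X]."
    return u + o + d * (u >= X)

def xG(t, first=True):
    "Extended local-to-UTC mapping sketched above."
    if X + o <= t < X + o + d:        # t falls in the gap
        return t - o if first else t - o - d
    if t < X + o:                     # before the gap: invert L0
        return t - o
    return t - o - d                  # at or after the gap: invert L1

t = X + o + 30                        # a local time inside the gap
u1, u2 = xG(t, True), xG(t, False)
assert u1 > u2                        # reversed order flags the gap
assert L(u1) != t and L(u2) != t      # neither solves t = L(u)

With these numbers u1 = X + 30 and u2 = X - 30, matching the candidates
worked out above, and the reversed order u1 > u2 is exactly the signal
described in the previous paragraph.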

