[Datetime-SIG] Datetime - my issues
Skip Montanaro
skip.montanaro at gmail.com
Wed Jul 29 20:41:48 CEST 2015
On Wed, Jul 29, 2015 at 12:49 PM, Alexander Belopolsky
<alexander.belopolsky at gmail.com> wrote:
> There is currently no straightforward way to start with a naive
> datetime(2014, 11, 3, 12, 0)
> and produce
> "2014-11-03T12:00-0500 EST"
I'm almost certainly not answering the question you're inferring (why
is there no straightforward way ...), so I've changed the subject. I
will describe my modest experience in this area, which is really only
as a user of datetime and pytz. I've made almost no attempt to expand
on the existing APIs, just one or two helper functions. (Caveat: I
still live in a Python 2.x world.)
I perform naive-to-aware conversions routinely in my code (also in the
financial services space, where we need to deal with exchange times in
North American, European, Australian and Asian timezones). Every once
in a blue moon I need a naive datetime (possibly normalized to a
different time zone first) from a tz-aware one, but that is a rare
occurrence. I think as time goes on we will (or at least should) see
less and less reliance on naive datetimes.
Fortunately, I and my systems are generally fast asleep at those
ambiguous times Tim and Lennart were discussing. For historical and
interface reasons, the system I work with uses a mixture of epoch
seconds, naive datetime objects, and tz-aware datetime objects,
depending on the original source of the time data. Going forward, I
hope to reduce the reliance of the system on the first two time
representations, and increase use of tz-aware datetime objects. In
the past year I've had the same sort of epiphany regarding timezones
as I had when I started drinking the Unicode kool aid 15 or so years
ago. (And then I went to work for a company which still thinks the
entire world runs on ASCII and has a bunch of C++ code to prove it...)
Despite having to convert naive (and implicitly Chicago-based)
datetime objects into tz-aware objects, I still find myself looking
back at existing code to perform the conversions, as (for whatever
reason) the method names aren't terribly suggestive to me, and I
sometimes get the names wrong. The platform I use provides some stock
objects:
>>> LOCAL_TZ
<DstTzInfo 'America/Chicago' CST-1 day, 18:00:00 STD>
>>> EPOCH
datetime.datetime(1969, 12, 31, 18, 0)
Despite not having a LOCAL_EPOCH object by default, it's easy enough
to create:
>>> LOCAL_EPOCH = LOCAL_TZ.localize(EPOCH)
>>> LOCAL_EPOCH
datetime.datetime(1969, 12, 31, 18, 0, tzinfo=<DstTzInfo
'America/Chicago' CST-1 day, 18:00:00 STD>)
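As a sanity check, that localized epoch is the same instant as the UTC
epoch. A minimal stdlib-only sketch, with a fixed -06:00 offset
standing in for pytz's Chicago CST object:

```python
from datetime import datetime, timedelta, timezone

# Fixed -06:00 offset as a stand-in for pytz's America/Chicago CST.
cst = timezone(timedelta(hours=-6))

LOCAL_EPOCH = datetime(1969, 12, 31, 18, 0, tzinfo=cst)
UTC_EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

print(LOCAL_EPOCH == UTC_EPOCH)  # True: same instant, different wall clocks
```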
Still, when it comes time for me to use the LOCAL_TZ object, I find
the method names "localize" (generate a tz-aware datetime object from
a naive one) and "normalize" (convert between timezones) unintuitive.
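For what it's worth, modern Python spells both operations without
those names. A sketch assuming Python 3.9+ with the stdlib zoneinfo
module (which needs the system tz database):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # stdlib since Python 3.9

chicago = ZoneInfo("America/Chicago")

# pytz's localize(): make a naive datetime tz-aware
naive = datetime(2015, 7, 29, 13, 15, 23)
aware = naive.replace(tzinfo=chicago)

# pytz's normalize()/astimezone(): convert to another zone
utc = aware.astimezone(ZoneInfo("UTC"))
print(utc.isoformat())  # 2015-07-29T18:15:23+00:00
```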
Getting the string you want (or very close to it; why would you want
both an HHMM offset and a timezone name?) is similarly straightforward, using a
tz-aware datetime object:
>>> now = Clock.local_datetime()
>>> now
datetime.datetime(2015, 7, 29, 13, 15, 23, 650153, tzinfo=<DstTzInfo
'America/Chicago' CDT-1 day, 19:00:00 DST>)
>>> now.isoformat()
'2015-07-29T13:15:23.650153-05:00'
I do find the slightly slavish adherence to the old Unix time(3)
notion that "times don't have fractions of a second" a bit
clumsy:
>>> now
datetime.datetime(2015, 7, 29, 13, 15, 23, 650153, tzinfo=<DstTzInfo
'America/Chicago' CDT-1 day, 19:00:00 DST>)
>>> now.timetuple()
time.struct_time(tm_year=2015, tm_mon=7, tm_mday=29, tm_hour=13,
tm_min=15, tm_sec=23, tm_wday=2, tm_yday=210, tm_isdst=1)
What happened to my microseconds??? That forces me to write little
helper functions like this:
import time

def to_timestamp(dt):
    """Convert a (naive, local) datetime object into seconds since the Unix epoch."""
    return time.mktime(dt.timetuple()) + dt.microsecond / 1e6
Going forward, I think it would be cleaner if datetime.timetuple()
returned a tuple with microseconds, and the relevant old-style
functions in the time module were smart enough to know when to ignore
them.
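(Python 3.3 did eventually grow exactly this convenience: aware
datetime objects have a timestamp() method that keeps the
microseconds. A sketch with a fixed -05:00 offset standing in for CDT,
so it needs no tz database:)

```python
from datetime import datetime, timedelta, timezone

cdt = timezone(timedelta(hours=-5))  # fixed offset standing in for CDT
now = datetime(2015, 7, 29, 13, 15, 23, 650153, tzinfo=cdt)

ts = now.timestamp()   # float seconds since the Unix epoch
print(ts - int(ts))    # the microseconds survive
```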
I also occasionally want timedelta objects in seconds, so have this:
def as_seconds(delta):
    """Convert a timedelta object into seconds (float)."""
    return delta.days * 24 * 60 * 60 + delta.seconds + delta.microseconds / 1e6
Despite the overflow possibilities (I doubt I will still be working
when that happens), I think it would be nice to have a few convenience
functions like this, in the to-be-named-later datetime utils
module. I'm less thrilled with some heuristic function which allows
you to take a datetime object and add or subtract some number of
months, as I can't see a good way to get that right.
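As it happens, Python 2.7 added exactly this as
timedelta.total_seconds(). A quick sketch checking it against the
hand-rolled version above (reproduced here so the snippet is
self-contained):

```python
from datetime import timedelta

def as_seconds(delta):
    """Convert a timedelta object into seconds (float)."""
    return delta.days * 24 * 60 * 60 + delta.seconds + delta.microseconds / 1e6

d = timedelta(days=2, hours=3, seconds=7, microseconds=250000)
print(as_seconds(d))      # 183607.25
print(d.total_seconds())  # 183607.25
```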
Finally, at some point I read somewhere that using
dtobject.astimezone(newzone) was preferable to using
zone.normalize(dtobject). My code is now littered with both spellings,
but I have no recollection of why one would be preferred over the
other (except that pytz is clearly not part of the standard library).
Corollary: It seems to me like the astimezone call ought to work:
>>> now = datetime.datetime.now()
autoloading datetime
>>> import pytz
>>> chi = pytz.timezone("America/Chicago")
>>> now.astimezone(chi)
ValueError: astimezone() cannot be applied to a naive datetime
[<stdin>|<module>|1]
>>> chi.localize(now)
datetime.datetime(2015, 7, 29, 13, 39, 23, 139107, tzinfo=<DstTzInfo
'America/Chicago' CDT-1 day, 19:00:00 DST>)
but maybe that's just a case of explicit-is-better-than-implicit...
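For the record, a sketch of the explicit two-step spelling (attach a
zone, then convert), assuming Python 3.9+ zoneinfo. Since Python 3.6,
astimezone() on a naive datetime no longer raises; it assumes the
system local zone instead:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # stdlib since Python 3.9; needs system tz data

chi = ZoneInfo("America/Chicago")
nyc = ZoneInfo("America/New_York")

now = datetime(2015, 7, 29, 13, 39, 23, 139107)  # naive
aware = now.replace(tzinfo=chi)                  # step 1: attach zone
east = aware.astimezone(nyc)                     # step 2: convert
print(east.isoformat())  # 2015-07-29T14:39:23.139107-04:00
```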
Skip