[Python-ideas] High Precision datetime
mertz at gnosis.cx
Thu May 10 15:15:03 EDT 2018
This feels specialized enough to belong in a third-party library. If that
library can interoperate as transparently as possible with Python's
datetime, so much the better. But the need is niche enough that I don't
think it belongs in the standard library.
... this as someone who actually worked in a lab that measured MD
simulations in attoseconds. I do understand the purpose.
On Thu, May 10, 2018, 2:00 PM Ed Page <ed.page at ni.com> wrote:
> Is there interest in a PEP for extending time, datetime / timedelta for
> arbitrary or extended precision fractional seconds?
> My company designs and manufactures scientific hardware that typically
> operates with nanosecond -- sometimes even attosecond -- levels of
> precision. We’re in the process of providing Python APIs for some of these
> products and need to expose the full accuracy of the data to our
> customers. Doing so would allow developers to do things like timestamp
> analog measurements for correlating with other events in their system, or
> precisely schedule a future time event for correctly interoperating with
> other high-speed devices.
> The API we’ve been toying with is adding two new fields to time, datetime,
> and timedelta:
> - frac_seconds (int)
> - frac_seconds_exponent (int or new SITimeUnit enum)
> time.microsecond would be turned into a property that wraps frac_seconds
> for compatibility.
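A minimal sketch of how the two proposed fields could compose, with `microsecond` as a compatibility property. The class name and the conversion logic are hypothetical illustrations of the proposal, not an existing API:

```python
from dataclasses import dataclass

@dataclass
class FracTime:
    """Hypothetical carrier for the two proposed fields."""
    frac_seconds: int           # integer count of fractional-second units
    frac_seconds_exponent: int  # power-of-ten unit: -6 us, -9 ns, -18 as

    @property
    def microsecond(self) -> int:
        """Compatibility wrapper: whole microseconds within the second."""
        # Scale frac_seconds up to attoseconds, then truncate to microseconds.
        attos = self.frac_seconds * 10 ** (self.frac_seconds_exponent + 18)
        return attos // 10 ** 12

t = FracTime(frac_seconds=1_500, frac_seconds_exponent=-9)  # 1500 ns
print(t.microsecond)  # 1
```

The exponent field is what lets common cases stay cheap: 500 ms is stored as `(500, -3)` rather than as a huge attosecond count.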
> Challenges:
> - Defining the new `max` or `resolution`
> - strftime / strptime. I propose that we do nothing and leave
> formatting / parsing at `microsecond` precision at best. On the other hand,
> __str__ could render the fractional seconds using scientific or
> engineering notation.
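One way the suggested `__str__` behavior could look: a hypothetical helper (not part of any existing API) that renders `frac_seconds * 10**exponent` seconds in engineering notation, snapping the exponent to a multiple of 3 so it lines up with ms / us / ns / ps / fs / as:

```python
def format_frac_seconds(frac_seconds: int, exponent: int) -> str:
    """Render frac_seconds * 10**exponent seconds in engineering notation."""
    if frac_seconds == 0:
        return "0e+00 s"
    digits = len(str(abs(frac_seconds)))
    lead = exponent + digits - 1   # decimal exponent of the leading digit
    eng = lead - (lead % 3)        # snap down to a multiple of 3
    mantissa = frac_seconds / 10 ** (eng - exponent)
    return f"{mantissa:g}e{eng:+03d} s"

print(format_frac_seconds(1500, -9))  # 1.5e-06 s
print(format_frac_seconds(1, -18))    # 1e-18 s
```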
> Alternatives:
> - My company creates our own datetime library
>   - Continued fracturing of the time ... ecosystem (datetime, arrow,
>     pendulum, delorean, datetime64, pandas.Timestamp – all of which offer
>     varying degrees of compatibility)
> - Add an `attosecond` field and have `microsecond` wrap this.
>   - Effectively the same, except `frac_seconds_exponent` is hard-coded to
>     the lowest unit
>   - The most common cases (milliseconds, microseconds) will always pay the
>     cost of using a bigint, as compared to the proposal's "pay for what
>     you use" approach
>   - How do we define what is "good enough" precision?
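The bigint cost is visible in CPython, whose ints grow with magnitude (exact byte counts are CPython- and version-specific). Storing half a second at hard-coded attosecond resolution always pays for digits that millisecond-level users never need:

```python
import sys

# The same half-second instant in the two encodings:
proposal = (500, -3)                  # frac_seconds=500, exponent=-3 (ms)
attos_only = 500_000_000_000_000_000  # hard-coded attosecond resolution

# CPython ints are variable-width, so the attosecond-only encoding uses a
# larger int object even though the precision carried is the same:
print(sys.getsizeof(proposal[0]), sys.getsizeof(attos_only))
```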
> - Continue to subdivide time by adding a `nanosecond` field that is
>   "nanoseconds since the last microsecond", a `picosecond` field that is
>   "picoseconds since the last nanosecond", and an `attosecond` field that
>   is "attoseconds since the last picosecond"
>   - Possibly surprising API; people might expect `picosecond` to be an
>     offset since the last second
>   - Messy base 10 / base 2 conversions
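A sketch of the cascaded-field alternative, assuming each field is an offset relative to the previous unit (`split_fields` is a hypothetical helper, not a proposed API):

```python
def split_fields(total_attos: int):
    """Split a sub-second duration (in attoseconds) into cascaded fields,
    each an offset since the previous unit."""
    micro, rest = divmod(total_attos, 10 ** 12)  # microseconds since the second
    nano, rest = divmod(rest, 10 ** 9)           # nanoseconds since last microsecond
    pico, atto = divmod(rest, 10 ** 6)           # picoseconds since last nanosecond,
                                                 # attoseconds since last picosecond
    return micro, nano, pico, atto

# 1.5 microseconds, expressed in attoseconds:
print(split_fields(1_500_000_000_000))  # (1, 500, 0, 0)
```

Note the surprise mentioned above: `pico` here is a 0-999 offset since the last nanosecond, not "picoseconds since the last second".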
> - Have `frac_seconds` be a float
>   - This has precision issues.
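The precision issue is easy to demonstrate: a 64-bit float carries only about 15-17 significant decimal digits, so attosecond-level detail cannot survive next to whole fractions of a second:

```python
# Adding one attosecond to a tenth of a second is lost entirely,
# because 1e-18 is far below the spacing of doubles near 0.1:
print(0.1 + 1e-18 == 0.1)  # True

# 0.1 itself is not exactly representable in binary floating point:
print(f"{0.1:.20f}")  # 0.10000000000000000555
```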
> If anyone wants to have an impromptu BoF on the subject, I'm available at
> Ed Page
> Python-ideas mailing list
> Python-ideas at python.org
> Code of Conduct: http://python.org/psf/codeofconduct/