I have to agree with David that this seems too specialized to make room for
in the stdlib.
On Thu, May 10, 2018, 15:16 David Mertz wrote:
This feels specialized enough to belong in a third-party library. If that library can interoperate as transparently as possible with Python's datetime, so much the better. But the need is niche enough that I don't think it belongs in the standard library.
I say this as someone who actually worked in a lab that measured molecular dynamics (MD) simulations in attoseconds. I do understand the purpose.
On Thu, May 10, 2018, 2:00 PM Ed Page wrote:

Greetings,
Is there interest in a PEP for extending time, datetime, and timedelta to support arbitrary- or extended-precision fractional seconds?
My company designs and manufactures scientific hardware that typically operates with nanosecond -- sometimes even attosecond -- levels of precision. We’re in the process of providing Python APIs for some of these products and need to expose the full accuracy of the data to our customers. Doing so would allow developers to do things like timestamp analog measurements for correlation with other events in their system, or precisely schedule a future event so it interoperates correctly with other high-speed devices.
The API we’ve been toying with is adding two new fields to time, datetime, and timedelta:
- frac_seconds (int)
- frac_seconds_exponent (int, or a new SITimeUnit enum)
time.microsecond would be turned into a property that wraps frac_seconds for compatibility.
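To make the shape concrete, here is a minimal sketch of what such a type could look like. Everything here (the FracTime class, the SITimeUnit values) is illustrative only, not an existing or agreed-upon API:

    from enum import IntEnum

    class SITimeUnit(IntEnum):
        # Hypothetical exponents for frac_seconds_exponent.
        MILLISECONDS = -3
        MICROSECONDS = -6
        NANOSECONDS = -9
        PICOSECONDS = -12
        ATTOSECONDS = -18

    class FracTime:
        """Illustrative stand-in for a time type with the proposed fields."""

        def __init__(self, hour=0, minute=0, second=0, frac_seconds=0,
                     frac_seconds_exponent=SITimeUnit.MICROSECONDS):
            self.hour = hour
            self.minute = minute
            self.second = second
            self.frac_seconds = frac_seconds
            self.frac_seconds_exponent = int(frac_seconds_exponent)

        @property
        def microsecond(self):
            # Compatibility shim: view frac_seconds at 1e-6 resolution,
            # truncating any finer precision.
            shift = self.frac_seconds_exponent + 6
            if shift >= 0:
                return self.frac_seconds * 10 ** shift
            return self.frac_seconds // 10 ** -shift

    # 1500 attoseconds past the second is invisible at microsecond precision.
    t = FracTime(12, 0, 0, frac_seconds=1500,
                 frac_seconds_exponent=SITimeUnit.ATTOSECONDS)
    assert t.microsecond == 0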
Challenges:
- Defining the new `max` or `resolution`
- strftime / strptime. I propose that we do nothing and leave formatting / parsing at `microsecond` precision at best. On the other hand, __str__ could render the fractional seconds in scientific or engineering notation (a possible helper is sketched below).
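As one possibility for that __str__ behavior, a helper along these lines could format the fractional part in scientific notation. This is a hypothetical, display-only sketch; floats are acceptable here because the value is only being rendered, never stored:

    def frac_repr(frac_seconds, frac_seconds_exponent):
        # Hypothetical display helper for the fractional-seconds part.
        return "%.6g" % (frac_seconds * 10.0 ** frac_seconds_exponent)

    print(frac_repr(1500, -18))   # -> 1.5e-15
    print(frac_repr(250, -3))     # -> 0.25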
Alternatives:
- My company creates its own datetime library
  - Continued fracturing of the time ecosystem (datetime, arrow, pendulum, delorean, datetime64, pandas.Timestamp -- all of which offer varying degrees of compatibility)
- Add an `attosecond` field and have `microsecond` wrap it
  - Effectively the same, except `frac_seconds_exponent` is hard-coded to the lowest value
  - The most common cases (milliseconds, microseconds) would always pay the cost of a bigint, whereas the proposal is a "pay for what you use" approach
  - How do we define what is "good enough" precision?
- Continue to subdivide time by adding a `nanosecond` field that is "nanoseconds since the last microsecond", a `picosecond` field that is "picoseconds since the last nanosecond", and an `attosecond` field that is "attoseconds since the last picosecond" (see the sketch after this list)
  - Possibly surprising API; people might expect `picosecond` to be an offset since the last second
  - Messy base 10 / base 2 conversions
- Have `frac_seconds` be a float
  - This has precision issues.
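For reference, here is how the subdivision alternative could compose into a single integer. Note that the `attosecond` field would span 0..999999 rather than 0..999, because femtoseconds are skipped. Again, a sketch under those assumptions, not a spec:

    def total_attoseconds(microsecond, nanosecond, picosecond, attosecond):
        # Hypothetical composition: each field is the offset since the
        # next-coarser field. nanosecond and picosecond span 0..999;
        # attosecond spans 0..999999 (1 ps = 10**6 as).
        return (((microsecond * 1000 + nanosecond) * 1000
                 + picosecond) * 10 ** 6 + attosecond)

    assert total_attoseconds(1, 0, 0, 0) == 10 ** 12  # 1 us in attoseconds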
If anyone wants to have an impromptu BoF on the subject, I'm available at PyCon.
Thanks,
Ed Page

_______________________________________________
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/