High Precision datetime

Greetings,

Is there interest in a PEP for extending time, datetime / timedelta for arbitrary or extended precision fractional seconds?

My company designs and manufactures scientific hardware that typically operates at nanosecond -- sometimes even attosecond -- levels of precision. We're in the process of providing Python APIs for some of these products and need to expose the full accuracy of the data to our customers. Doing so would allow developers to do things like timestamp analog measurements for correlating with other events in their system, or precisely schedule a future time event for correctly interoperating with other high-speed devices.

The API we've been toying with is adding two new fields to time, datetime and timedelta:
- frac_seconds (int)
- frac_seconds_exponent (int or new SITimeUnit enum)
time.microseconds would be turned into a property that wraps frac_seconds for compatibility.

Challenges
- Defining the new `max` or `resolution`
- strftime / strptime. I propose that we do nothing, just leave formatting / parsing to use `microseconds` at best. On the other hand, __str__ could just specify the fractional seconds using scientific or engineering notation.

Alternatives
- My company creates our own datetime library
  - Continued fracturing of the time ecosystem (datetime, arrow, pendulum, delorean, datetime64, pandas.Timestamp -- all of which offer varying degrees of compatibility)
- Add an `attosecond` field and have `microsecond` wrap it
  - Effectively the same, except `frac_seconds_exponent` is hard-coded to the lowest value
  - The most common cases (milliseconds, microseconds) will always pay the cost of using a bigint, as compared to the proposal, which is a "pay for what you use" approach
  - How do we define what is "good enough" precision?
- Continue to subdivide time by adding a `nanosecond` field that is "nanoseconds since the last microsecond", a `picosecond` field that is "picoseconds since the last nanosecond", and an `attosecond` field that is "attoseconds since the last picosecond"
  - Possibly surprising API; people might expect `picosecond` to be an offset since the last second
  - Messy base 10 / base 2 conversions
- Have `frac_seconds` be a float
  - This has precision issues.

If anyone wants to have an impromptu BoF on the subject, I'm available at PyCon.

Thanks,
Ed Page
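For concreteness, the two proposed fields might look something like the following sketch. This is a hypothetical illustration of the "pay for what you use" idea, not an implementation from the proposal; the class name and the truncation behavior of the compatibility property are assumptions.

```python
from dataclasses import dataclass

@dataclass
class HighPrecisionTime:
    """Hypothetical sketch: frac_seconds counts units of
    10**frac_seconds_exponent seconds past the whole second."""
    hour: int
    minute: int
    second: int
    frac_seconds: int = 0
    frac_seconds_exponent: int = -6  # default: microseconds

    @property
    def microsecond(self) -> int:
        # Compatibility shim: express frac_seconds as whole microseconds.
        # Truncates when the stored precision is finer than 1 us.
        power = self.frac_seconds_exponent + 6
        if power >= 0:
            return self.frac_seconds * 10 ** power
        return self.frac_seconds // 10 ** (-power)

t = HighPrecisionTime(23, 59, 59, frac_seconds=123, frac_seconds_exponent=-9)
print(t.microsecond)  # 123 ns truncates to 0 whole microseconds
```

Note that only users who ask for sub-microsecond exponents ever store values large enough to need a bigint; millisecond and microsecond users keep small ints, which is the cost argument made above.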

This feels specialized enough to belong in a third party library. If that library can behave as transparently as possible interacting with Python datetime, so much the better. But the need is niche enough that I don't think it belongs in the standard library. I say this as someone who actually worked in a lab that measured MD simulations in attoseconds; I do understand the purpose. On Thu, May 10, 2018, 2:00 PM Ed Page <ed.page@ni.com> wrote:

Is there interest in a PEP for extending time, datetime / timedelta for arbitrary or extended precision fractional seconds?
Having seen the utter disaster that similar ideas brought to numpy, I would say: no. On the other hand, nanoseconds are slowly making their way to the stdlib and to add nanoseconds to datetime we only need a fully backward compatible implementation, not even a PEP. See <https://bugs.python.org/issue15443>.

On Thu, May 10, 2018 at 6:13 PM, Alexander Belopolsky < alexander.belopolsky@gmail.com> wrote:
I'm not sure the "disaster" was due to this idea.... nor, frankly, is datetime64 a disaster at all, though certainly far from perfect. But my question is whether high precision timedeltas belong with "calendar time" at all. What with UTC and leap seconds, and all that, it gets pretty ugly, when down to the second or sub-second, what a given datetime really means.

If I were to work with high precision measurements, experiments, etc., I'd use a "nanoseconds since" representation, where the "epoch" would likely be the beginning of the experiment, or something relevant. Note that this is used in netCDF CF conventions, where datetimes are expressed in things like "hours since 1970-01-01:00:00". Granted, it's mostly so that the values can be stored as an array of simple scalars, but it does allow a precision and an epoch that are suited to the data at hand.

NOTE: One source of the "disaster" of numpy's datetime64 is that you can set the precision, but NOT the epoch -- which is kind of problematic if you really want femtosecond precision for something not in 1970 :-) -CHB
-- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker@noaa.gov

On Tue, May 15, 2018 at 2:05 AM, Chris Barker via Python-ideas <python-ideas@python.org> wrote:
UTC and leap seconds aren't a problem. When there's a leap second, you have 23:59:60 (or you repeat 23:59:59, if you can't handle second #60). That's pretty straight-forward, perfectly well-defined. No, the REAL problems come from relativity.....
That's an unrelated form of time calculation. For that kind of thing, you probably want to ignore calendars and use some form of monotonic time; but also, if you want to go to (or below) nanosecond resolution, you'll need your clock to actually be that accurate, which most likely means you're not using a computer's clock. Femtosecond timestamping would basically be just taking numbers given to you by an external device and using them as sequence points - clocks and calendars become irrelevant. The numbers might as well be frame numbers in a super-high-speed filming of the event. ChrisA

UTC and leap seconds aren't a problem.
Of course they are a problem -- why else would they not be implemented in datetime? But my point is that a given datetime stamp or calculation could be off by a second or so depending on whether and how leap seconds are implemented. It just doesn't seem like a good idea to be handling months and femtoseconds with the same "encoding". -CHB

Chris is certainly right. A program that deals with femtosecond intervals should almost surely start by defining a "start of experiment" epoch where microseconds are fine. Then within that epoch, events should be monotonic integers for when measured or calculated times are marked. I can easily see reasons why a specialized wrapped int for FemtosecondsFromStart could be useful. But that's still a specialized need for a third party library. One possible use of this class might be to interoperate with datetimes or timedeltas. Conceivably such interoperability could be dealing with leap seconds when needed. But "experiment time" should be a simple monotonic and uniform counter. On Mon, May 14, 2018, 6:35 PM Chris Barker - NOAA Federal via Python-ideas <python-ideas@python.org> wrote:
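The "specialized wrapped int" idea above can be sketched in a few lines; the class name comes from the message, but the bridge method and its microsecond truncation are assumptions for illustration.

```python
from datetime import timedelta

class FemtosecondsFromStart(int):
    """Monotonic count of femtoseconds since a start-of-experiment epoch.

    A thin int subclass: arithmetic stays exact, and Python's bigints
    mean you only "pay for what you use".
    """
    FS_PER_US = 10 ** 9  # 1 microsecond = 1e9 femtoseconds

    def to_timedelta(self) -> timedelta:
        # Lossy bridge to the stdlib: truncate to whole microseconds,
        # the finest resolution datetime.timedelta supports.
        return timedelta(microseconds=int(self) // self.FS_PER_US)

t0 = FemtosecondsFromStart(0)
t1 = FemtosecondsFromStart(2_500_000_000)  # 2.5 microseconds after start
dt = FemtosecondsFromStart(t1 - t0)
print(dt.to_timedelta())  # truncates to timedelta(microseconds=2)
```

Within the experiment the counter is uniform and leap-second-free; only the bridge to calendar time would ever need to care about UTC.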

On Mon, 14 May 2018 at 12:17 Chris Angelico <rosuav@gmail.com> wrote:
I'm sure that the issue of "what do you call the leap second itself" is not the problem that Chris Barker is referring to. The problem with leap seconds is that they create unpredictable differences between UTC and real elapsed time. You can represent a timedelta of exactly 10^8 seconds, but if you add it to the current time, what should you get? What UTC time will it be in 10^8 real-time seconds? You don't know, and neither does anybody else, because you don't know how many leap seconds will occur in that time. The ways to resolve this problem are: (1) fudge the definition of "exactly 10^8 seconds" to disregard any leap seconds that occur in that time interval in the real world, making it not so exact anymore (2) use TAI instead of UTC, as GPS systems do (3) leave the relationship between time deltas and calendar time undefined, as some in this thread are suggesting

On Tue, May 15, 2018 at 11:21 AM, Rob Speer <rspeer@luminoso.com> wrote:
indeed -- even if you only care about the past, where you *could* know the leap seconds -- they are, by their very nature, of second precision -- which means right before a leap second occurs, your "time" could be off by up to a second (or a half second?). It's kind of like using a carpenter's tape measure to locate points from an electron microscope scan :-) The other issue with leap-seconds is that python's datetime doesn't support them :-) And neither do most date-time libraries. -CHB

On Thu, May 17, 2018 at 12:56 PM Chris Barker via Python-ideas < python-ideas@python.org> wrote:
The other issue with leap-seconds is that python's datetime doesn't support them :-)
That's not entirely true. Since the implementation of PEP 495, it is possible to represent the 23:59:60 as 23:59:59 with the "fold" bit set. Of course, the repeated 23:59:59 will be displayed and behave exactly the same as the first 23:59:59, but a 3rd party library can be written to take the "fold" bit into account in temporal operations.
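The PEP 495 representation described here can be shown directly with the stdlib:

```python
from datetime import datetime

# The leap second 2016-12-31T23:59:60, represented per PEP 495 as a
# repeated 23:59:59 with the fold bit set.
leap = datetime(2016, 12, 31, 23, 59, 59, fold=1)
plain = datetime(2016, 12, 31, 23, 59, 59)

# As noted above, the two compare (and behave) equal; only a fold-aware
# 3rd party library would distinguish them.
print(leap == plain)          # True
print(leap.fold, plain.fold)  # 1 0
```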

On Thu, May 17, 2018 at 10:14 AM, Alexander Belopolsky < alexander.belopolsky@gmail.com> wrote:
Does that support the other way -- or do we never lose a leap second anyway? (showing ignorance here) But still, now datetime *could* support leap seconds (which is nice, because before, 23:59:60 was illegal, so it couldn't even be done at all), but that doesn't mean that it DOES support leap seconds.... -CHB

On Thu, May 17, 2018 at 1:33 PM Chris Barker <chris.barker@noaa.gov> wrote:
On Thu, May 17, 2018 at 10:14 AM, Alexander Belopolsky <
alexander.belopolsky@gmail.com> wrote:
I am not sure I understand your question. All I said was that since PEP 495, it became possible to write a pair of functions to convert between TAI and UTC timestamps without any loss of information. For example, around the insertion of the last leap second at the end of 2016, we had the following sequence of seconds:

         TAI         |         UTC
---------------------+---------------------
 2016-12-31T23:59:35 | 2016-12-31T23:59:59
 2016-12-31T23:59:36 | 2016-12-31T23:59:60
 2016-12-31T23:59:37 | 2017-01-01T00:00:00

This correspondence can be implemented in Python using the following datetime objects:

              TAI               |                UTC
--------------------------------+---------------------------------------
 datetime(2016,12,31,23,59,35)  | datetime(2016,12,31,23,59,59)
 datetime(2016,12,31,23,59,36)  | datetime(2016,12,31,23,59,59,fold=1)
 datetime(2016,12,31,23,59,37)  | datetime(2017,1,1,0,0,0)

Of course, Python will treat datetime(2016,12,31,23,59,59) and datetime(2016,12,31,23,59,59,fold=1) as equal, but you should be able to use your utc_to_tai(t) function to translate to TAI, do the arithmetic there, and translate back with the tai_to_utc(t) function. Wherever tai_to_utc(t) returns a datetime instance with fold=1, you should add the fold value to the seconds field before displaying.
By the same logic the standard library datetime does not support any local time because it does not include the timezone database. This is where the 3rd party developers should fill the gap.
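A minimal sketch of the utc_to_tai / tai_to_utc pair described above, hard-coded for the single leap second at the end of 2016 (TAI - UTC was 36 s before it and 37 s after). A real implementation would consult the full IERS leap-second table; the function bodies here are assumptions, not code from the thread.

```python
from datetime import datetime, timedelta

LEAP_UTC = datetime(2016, 12, 31, 23, 59, 59)  # repeated second; fold=1 marks :60

def utc_to_tai(t: datetime) -> datetime:
    # fold=1 on the repeated second means we are already past the leap.
    offset = 37 if t > LEAP_UTC or (t == LEAP_UTC and t.fold) else 36
    return t.replace(fold=0) + timedelta(seconds=offset)

def tai_to_utc(t: datetime) -> datetime:
    leap_tai = utc_to_tai(LEAP_UTC.replace(fold=1))  # TAI instant of 23:59:60
    if t < leap_tai:
        return t - timedelta(seconds=36)
    if t == leap_tai:
        return LEAP_UTC.replace(fold=1)
    return t - timedelta(seconds=37)

# Round-trip across the leap second, fold bit included:
for u in (LEAP_UTC, LEAP_UTC.replace(fold=1), datetime(2017, 1, 1)):
    r = tai_to_utc(utc_to_tai(u))
    assert r == u and r.fold == u.fold
```

Doing arithmetic on the TAI side and translating back is exactly what makes the fold bit lossless here: the two equal-looking UTC instants map to distinct TAI seconds.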

[Chris Barker]
Does that support the other way -- or do we never lose a leap second anyway? (showing ignorance here)
Alexander covered the Python part of this, so I'll answer the possible higher-level question: we haven't yet needed a "negative" leap second, and it's considered unlikely (but not impossible) that we ever will. That's because the Earth's rotation is inexorably slowing, so the mean solar day inexorably lengthens when measured in SI seconds. Other things can cause the Earth's rotation to speed up temporarily (like some major geological events), but they've only been able to overcome factors acting to slow rotation for brief periods, and never yet got near to overcoming them by a full second.

On 05/17/2018 12:13 PM, Tim Peters wrote:
How long before the earth stops rotating? When it does, will we be tide-locked with the sun, or will an earth day become an earth year? Inquiring-minds-want-to-know'ly yrs; -- ~Ethan~

On Fri, May 18, 2018 at 5:53 AM, Ethan Furman <ethan@stoneleaf.us> wrote:
Won't ever happen. A few thousand years ago, the planet heard the adage "one good turn deserves another", and interpreted it as an infinite loop. ChrisA

Ethan Furman wrote:
How long before the earth stops rotating?
Apparently about 1.9 trillion years.
When it does, will we be tide-locked with the sun, or will an earth day become an earth year?
Wikipedia says the main cause of the slowing is tidal effects from the moon, so probably it would become tide-locked with the moon and then not slow any further. Having a month-long day ought to make our current fears about climate change look like a lot of panic over nothing. However, the good news is that we won't have to worry about it. The sun will become a red giant and swallow the earth long before then. -- Greg

Greg Ewing schrieb am 18.05.2018 um 10:05:
So, does that mean we now need to hold our breath for 1.9 british trillion years or 1.9 american trillion years? Assuming you were referring to the French-Latin-Arabic based numbers and naming systems at all, that is... And anyway, what's that point doing there, right between the "1" and the "9" ? Stefan

Stefan Behnel wrote:
So, does that mean we now need to hold our breath for 1.9 british trillion years or 1.9 american trillion years?
Seeing as the time-to-red-giant is only about 5e9 years, I don't think it matters much either way. -- Greg

now we really have gotten OT... But thanks! that was my question! -CHB

On Thu, May 17, 2018 at 3:13 PM Tim Peters <tim.peters@gmail.com> wrote:
No, I did not. I did not realize that the question was about skipping a second instead of inserting it. Yes, regardless of whether it is possible given the physics of Earth rotation, negative leap seconds can be supported. They simply become "gaps" in PEP 495 terminology. Check out PEP 495 and read "second" whenever you see "hour". :-)

AstroPy solves for leap seconds [1][2] according to the IAU ERFA (SOFA) library [3] and the IERS-B and IERS-A tables [4]. IERS-B tables ship with AstroPy. The latest IERS-A tables ("from 1973 though one year into the future") auto-download on first use [5].

[1] http://docs.astropy.org/en/stable/time/#time-scales-for-time-deltas
[2] http://docs.astropy.org/en/stable/time/#writing-a-custom-format
[3] "Leap second day utc2tai interpolation" https://github.com/astropy/astropy/issues/5369
[4] https://github.com/astropy/astropy/pull/4436
[5] http://docs.astropy.org/en/stable/utils/iers.html

On Thursday, May 17, 2018, Alexander Belopolsky <alexander.belopolsky@gmail.com> wrote:

https://en.wikipedia.org/wiki/Leap_second :
On Thursday, May 17, 2018, Wes Turner <wes.turner@gmail.com> wrote:

On Thu, May 17, 2018 at 7:12 PM Wes Turner <wes.turner@gmail.com> wrote:
I've just tried it. Unfortunately, it does not seem to be compatible with PEP 495 datetime yet:
Maybe someone can propose a feature for astropy to return datetime(2016,12,31,23,59,59,fold=1) in this case.

On Thu, May 17, 2018 at 9:49 AM, Chris Barker via Python-ideas <python-ideas@python.org> wrote:
Not really. There are multiple time standards in use. Atomic clocks count the duration of time -- from their point of view, every second is the same (modulo relativistic effects). TAI is the international standard based on using atomic clocks to count seconds since a fixed starting point, at mean sea level on Earth.

Another approach is to declare that each day (defined as "the time between the sun passing directly overhead the Greenwich Observatory twice") is 24 * 60 * 60 seconds long. This is what UT1 does. The downside is that since the earth's rotation varies over time, this means that the duration of a UT1 second varies from day to day in ways that are hard to estimate precisely.

UTC is defined as a hybrid of these two approaches: it uses the same seconds as TAI, but every once in a while we add or remove a leap second to keep it roughly aligned with UT1. This is the time standard that computers use the vast majority of the time. Importantly, since we only ever add or remove an integer number of seconds, and only at the boundary in between seconds, UTC is defined just as precisely as TAI.

So if you're trying to measure time using UT1 then yeah, your computer clock is wrong all the time by up to 0.9 seconds, and we don't even know what UT1 is more precisely than ~milliseconds. Generally it gets slightly more accurate just after a leap second, but it's not very precise either before or after. Which is why no-one does this.

But if you're trying to measure time using UTC, then computers with the appropriate setup (e.g. at CERN, or in HFT data centers) routinely have clocks accurate to <1 microsecond, and leap seconds don't affect that at all.

The datetime module still isn't appropriate for doing precise calculations over periods long enough to include a leap second though, e.g. Python simply doesn't know how many seconds passed between two arbitrary UTC timestamps, even if they were in the past. -n -- Nathaniel J. Smith -- https://vorpus.org
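The last point is easy to demonstrate: datetime arithmetic assumes every minute has exactly 60 seconds, so it cannot account for the leap second inserted at the end of 2016.

```python
from datetime import datetime, timezone

a = datetime(2016, 12, 31, 23, 59, 0, tzinfo=timezone.utc)
b = datetime(2017, 1, 1, 0, 1, 0, tzinfo=timezone.utc)

# datetime reports 120 s, but a leap second was inserted in this
# interval, so 121 SI seconds actually elapsed.
print((b - a).total_seconds())  # 120.0
```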

In fairness, Pandas, datetime64, and Arrow are really the same thing. I don't know about Pendulum or Delorean. A common standard would be great, or at least strong interoperability. I'm sure the authors of those projects would want that... Arrow is entirely about interoperability, after all. On Thu, May 10, 2018, 7:11 PM Ethan Furman <ethan@stoneleaf.us> wrote:

You don't mention the option of allowing time.microseconds to be a float, and I was curious about that since if it did work, then that might be a relatively smooth extension of the current API. The highest value you'd store in the microseconds field is 1e6, and at values around 1e6, double-precision floating point has a precision of about 1e-10:

In [8]: 1e6 - np.nextafter(1e6, 0)
Out[8]: 1.1641532182693481e-10

So that could represent values to a precision of ~0.116 femtoseconds, or 116 attoseconds. Too bad. Femtosecond precision would cover a lot of cases; if you really need attoseconds then it won't work. -n

On Thu, May 10, 2018 at 1:30 PM, Ed Page <ed.page@ni.com> wrote:
-- Nathaniel J. Smith -- https://vorpus.org
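The same estimate can be reproduced without numpy, using math.nextafter (available since Python 3.9): the spacing between adjacent doubles just below 1e6 is exactly 2**-33.

```python
import math

# ULP (unit in the last place) of a double just below 1e6 microseconds.
ulp = 1e6 - math.nextafter(1e6, 0.0)
print(ulp)         # 1.1641532182693481e-10 microseconds, i.e. 2**-33
print(ulp * 1e-6)  # about 1.16e-16 seconds, ~116 attoseconds
```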

participants (14)
- Alexander Belopolsky
- Chris Angelico
- Chris Barker
- Chris Barker - NOAA Federal
- David Mertz
- Ed Page
- Ethan Furman
- Greg Ewing
- Guido van Rossum
- Nathaniel Smith
- Rob Speer
- Stefan Behnel
- Tim Peters
- Wes Turner