Hi,
Ned Deily closed old bugs reported on the macOS Tiger buildbot, since
that buildbot was retired 3 months ago (the builders are still
visible online, but the last builds were 3 months ago).
It seems like the oldest macOS buildbot is now macOS El Capitan (macOS
10.11, 2015). Does that mean that the minimum officially supported macOS
version is now macOS 10.11 El Capitan?
In my view, official "full" support requires a buildbot. Without a
buildbot, we can only provide weaker "best-effort" support; otherwise,
the risk of regressions is too high.
I failed to find any official and obvious list of CPython supported
platforms, so I wrote my own list:
http://vstinner.readthedocs.io/cpython.html#supported-platforms
My first motivation for this list was to get a simple list of
supported Windows versions, because I'm unable to follow the Windows
lifecycle (PEP 11 has a vague statement about Windows which requires
tracking the end-of-life date of each Windows release).
Victor
Hi all,
I agree that compression is often a good idea when moving serialized
objects around on a network, but for what it's worth, as a library author
I would always set compress=False and then handle compression myself as a
separate step. There are a few reasons for this:
1. Bandwidth is often pretty good, especially intra-node, on high
performance networks, or on decent modern disks (NVMe).
2. I often use different compression technologies in different
situations. LZ4 is a great all-around default, but often snappy, blosc, or
Zstandard are better suited. This depends strongly on the characteristics
of the data.
3. Very often data isn't compressible, or is already in some
compressed form (such as images), and so compressing only hurts you.
In general, my thought is that compression is a complex topic with enough
intricacies that setting a single sane default that works 70+% of the time
probably isn't possible (at least not with the applications that I get
exposed to).
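To make the "separate step" concrete, here is a minimal sketch of what I
usually end up doing (zlib is only a stdlib stand-in, and the helper names
are made up; in practice I'd swap in lz4, snappy, blosc, or Zstandard
depending on the data):

    import pickle
    import zlib  # stand-in codec; pick lz4/snappy/blosc/zstd based on the data

    def dumps_compressed(obj):
        # Serialize first, then compress as an explicit second step, so the
        # caller controls the codec (and whether to compress at all).
        raw = pickle.dumps(obj, protocol=pickle.HIGHEST_PROTOCOL)
        return zlib.compress(raw)

    def loads_compressed(payload):
        return pickle.loads(zlib.decompress(payload))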
Instead of baking a particular method into pickle.dumps, I would recommend
trying to solve this problem through documentation, pointing users to the
various compression libraries within the broader Python ecosystem, and
perhaps pointing to one of the many blog posts that discuss their strengths
and weaknesses.
Best,
-matt
Hi,
While PEP 574 (pickle protocol 5 with out-of-band data) is still in
draft status, I've made available an implementation in branch "pickle5"
in my GitHub fork of CPython:
https://github.com/pitrou/cpython/tree/pickle5
Also I've published an experimental backport on PyPI, for Python 3.6
and 3.7. This should help people play with the new API and features
without having to compile Python:
https://pypi.org/project/pickle5/
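Here is a quick, untested sketch of how the out-of-band API can be used
with the backport (the object and size are just for illustration):

    import pickle5 as pickle  # or the built-in pickle on the "pickle5" branch

    big = bytearray(b"x" * 10_000_000)  # any buffer-capable object

    buffers = []
    # With protocol 5, PickleBuffer views are passed to buffer_callback
    # instead of being copied into the pickle stream (out-of-band data).
    data = pickle.dumps(pickle.PickleBuffer(big), protocol=5,
                        buffer_callback=buffers.append)

    # The consumer hands the out-of-band buffers back, in order, to loads().
    restored = pickle.loads(data, buffers=buffers)
    assert bytes(restored) == bytes(big)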
Any feedback is welcome.
Regards
Antoine.
On the 3.7 branch, "make test" routinely fails to terminate. (Pretty
vanilla Ubuntu 17.10 running on a Dell laptop. Nothing esoteric at all.)
Lately, it's been one of the multiprocessing tests. After a long while
(~2000 seconds), I kill it, and then it complains many times about the
lack of a valid_signals attribute in the signal module:
======================================================================
ERROR: test_remove_signal_handler_error2
(test.test_asyncio.test_unix_events.SelectorEventLoopSignalTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/skip/src/python/cpython/Lib/unittest/mock.py", line 1191, in patched
    return func(*args, **keywargs)
  File "/home/skip/src/python/cpython/Lib/test/test_asyncio/test_unix_events.py", line 219, in test_remove_signal_handler_error2
    m_signal.valid_signals = signal.valid_signals
AttributeError: module 'signal' has no attribute 'valid_signals'
----------------------------------------------------------------------
Ran 1967 tests in 36.058s
FAILED (errors=362, skipped=11)
test test_asyncio failed
/home/skip/src/python/cpython/Lib/asyncio/base_events.py:605: ResourceWarning: unclosed event loop <_UnixSelectorEventLoop running=False closed=False debug=False>
  source=self)
Re-running test 'test_signal' in verbose mode
then reruns test_signal in verbose mode.
Earlier today, a run succeeded, so I'm guessing a race condition exists in
the test system. I recall encountering a similar problem a few weeks ago
and discovered this open ticket:
https://bugs.python.org/issue33099
Should I expect this as normal behavior?
Skip
This is it! We are down to THE FINAL WEEK for 3.7.0! Please get your
feature fixes, bug fixes, and documentation updates in before
2018-05-21 ~23:59 Anywhere on Earth (UTC-12:00). That's about 7 days
from now. We will then tag and produce the 3.7.0 release candidate.
Our goal continues to be to have no changes between the release
candidate and final; AFTER NEXT WEEK'S RC1, CHANGES APPLIED TO THE 3.7
BRANCH WILL BE RELEASED IN 3.7.1. Please double-check that there are
no critical problems outstanding and that documentation for new
features in 3.7 is complete (including NEWS and What's New items), and
that 3.7 is getting exposure and testing with our various platforms and
third-party distributions and applications. Those of us who are
participating in the development sprints at PyCon US 2018 here in
Cleveland can feel the excitement building as we work through the
remaining issues, including completing the "What's New in 3.7"
document and final feature documentation. (We wish you could all be
here.)
As noted before, the ABI for 3.7.0 was frozen as of 3.7.0b3. You
should now be treating the 3.7 branch as if it were already released
and in maintenance mode. That means you should only push the kinds of
changes that are appropriate for a maintenance release:
non-ABI-changing bug and feature fixes and documentation updates. If
you find a problem that requires an ABI-altering or other significant
user-facing change (for example, something likely to introduce an
incompatibility with existing users' code or require rebuilding of
user extension modules), please make sure to set the b.p.o issue to
"release blocker" priority and describe there why you feel the change
is necessary. If you are reviewing PRs for 3.7 (and please do!), be on
the lookout for and flag potential incompatibilities (we've all made
them).
Thanks again for all of your hard work towards making 3.7.0 yet
another great release - coming to a website near you on 06-15!
Release Managerly Yours,
--Ned
https://www.python.org/dev/peps/pep-0537/
--
Ned Deily
nad(a)python.org
Hi,
At the language summit this year, there was some discussion of PEP 575.
I wanted to simplify the PEP, but rather than modify that PEP, Nick
Coghlan encouraged me to write an alternative PEP instead.
PEP 576 aims to fulfill the same goals as PEP 575, but with fewer
changes, and it is fully backwards compatible.
The PEP can be viewed here:
https://github.com/python/peps/blob/master/pep-0576.rst
Cheers,
Mark.
P.S.
I'm happy to have discussion of this PEP take place via GitHub,
rather than the mailing list, but I thought I would follow the
conventional route for now.
My GitHub fork of the cpython repo was made a while ago, before a 3.7 branch
was created. I have no remotes/origin/3.7. Is there some way to create it
from remotes/upstream/3.7? I asked on GitHub's help forums. The only
recommendation was to delete my fork and recreate it. That seemed kind
of drastic, and I will do it if that's really the only way, but this seems
like functionality Git and/or GitHub probably supports.
Thx,
Skip
For the record: the only reason that I replied on GitHub was because the
proposal was not yet posted (as far as I know) to any mailing list.
Typically, a post is made to a mailing list more or less at the same
time as the PEP is created. In this case, there was a delay of a few days,
maybe also because of unrelated issues with the compilation of the PEPs.
Please don't forget to perform the following steps when adding a new
public C API:
* Document it in Doc/c-api/.
* Add an entry in the What's New document.
* Add it in Doc/data/refcounts.dat.
* Add it in PC/python3.def.
If you want to include it in the limited API, wrap its declaration with:
#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 >= 0x03080000
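/* new declaration goes here; hypothetical example:
   PyAPI_FUNC(int) Py_NewFunction(void); */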
#endif
(use the Python version in which the feature was introduced in the comparison)
If you don't want to include it in the limited API, wrap its declaration
with:
#ifndef Py_LIMITED_API
#endif
You are free to add private C API, but its name should start with _Py
or _PY, and its declaration should be wrapped with:
#ifndef Py_LIMITED_API
#endif