Before installing Python 3.5.3, I ran the test cases provided with
the source tarball.
Three tests failed. I figured out the reason for two of them, but one test
case is beyond my understanding.
Re-running test 'test_xmlrpc_net' in verbose mode
test_python_builders (test.test_xmlrpc_net.PythonBuildersTest) ... ERROR
ERROR: test_python_builders (test.test_xmlrpc_net.PythonBuildersTest)
Traceback (most recent call last):
File "/pkgs/Python-3.5.3/Lib/test/test_xmlrpc_net.py", line 17, in
builders = server.getAllBuilders()
File "/pkgs/Python-3.5.3/Lib/xmlrpc/client.py", line 1092, in __call__
return self.__send(self.__name, args)
File "/pkgs/Python-3.5.3/Lib/xmlrpc/client.py", line 1432, in __request
File "/pkgs/Python-3.5.3/Lib/xmlrpc/client.py", line 1134, in request
return self.single_request(host, handler, request_body, verbose)
File "/pkgs/Python-3.5.3/Lib/xmlrpc/client.py", line 1167, in
xmlrpc.client.ProtocolError: <ProtocolError for
buildbot.python.org/all/xmlrpc/: 503 Service Unavailable>
Ran 1 test in 0.215s
test test_xmlrpc_net failed
Can somebody explain the reason for this failure?
I just came across a code snippet that
defines a method with the name "__dict__" - like in:
The resulting class's instances can be assigned
dynamic attributes as usual, but one can never access
the actual instance attributes through instance.__dict__ -
the method is retrieved instead. Calling "vars" will also fail
on objects of this class.
This behavior is weird, and I believe it is actually a side effect
of implementation details in CPython.
I am not sure whether it should just:
1 - be left as is - whoever reuses __dict__ as a method had it coming
2 - have the CPython behavior documented
3 - be filed as a bug, to disallow overriding __dict__ in a class declaration
4 - be filed as a bug, so that the class __dict__ is not created when one is
explicitly defined in Python code (the same thing that happens when one
has "__slots__").
I have the feeling that (1) is just fine - but then, I am at least
posting this e-mail here.
Similar weird things happen when one creates a method named "__class__",
and possibly other names.
(I just checked that pypy3 mimics the behavior)
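Since the original snippet didn't make it into this message, here is a minimal sketch (the class and attribute names are my own invention) that reproduces the behavior described above:

```python
class Weird:
    # Defining a method named __dict__ shadows the instance-namespace
    # descriptor that type() normally provides on the class, so ordinary
    # attribute lookup on instances finds this function instead.
    def __dict__(self):
        return "a bound method, not the instance namespace"

w = Weird()
w.attr = 1          # dynamic attribute assignment still works: CPython
                    # writes to the instance dict slot directly
print(w.attr)       # 1 -- normal attribute lookup still finds it
print(w.__dict__)   # retrieves the bound method instead of {'attr': 1}
print(callable(w.__dict__))  # True
```

vars(w) goes through the same attribute lookup, so it likewise hands back the method rather than the real namespace, which is presumably what "fail" means above.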
The deadline is a week from today, April 18th 2018. The original
announcement follows:
It’s that time again: time to start thinking about the Python Language
Summit! The 2018 summit will be held on Wednesday, May 9, from 10am to
4pm, at the Huntington Convention Center in Cleveland, Ohio, USA. Your
befezzled and befuddled hosts Barry and Larry will once more be behind
the big desk in front.
The summit’s purpose is to disseminate information and spark
conversation among core Python developers. It’s our yearly opportunity
to get together for an in-person discussion, to review interesting
developments of the previous year and hash out where we’re going next.
And we have lots to talk about! 3.7 is in beta, and we've all
collectively started work on 3.8 too.
As before, we’re using Google Forms to collect signups. Signups are
open now; the deadline to sign up is Wednesday April 18th, 2018 (AoE).
Please do us a favor and sign up sooner rather than later. The signup
form is simpler this year--I bet most people can be done in less than
One difference from last year: there are now *two* forms. The first
form is for signing up to attend (the "Request For Invitation" form),
and the second form is for proposing a talk. Please note: if you want to
present, you still need to fill out the Request For Invitation form
too. (Yes, it's more complicated this way, sorry. But having both on
the same form kind of enforced a one-to-one mapping, and it's really a
many-to-many mapping: one person might propose multiple talks, and one
talk might have multiple presenters. Overall this should be less
You can find links to *both* forms on the official Python Language
Summit 2018 page:
A few notes:
* There will be lightning talks! Signups will only be available
during the Language Summit itself.
* You don’t need to be registered for PyCon in order to attend the summit!
* We’ll have badge ribbons for Language Summit participants, which
we’ll hand out at the summit room in the morning.
* We're inviting Jake Edge from Linux Weekly News to attend the summit
and provide press coverage again. Jake’s done a phenomenal job of
covering the last few summits, providing valuable information not
just for summit attendees, but also for the Python community at
large. Jake’s coverage goes a long way toward demystifying the
summit, while remaining respectful of confidential information
that’s deemed “off the record” ahead of time by participants.
One big final note (please read this!):
When using Google Forms, you /can/ edit your responses later! When
you fill out the form and hit Submit, the submission complete page
(the one that says "Thanks for playing!") will have a link on it
labeled "Edit your response". BOOKMARK THIS LINK! You can use this
link at /any time/ to edit your response, up to the point that
signups close on April 18th. Keep in mind, you'll need to bookmark
each response independently: once for signing up to attend ("Request
For Invitation"), and once for each talk proposal you submit.
Again, /please/ be sure to save this bookmark yourself--we don't
know how to find the link for you later if you don't save it.
We hope to see you at the summit!
On 2018-04-10 13:49, Nick Coghlan wrote:
> If it's only a semantic level change in the way the macro gets
> expanded, then whether or not it needs an ABI version guard gets
> judged on a case-by-case basis, and in this particular case, my view
> would be that developers should be able to write extensions using the
> stable ABI that accept function subclasses on 3.8+, without having to
> *require* the use of 3.8+ to import their module.
I don't really get this paragraph, but in any case I decided to *not*
change PyCFunction_Check in PEP 575. It doesn't seem worth the trouble
as this macro is probably not often used anyway. Also, it's hard to
guess what it should be replaced with: why would extensions be calling
On 2018-04-08 05:17, Nick Coghlan wrote:
> Changing macro definitions doesn't break the stable ABI, as long as
> the *old* macro expansions still do the right thing.
To me, it looks like a bad idea to change macros. Imagine that the
PyCFunction_Check macro changes in Python 3.8. Then an extension module
compiled on 3.7 (but run on 3.8) would behave differently from the same
extension compiled on 3.8. I cannot imagine that this is in line with
the "stable ABI" philosophy.
It's been a long while since I rebuilt Python from the Git source. I
tried for the first time the other day. Everything passed except
test_poplib and test_asyncio. The former just runs and runs and runs.
Here's the first traceback I encounter when executing ./python
test_stls_context (__main__.TestPOP3Class) ... Exception in thread Thread-16:
Traceback (most recent call last):
File "/home/skip/src/python/cpython/Lib/threading.py", line 917, in
File "Lib/test/test_poplib.py", line 227, in run
File "/home/skip/src/python/cpython/Lib/asyncore.py", line 207, in loop
File "/home/skip/src/python/cpython/Lib/asyncore.py", line 150, in poll
File "/home/skip/src/python/cpython/Lib/asyncore.py", line 87, in read
File "/home/skip/src/python/cpython/Lib/asyncore.py", line 83, in read
File "/home/skip/src/python/cpython/Lib/asyncore.py", line 422, in
File "Lib/test/test_poplib.py", line 194, in handle_read
File "Lib/test/test_poplib.py", line 174, in _do_tls_handshake
File "/home/skip/src/python/cpython/Lib/ssl.py", line 1108, in do_handshake
ssl.SSLError: [SSL: SSLV3_ALERT_CERTIFICATE_UNKNOWN] sslv3 alert
certificate unknown (_ssl.c:1049)
I thought perhaps I was missing something, but _ssl built just fine. I
believe I have a recent enough version of libssl (1.0.2g), and I'm on
a pretty vanilla system (up-to-date Ubuntu 17.10). SSL-wise:
% apt search ssl | grep installed | egrep '^lib' | egrep ssl
libgnutls-openssl27/artful,now 3.5.8-6ubuntu3 amd64 [installed]
libio-socket-ssl-perl/artful,artful,now 2.050-1 all [installed]
libnet-smtp-ssl-perl/artful,artful,now 1.04-1 all [installed]
libnet-ssleay-perl/artful,now 1.80-1build1 amd64 [installed]
libssl-dev/artful-updates,artful-security,now 1.0.2g-1ubuntu13.4 amd64
1.0.2g-1ubuntu13.4 all [installed,automatic]
Any clues about what I might be missing from my setup would be appreciated.
Not sure what was wrong with test_asyncio. I ran it in isolation and it passed.
I wonder if it's useful to update the cgitb module, in particular the
I see some possible improvements:
1. In both the text and HTML versions:
When the frame is the module itself, there are no parameters (displayed as
'()'). I think the parentheses are unnecessary there. Perhaps they should be
removed? Or perhaps it's better to keep them for backward compatibility?
### example for the text version ###
$ python3 demo.py
/home/stephane/src/cgitest/demo.py in <module>()
7 def func1(a, b):
### end of example ###
2. In the HTML version only:
a. If the executed code is in <module>: in this case, the name is not shown
in the HTML version because the angle brackets are interpreted as an
HTML tag (see the picture in attachment).
b. Update the style of the HTML and/or use HTML5. It would be
prettier, but it would be a big change for probably too little benefit.
What do you think about these? I can report bugs and send pull requests
for them, but I would prefer to get feedback first.
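To make point 1 concrete, here is a small sketch that feeds an exception to cgitb.text() directly (the function name is invented for illustration; this assumes an interpreter where cgitb is still in the stdlib):

```python
import sys
import cgitb


def func1(a, b):
    return a / b  # raises ZeroDivisionError when b == 0


try:
    func1(7, 0)
except ZeroDivisionError:
    # cgitb.text() renders the same report the text-mode hook would print.
    report = cgitb.text(sys.exc_info())

# The module-level frame is rendered with the empty '()' parameter list
# that the post proposes removing; function frames show real arguments.
print("<module>" in report)
print("func1" in report)
```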
On 7 April 2018 at 04:13, Steve Dower <steve.dower(a)python.org> wrote:
> Better to deprecate it before it becomes broken, in my opinion.
> Having someone willing and able to review and merge changes is the best
> criteria for whether a module is still supported or not.
I think there's a difference between not being willing to add
enhancements, and not fixing bugs. The issue that originally triggered
this discussion was an enhancement request, and I don't think it's
unreasonable to declare cmd as "stable - no further enhancements will
be made or accepted" while still considering it as supported for
bugfixes. If significant bugs in cmd are remaining unfixed, then
that's a somewhat different matter.
The fact that pdb uses it, the advantage of having something in
the stdlib for users without easy access to "pip install", *plus* the
general principle of "if it isn't broken, don't fix it" make me feel
that the best solution would be to document that extended replacements
such as cmd2 exist on PyPI, but to retain cmd as supported while not (in
principle) accepting further enhancements (leaving the door open for
interested core devs to merge enhancements on a case-by-case basis if
they have a personal interest in doing so).
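For readers who haven't used it, cmd's API surface is small; a minimal interpreter (names invented for illustration) looks like this:

```python
import cmd


class Greeter(cmd.Cmd):
    """Tiny line-oriented interpreter built on the stdlib cmd module."""
    prompt = "(greet) "

    def do_hello(self, arg):
        """hello [name] -- print a greeting."""
        print("hello", arg or "world")

    def do_quit(self, arg):
        """quit -- exit the loop."""
        return True  # a truthy return value stops cmdloop()


# Normally you'd call Greeter().cmdloop() for an interactive session;
# onecmd() dispatches a single command line, which is handy for testing.
Greeter().onecmd("hello python-dev")  # prints: hello python-dev
```

This dispatch-by-method-name pattern (do_hello, do_quit) is exactly what pdb builds on.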
On 2018-04-04 17:56, Guido van Rossum wrote:
> It would be helpful if you explained the context of your request.
The context is PEP 575. I guess my question is mostly about
PyCFunction_Check(). I will not be able to keep it 100% backwards
compatible simply because the goal of that PEP is precisely changing the
classes of some objects.
Now the question is: am I allowed to change the implementation of
PyCFunction_Check()? If it's considered part of the stable ABI, then the
answer is immediately "no".
By the way, does anybody happen to know why the PyCFunction_* functions
are undocumented? Is it just an oversight in the docs or is it intentional?
But regardless of the context, I think that the question "Are
undocumented functions part of the stable ABI?" should be answered in