In order to make reviewing PEP 576/580 easier and possibly take some
ideas from one PEP to the other, let me state the one fundamental
difference between these PEPs. There are many details in both PEPs that
can still change, so I'm focusing on what I think is the big structural
difference.
To be clear: I'm referring to the PEP 576 version at
(this really should be merged in the main PEP repo).
Both PEPs add a hook for fast calling of C functions. However, they do
that on a different level. Let's trace what _PyObject_FastCallKeywords()
currently does when acting on an instance of builtin_function_or_method:

A. _PyObject_FastCallKeywords()
   calls
B. _PyCFunction_FastCallKeywords()
   which calls
C. _PyMethodDef_RawFastCallKeywords()
   which calls
D. the actual C function (*ml_meth)()
PEP 576 hooks the call A->B while PEP 580 hooks the call B->D (getting
rid of C).
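For concreteness, here is that chain as Python-level pseudocode (an
illustration only, not CPython source), marking where each PEP hooks in:

def object_fastcall(obj, args, kwnames):         # A: _PyObject_FastCallKeywords
    # PEP 576 hooks this A->B call: a type may supply its own
    # B-level entry point, so B can be anything.
    return cfunction_fastcall(obj, args, kwnames)

def cfunction_fastcall(func, args, kwnames):     # B: _PyCFunction_FastCallKeywords
    # PEP 580 hooks here instead: B dispatches straight to D through
    # a C-level call protocol, eliminating step C below.
    return methoddef_rawfastcall(func, args, kwnames)

def methoddef_rawfastcall(func, args, kwnames):  # C: _PyMethodDef_RawFastCallKeywords
    return func.ml_meth(args, kwnames)           # D: the actual C function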
Advantages of the high-level hook (PEP 576):
* Much simpler protocol than PEP 580.
* More general since B can be anything.
* Not being forced to deal with "self".
* Slightly faster when you don't care about B.
Advantages of the low-level hook (PEP 580):
* No need to duplicate the code from B (see the various existing
  FastCall implementations in CPython).
* Enables certain optimizations because other code can make assumptions
about what B does.
In my personal opinion, the last advantage of PEP 580 is really
important: some existing optimizations depend on it, and it also allows
extending the protocol in a "performance-compatible" way -- that is,
extensions that callers can easily take advantage of.
Anyway, it would be good to have some guidance on how to proceed here. I
would really like something like PEP 580 to be accepted and I'm willing
to put time and effort into achieving that.
On Wed, Jun 27, 2018 at 12:01 PM, Eric V. Smith <eric(a)trueblade.com> wrote:
> >>> def test():
> >>>     spam = 1
> >>>     ham = 2
> >>>     vars = [key1+key2 for key1 in locals() for key2 in locals()]
> >>>     return vars
> >>>
> >>> Wanna guess what that's gonna return?
Guessing aside, messing around with locals() isn't really helpful for
the common case.
But the real problem I see with all of this is the distinction between
generator expressions and comprehensions:
an assignment expression occurring in a list, set or dict comprehension or
in a generator expression (below collectively referred to as
"comprehensions") binds the target in the containing scope, honoring a
nonlocal or global declaration for the target in that scope, if one exists.
For the purpose of this rule the containing scope of a nested comprehension
is the scope that contains the outermost comprehension. A lambda counts as
a containing scope.
It seems everyone agrees that scoping rules should be the same for
generator expressions and comprehensions, which is a good reason for
python3's non-leaking comprehensions. In Python 2 the two differ:
In [5]: i = 0
In [6]: l = [i for i in range(3)]
In [7]: i
Out[7]: 2
In [8]: i = 0
In [9]: g = (i for i in range(3))
In [10]: i
Out[10]: 0
In [11]: for j in g:
   ....:     pass
In [12]: i
Out[12]: 0
so in Python 2, comprehensions and generator expressions behave
differently -- not great. In Python 3:
In [4]: i = 0
In [5]: l = [i for i in range(3)]
In [6]: i
Out[6]: 0
In [7]: g = (i for i in range(3))
In [8]: i
Out[8]: 0
In [9]: list(g)
Out[9]: [0, 1, 2]
In [10]: i
Out[10]: 0
The loop name doesn't "leak", and comprehensions and generator expressions
are the same in this regard -- nice.
So what about:
l = [x := i for i in range(3)]
g = (x := i for i in range(3))
Is there any way to keep these consistent if the "x" is bound in the
regular local scope?
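Under the quoted rule, as I read it, the two forms diverge; a sketch
(3.8-style syntax, my own example):

l = [x := i for i in range(3)]
print(x)   # 2 -- bound immediately, in the containing scope

g = (y := i for i in range(3))
# y is not bound yet: nothing has run
list(g)
print(y)   # 2 -- only bound once the generator is consumed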
Note that this thread is titled "Informal educator feedback on PEP 572".
As an educator -- this is looking harder and harder to explain to newbies...
Though easier if any assignments made in a "comprehension" don't "leak out".
Which does not mean that we'd need a "proper" new local scope (i.e.
locals() returning something new) -- as long as the common usage doesn't
leak names.
> >> I'm not singling out Chris here, but these discussions would be easier
> >> to follow and more illuminating if the answers to such puzzles were
> >> presented when they're posed.
Well, I think the point there was that it wasn't obvious without running
the code -- and that point is made regardless of the answer.
Christopher Barker, Ph.D.
Emergency Response Division
NOAA/NOS/OR&R (206) 526-6959 voice
7600 Sand Point Way NE (206) 526-6329 fax
Seattle, WA 98115 (206) 526-6317 main reception
PEP 544 specifies this address as "Discussions-To" so I hope I'm at the
right place.
I think protocols as defined in the PEP are a very interesting idea and I'm
thinking of ways of applying them. The first use case is in the context of
the attrs library.
attrs has a number of functions that work only on attrs classes; asdict for
example (turns an attrs class into a dictionary recursively). We can't
really type hint this properly using nominal subtyping since attrs classes
don't have an exclusive ancestor. But it sounds like we could use
structural subtyping (protocols) instead.
An attrs class has a special class-level field, __attrs_attrs__, which
holds the attribute definitions. So maybe we can define a protocol:

class AttrsClass(Protocol):
    __attrs_attrs__: ClassVar[Tuple[Attribute, ...]]
then we could define asdict as (simplified):

def asdict(inst: AttrsClass) -> Dict[str, Any]:
    ...
and it should work out. My question is how to actually add this protocol to
attrs classes. Now, we currently have an attrs plugin in mypy so maybe some
magic in there could make it happen in this particular case.
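For concreteness, here is a self-contained sketch of that idea (my own
illustration, not attrs code; assumes typing_extensions for Protocol and
a deliberately non-recursive asdict):

from typing import Any, ClassVar, Dict, Tuple

import attr
from attr import Attribute
from typing_extensions import Protocol

class AttrsClass(Protocol):
    __attrs_attrs__: ClassVar[Tuple[Attribute, ...]]

def asdict(inst: AttrsClass) -> Dict[str, Any]:
    # Simplified: the real attrs asdict recurses into nested attrs
    # classes, lists and dicts.
    return {a.name: getattr(inst, a.name) for a in inst.__attrs_attrs__}

@attr.s
class Point:
    x: int = attr.ib()
    y: int = attr.ib()

print(asdict(Point(1, 2)))  # {'x': 1, 'y': 2}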
My second use case is a small library I've developed for work, which
basically wraps attrs and generates and sticks methods on a class for
serialization/deserialization. Consider the following short program, which
does not typecheck on the current mypy.
import attr
from typing import Type
from typing_extensions import Protocol

class Serializable(Protocol):
    def __serialize__(self) -> int:
        ...

def make_serializable(cl: Type) -> Type:
    cl = attr.s(cl)
    cl.__serialize__ = lambda self: 1
    return cl

@make_serializable
class A:
    a: int = attr.ib()

def serialize(inst: Serializable) -> int:
    return inst.__serialize__()

serialize(A(1))

mypy reports:

error: Argument 1 to "serialize" has incompatible type "A"; expected
"Serializable"
error: Too many arguments for "A"
I have no desire to write a mypy plugin for this library. So I guess what
is needed is a way to annotate the class decorator, telling mypy it's
adding a protocol to a class. It seems to have trouble getting to this
conclusion by itself. (The second error implies the attrs plugin doesn't
handle wrapping attr.s, which is unfortunate but a different issue.)
I have found this pattern of decorating classes and enriching them with
additional methods at run-time really powerful, especially when used with
run-time parsing of type information (that often gets a bad rep on this
list, I've noticed :) The structural typing subsystem seems like a good fit
for use cases like this, and I think it'd be a shame if we couldn't make it
work.
I'm looking for someone in the Python community to help with a problem of
anti-patterns that keep showing up when dealing with SIGPIPE.
Specifically, I've noticed an anti-pattern developing where folks will
try to suppress broken-pipe errors on writes to stdout by setting
SIGPIPE's disposition to SIG_DFL. This is actually very common, and
also rather broken: for all but the simplest text filters, it means the
process can exit unexpectedly due to SIGPIPE being generated by, for
example, a remote connection the program makes.
I have attached a test program which shows the problem.
To use this program, pass one of several arguments:
# 1. Illustrate the 'ugly output to stderr' that folks want to avoid:
% python3 t0.py nocatch | head -1
# 2. Illustrate the anti-pattern; the program exits at about line 47,
which most folks do not understand:
% python3 t0.py dfl | head -1
# 3. Show a better solution where we catch the pipe error and clean up to
avoid the message:
% python3 t0.py | head -1
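For readers without the attachment, a minimal sketch of the
catch-and-clean-up approach from case 3 (my own illustration, not the
attached t0.py):

import os
import sys

def main():
    try:
        for i in range(10000):
            print(i)
        # Flush inside the try block so a closed pipe (e.g. from `head`)
        # raises BrokenPipeError here, not at interpreter shutdown.
        sys.stdout.flush()
    except BrokenPipeError:
        # Python flushes the standard streams on exit; point stdout at
        # /dev/null so that final flush cannot raise a second error.
        devnull = os.open(os.devnull, os.O_WRONLY)
        os.dup2(devnull, sys.stdout.fileno())
        sys.exit(1)

if __name__ == "__main__":
    main()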
I did a recent audit of a few code bases and saw this pattern pop up
often enough that I am asking if there's a way we can discourage the use
of "signal(SIGPIPE, SIG_DFL)" unless the user really understands what
they are doing.
I do have a pull request here: https://github.com/python/cpython/pull/6773
where I am trying to document this on the signal page, but I can't sort
out how to land this doc change.