[Python-Dev] PEP 550 v4

Koos Zevenhoven k7hoven at gmail.com
Wed Sep 6 16:39:56 EDT 2017

On Wed, Sep 6, 2017 at 8:16 PM, Guido van Rossum <guido at python.org> wrote:

> On Wed, Sep 6, 2017 at 8:07 AM, Koos Zevenhoven <k7hoven at gmail.com> wrote:
>> I think yield from should have the same semantics as iterating over the
>> generator with next/send, and PEP 555 has no issues with this.
> I think the onus is on you and Greg to show a realistic example that shows
> why this is necessary.
Well, regarding this part, it's just that things like

for obj in gen:
    yield obj

often get modernized into

yield from gen

And realistic examples of that include pretty much any normal use of yield.
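
Spelled out as a runnable sketch (gen, old_style and new_style are names
invented for illustration), the two forms are interchangeable for plain
iteration:

```python
def gen():
    yield 1
    yield 2

def old_style():
    # the pre-"yield from" idiom: loop and re-yield
    for obj in gen():
        yield obj

def new_style():
    # the modernized form
    yield from gen()

assert list(old_style()) == list(new_style()) == [1, 2]
```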

> So far all the argumentation about this has been of the form "if you have
> code that currently does this (example using foo) and you refactor it in
> using yield from (example using bar), and if you were relying on context
> propagation back out of calls, then it should still propagate out."
So here's a realistic example, with the semantics of PEP 550 applied to a
decimal.setcontext() kind of thing, but it could be anything using similar
context-local state:

def process_data_buffers(buffers):
    for buf in buffers:
        for data in buf:
            if data.tag == "NEW_PRECISION":
                # context_from() stands in for building a new decimal
                # context from the data
                setcontext(context_from(data))
            else:
                yield compute(data)

Code smells? Yes, but maybe you often see much worse things, so let's say
it's fine.

But then, if you refactor it into a subgenerator like this:

def process_data_buffer(buffer):
    for data in buffer:
        if data.tag == "NEW_PRECISION":
            setcontext(context_from(data))
        else:
            yield compute(data)

def process_data_buffers(buffers):
    for buf in buffers:
        yield from process_data_buffer(buf)

Now, if setcontext uses PEP 550 semantics, the refactoring broke the code,
because a generator introduces a scope barrier by pushing a LogicalContext
onto the stack, so the setcontext call is now local to the
process_data_buffer subgenerator. But the programmer is puzzled, because
with regular functions it had worked just fine in a similar situation
before they learned about generators:

def process_data_buffer(buffer, output):
    for data in buffer:
        if data.tag == "NEW_PRECISION":
            setcontext(context_from(data))
        else:
            output.append(compute(data))

def process_data_buffers(buffers):
    output = []
    for buf in buffers:
        process_data_buffer(buf, output)
    return output

In fact, this code had another problem, namely that the context state is
leaked out of process_data_buffers, because PEP 550 leaks context state
out of functions, but not out of generators. But we can easily imagine that
the unit tests for process_data_buffers *do* pass.
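
For comparison, this is how decimal's thread-local context already behaves
today, without any PEP 550 machinery: a bare setcontext() in a helper leaks
out to the caller, and localcontext() is the existing idiom that doesn't.
A sketch, assuming a fresh interpreter with the default precision of 28:

```python
from decimal import Context, getcontext, localcontext, setcontext

def leaky():
    # replaces the thread-local decimal context; nothing restores it
    setcontext(Context(prec=5))

assert getcontext().prec == 28  # the default in a fresh interpreter
leaky()
assert getcontext().prec == 5   # the helper's context leaked out

# the existing idiom that does NOT leak:
with localcontext(Context(prec=10)):
    assert getcontext().prec == 10
assert getcontext().prec == 5   # restored to whatever was active before
```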

But let's look at a user of the functionality:

def get_total():
    return sum(process_data_buffers(get_buffers()))

value = get_total() * compute_factor()

Now the code is broken: a setcontext(somecontext) made before this point
no longer has any effect afterwards, because get_total() leaks out another
context. Not to mention that our data buffer source now has control over
the behavior of compute_factor(). But if one is lucky, the last line was
written as

value = compute_factor() * get_total()

And hooray, the code works!

(Except for perhaps the code that is run after this.)
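
That luck comes from Python evaluating binary operands left to right: in
a() * b(), a() runs before b(), so compute_factor() finishes before
get_total() leaks its context. A toy version with stand-in functions:

```python
calls = []

def get_total():
    calls.append("get_total")
    return 10

def compute_factor():
    calls.append("compute_factor")
    return 3

# operands of * are evaluated left to right
value = compute_factor() * get_total()
assert calls == ["compute_factor", "get_total"]
assert value == 30
```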

Now this was of course a completely fictional example, and hopefully I
didn't introduce any bugs or syntax errors other than the ones I described.
I haven't seen code like this anywhere, but somehow we caught the problems
anyway.
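
For reference, the contextvars machinery that later shipped in Python 3.7
(PEP 567, not PEP 550) does not give generators their own LogicalContext
at all: a ContextVar.set() inside a generator is visible to whoever
iterates it. Variable names here are made up:

```python
import contextvars

# a made-up variable standing in for the decimal context
precision = contextvars.ContextVar("precision", default=28)

def subgen():
    # under PEP 567 this set() runs in the caller's context
    precision.set(5)
    yield "item"

list(subgen())               # drive the generator to completion
assert precision.get() == 5  # the change leaked out of the generator
```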

-- Koos

+ Koos Zevenhoven + http://twitter.com/k7hoven +
