On Mon, Oct 9, 2017 at 9:46 AM, Nick Coghlan firstname.lastname@example.org wrote:
On 8 October 2017 at 08:40, Koos Zevenhoven email@example.com wrote:
On Sun, Oct 8, 2017 at 12:16 AM, Nathaniel Smith firstname.lastname@example.org wrote:
On Oct 7, 2017 12:20, "Koos Zevenhoven" email@example.com wrote:
Unfortunately, we actually need a third kind of generator semantics, something like this:
    @contextvars.caller_context
    def genfunc():
        assert cvar.value is the_value
        yield
        assert cvar.value is the_value

    with cvar.assign(the_value):
        gen = genfunc()

    with cvar.assign(1234567890):
        try:
            next(gen)
        except StopIteration:
            pass
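(The `cvar.assign` and `caller_context` names above are PEP 555 proposals and do not exist anywhere yet. As a rough approximation of the intended semantics, here is a sketch built on the stdlib `contextvars` module that exists today in Python 3.7+: the context is snapshotted when the generator function is *called*, and every `next()` step runs inside that snapshot, so later changes in the caller do not leak in. The decorator name is borrowed from the example; everything else is my own scaffolding.)

```python
import contextvars
import functools

def caller_context(genfunc):
    # Hypothetical decorator (name from the example above): snapshot
    # the context when the generator function is CALLED, and run every
    # next() step inside that snapshot.
    @functools.wraps(genfunc)
    def wrapper(*args, **kwargs):
        ctx = contextvars.copy_context()  # capture at call time
        gen = genfunc(*args, **kwargs)
        def stepper():
            while True:
                try:
                    value = ctx.run(next, gen)  # step inside snapshot
                except StopIteration:
                    return
                yield value
        return stepper()
    return wrapper

cvar = contextvars.ContextVar("cvar", default=None)

@caller_context
def genfunc():
    assert cvar.get() == "the_value"  # sees the call-time value
    yield cvar.get()

cvar.set("the_value")
gen = genfunc()
cvar.set(1234567890)       # caller's context changes afterwards
result = next(gen)         # generator still sees "the_value"
```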
Nick, Yury and I (and Nathaniel, Guido, Jim, ...?) somehow just narrowly missed the reasons for this in discussions related to PEP 550. Perhaps because we had mostly been looking at it from an async angle.
That's certainly a semantics that one can write down (and it's what the very first version of PEP 550 did),
I do remember Yury mentioning that the first draft of PEP 550 captured something when the generator function was called. I think I started reading the discussions after that had already been removed, so I don't know exactly what it was. But I doubt that it was exactly the above, because PEP 550 uses set and get operations instead of "assignment contexts" like PEP 555 (this one) does.
We didn't forget it, we just don't think it's very useful.
Yeah, I'm not surprised you remember that :). But while none of us saw a good enough reason for it at that moment, I have come to think we absolutely need it. We need both the forest and the trees.
Sure, if you think of next() as being a simple function call that does something that involves state, then you might want the other semantics (with PEP 555, that situation would look like):
    def do_stuff_with_side_effects():
        with cvar.assign(value):
            return next(global_gen_containing_state)
Now stuff happens within next(..), and whatever happens in next(..) is expected to see the cvar assignment.
However, probably much more often, one just thinks of next(..) as "getting the next value", although some computations happen under the hood that one doesn't need to care about.
As we all know, in the real world, the use case is usually just to generate the Fibonacci sequence ;). And when you call fibonacci(), the whole sequence should already be determined. You just evaluate the sequence lazily by calling next() each time you want a new number. It may not even be obvious when the computations are made:
    fib_cache = [0, 1]

    def fibonacci():
        for i in itertools.count():
            if i < len(fib_cache):
                yield fib_cache[i]
            else:
                # not calculated before
                new = sum(fib_cache[-2:])
                fib_cache.append(new)
                yield new

    # (function above is thread-unsafe, for clarity)
(Important:) So in any situation where you want the outer context to affect the stuff inside the generator through next(), like in the do_stuff_with_side_effects example, the author of the generator function needs to know about it. And then it is OK to require that the author use a decorator on the generator function.
But when you just generate a pre-determined sequence of numbers (like fibonacci), the implementation of the generator function should not matter. If the outer context leaks in through next(..), however, the internal implementation does matter, and the abstraction is leaky. I don't want leaky behavior to be the default.
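To make the leak concrete, here is a small sketch using the stdlib `contextvars` module. The `precision` variable and `rounded_values` generator are made up purely for illustration: the generator reads the context on every step, so the caller's context at each next() call silently changes what "the next value" is.

```python
import contextvars

# Hypothetical context variable and generator, for illustration only.
precision = contextvars.ContextVar("precision", default=2)

def rounded_values(values):
    # Reads the context on EVERY step: the caller's context at each
    # next() call leaks into the results.
    for v in values:
        yield round(v, precision.get())

gen = rounded_values([3.14159, 2.71828])
first = next(gen)            # caller's default precision: 2
token = precision.set(4)     # caller changes context between steps
second = next(gen)           # same generator, different behavior
precision.reset(token)
```

Here first == 3.14 but second == 2.7183: the caller's context changed the output mid-sequence, even though conceptually the sequence was fixed when the generator was created. That is exactly the leak described above, and it happens without the generator's author opting in.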