[Python-ideas] Possible PEP 380 tweak
Guido van Rossum
guido at python.org
Thu Oct 28 01:46:30 CEST 2010
Nick & Jacob,
Unfortunately other things are in need of my attention and I am
quickly lagging behind on this thread.
I'll try to respond to some issues without specific quoting.
If GeneratorReturn and finish() can be implemented in pure user code,
then I think it should be up to every (framework) developer to provide
their own API, using whatever constraints they choose. Without specific
use cases it's hard to reason about API design. Still, I think it is
reasonable to offer some basic behavior on the generator object, and I
still think that the best compromise here is to let g.close() extract
the return value from StopIteration if it catches it. If a framework
decides not to use this, fine. For a user working without a framework
this is still just a little nicer than having to figure out the
required logic yourself.
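Roughly, what I have in mind could be sketched in pure Python like this
(the helper name is invented, not a proposal for the actual method, and
it assumes PEP 380's convention that the return value rides on the
StopIteration):

def close_and_return(gen):
    # Sketch only: do what close() does, but hand back the return value.
    try:
        gen.throw(GeneratorExit)
    except StopIteration as exc:
        # The generator caught GeneratorExit and returned a value;
        # under PEP 380 that value is carried on the StopIteration.
        return getattr(exc, 'value', None)
    except GeneratorExit:
        # Initial or exhausted generator, or one that let GeneratorExit out.
        return None
    else:
        # The generator yielded instead of exiting -- same complaint
        # close() makes today.
        raise RuntimeError("generator ignored GeneratorExit")

def averager():
    total = count = 0
    try:
        while True:
            total += yield
            count += 1
    except GeneratorExit:
        return total / count   # needs PEP 380's return-with-value syntax

g = averager()
next(g)
g.send(10)
g.send(20)
print(close_and_return(g))     # 15.0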
I am aware of four relevant states for generators. Here's how they
work (in current Python):
- initial state: execution is poised at the top of the function.
g.throw() always bounces back the exception. g.close() moves it to the
final state. g.next() starts it running. g.send() requires a None
argument and is then the same as g.next().
- running state: the frame is active. None of g.next(), g.send(),
g.throw() or g.close() works -- they all raise ValueError.
- suspended state: execution is suspended at a yield. g.close() raises
GeneratorExit and if the generator catches this it can do whatever it
pleases. If it then raises StopIteration or GeneratorExit, g.close()
is happy; if it raises another exception, g.close() just passes that
through; if it yields a value, g.close() complains and raises
RuntimeError (see the small example after this list).
- finished (exhausted) state: the generator has returned. g.close()
always returns None. g.throw() always bounces back the exception.
g.next() and g.send() always raise StopIteration.
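To make the last case in the suspended state concrete, a throwaway
example:

def stubborn():
    try:
        yield 1
    except GeneratorExit:
        yield 2      # responding to close() with another yield

g = stubborn()
next(g)      # now suspended at "yield 1"
g.close()    # raises RuntimeError("generator ignored GeneratorExit")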
I would be in favor of adding an introspection API to distinguish
these four states and I think it would be a fine thing to add to
Python 3.2 if anyone finds the time to produce a patch (Nick? You
showed what these boil down to.)
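Roughly, I'd expect such a function to boil down to something like this
in CPython (the attribute checks are CPython-specific and the state
names here are placeholders):

def generator_state(gen):
    # Sketch only; relies on CPython's gi_running / gi_frame attributes.
    if gen.gi_running:
        return 'running'
    if gen.gi_frame is None:
        return 'finished'
    if gen.gi_frame.f_lasti == -1:
        return 'initial'      # created but not yet started
    return 'suspended'        # paused at a yield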
I note that in the initial state a generator has no choice in how to
respond because it hasn't yet had the opportunity to set up a
try/except, so in this state it acts pretty much the same as in the
exhausted state when receiving a throw() or close().
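A quick illustration with a throwaway generator:

def g():
    try:
        yield
    except KeyError:
        print('caught')    # never reached if thrown before starting

gen = g()
gen.throw(KeyError)    # KeyError propagates to the caller; the try block
                       # inside the generator was never entered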
Regarding built-in syntax for Go-like channels, let's first see an
implementation in userland become successful *or* see that it's
impossible to write an efficient one before adding more to the
language.
Note that having a different expansion of a for-loop based on the
run-time value or type of the iterable cannot be done -- the expansion
can only vary based on the syntactic form.
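That is, a loop like "for x in xs: print(x)" always runs the moral
equivalent of the following, regardless of what xs turns out to be at
run time:

xs = [1, 2, 3]          # stand-in for any iterable
it = iter(xs)
while True:
    try:
        x = next(it)
    except StopIteration:
        break
    print(x)            # the loop body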
There are a few different conventions for using generators and
yield-from; e.g. generators used as proper iterators with easy
refactoring; generators used as tasks where yield X is used for
blocking I/O operations; and generators used as "inverse generators"
as in the parallel_reduce() example that initiated this thread. I
don't particularly care about what kind of errors you get if a
generator written for one convention is accidentally used by another
convention, as long as it is made clear which convention is being used
in each case. Frameworks/libraries can and probably should develop
decorators to mark up the 2nd and 3rd conventions, but I don't think
the *language* needs to go out of its way to enforce proper usage.
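For example, such a marker could be as simple as this sketch (all the
names here are invented, not a proposal):

def task(genfunc):
    # Marker only: a framework's scheduler could check for this attribute
    # before driving the generator function's result as a task.
    genfunc._is_task = True
    return genfunc

def is_task(obj):
    return getattr(obj, '_is_task', False)

@task
def fetch(url):
    # Convention 2: "yield X" hands a blocking I/O request to the scheduler.
    data = yield ('read', url)
    print(len(data))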
--
--Guido van Rossum (python.org/~guido)