[Python-ideas] Possible PEP 380 tweak
Guido van Rossum
guido at python.org
Mon Oct 25 22:21:07 CEST 2010
On Mon, Oct 25, 2010 at 12:53 PM, Ron Adam <rrr at ronadam.com> wrote:
> This is how my mind wants to write this.
> def reduce_collector(func):
>     try:
>         value = yield                     # No value to yield here.
>         while True:
>             value = func((yield), value)  # or here.
>     except YieldError:
IIUC this works today if you substitute GeneratorExit and use
c.close() instead of next(c) below. (I don't recall why I split it out
into two different try/except blocks, but it doesn't seem necessary.)
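A runnable sketch of that substitution (the names and the driver below are illustrative, not code from the thread); each final value is retrieved by catching the StopIteration that throw(GeneratorExit) raises when the collector returns:

```python
def reduce_collector(func):
    value = None
    try:
        value = yield                     # No value to yield here.
        while True:
            value = func((yield), value)  # or here.
    except GeneratorExit:
        return value  # delivered to the caller via StopIteration

def parallel_reduce(iterable, funcs):
    collectors = [reduce_collector(func) for func in funcs]
    for coll in collectors:
        next(coll)                        # prime each collector
    for v in iterable:
        for coll in collectors:
            coll.send(v)
    results = []
    for coll in collectors:
        try:
            coll.throw(GeneratorExit)     # like close(), but lets us see the result
        except StopIteration as e:
            results.append(e.value)
    return results

print(parallel_reduce([1, 2, 3], [min, max]))  # → [1, 3]
```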
As for being able to distinguish next(c) from c.send(None), that's a
few language revisions too late. Perhaps more to the point, I don't
like that idea; it breaks the general treatment of things that return
None and throwing away values. (Long, long, long ago there were
situations where Python balked when you threw away a non-None value.
The feature was booed off the island and it's better this way.)
>         # next was called, not send.
>         yield value
I object to overloading yield for both a *resumable* operation and
returning a (final) value; that's why PEP 380 will let you write
"return value". (Many alternatives were considered but we always come
back to the simple "return value".)
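A minimal illustration of the PEP 380 idiom (made-up names, not from the thread): the final result comes out of "return value", and "yield from" delivers it to the delegating generator.

```python
def averager():
    total, count = 0, 0
    while True:
        value = yield          # resumable: receive values one at a time
        if value is None:
            break
        total += value
        count += 1
    return total / count       # PEP 380: the final result, via return

def driver(results):
    while True:
        result = yield from averager()  # gets averager's return value
        results.append(result)

results = []
d = driver(results)
next(d)                        # prime
for v in [10, 20, 30]:
    d.send(v)
d.send(None)                   # terminate the current averager
print(results)                 # → [20.0]
```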
> def parallel_reduce(iterable, funcs):
>     collectors = [reduce_collector(func) for func in funcs]
>     for v in iterable:
>         for coll in collectors:
>             coll.send(v)
>     return [next(c) for c in collectors]
I really object to using next() for both getting the return value and
the next yielded value. Jacob's proposal to spell this as c.close()
sounds much better to me.
> It nicely separates input and output parts of a co-function, which can be
> tricky to get right when you have to receive and send at the same yield.
I don't think there was a problem with this in my code (or if there
was, you didn't solve it).
> Maybe in Python 4k? Oh well. :-)
>> The interesting thing is that I've been dealing with generators used
>> as coroutines or tasks intensely on and off since July, and I haven't
>> had a single need for any of the three patterns that this example
>> happened to demonstrate:
>> - the need to "prime" the generator in a separate step
> Having a consumer decorator would be good.
> def consumer(f):
>     def wrapper(*args, **kwds):
>         coroutine = f(*args, **kwds)
>         next(coroutine)  # prime: advance to the first yield
>         return coroutine
>     return wrapper
This was proposed during the PEP 380 discussions. I still don't like
it because I can easily imagine situations where sending an initial
None falls totally naturally out of the sending logic (as it does for
my async tasks use case), and it would be a shame if the generator's
declaration prevented this.
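A sketch of what "falls naturally out of the sending logic" means (my example, not from the thread): a driver that sends back the previous reply on every iteration sends None on the first iteration for free, so the priming step needs no decorator.

```python
def echo_task():
    result = None
    for req in ["one", "two"]:
        result = yield req     # receive the reply to each request

def run(task):
    # Uniform driver: the first send(None) *is* the priming step.
    replies = []
    reply = None
    try:
        while True:
            request = task.send(reply)  # first iteration sends None
            reply = request.upper()
            replies.append(reply)
    except StopIteration:
        return replies

print(run(echo_task()))        # → ['ONE', 'TWO']
```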
> Or maybe it would be possible for python to autostart a generator if it's
> sent a value before it's started? Currently you get an almost useless
> TypeError. The reason it's almost useless is unless you are testing for it
> right after you create the generator, you can't (easily) be sure it's not
> from someplace inside the generator.
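For reference, the TypeError in question (example mine):

```python
def gen():
    while True:
        print((yield))

g = gen()
try:
    g.send(42)   # generator not yet started
except TypeError as e:
    print(e)     # "can't send non-None value to a just-started generator"
```

As Ron notes, nothing about the exception type ties it to the generator's start-up state, so a caller deeper in the stack can't easily tell it apart from a TypeError raised inside the generator's own code.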
I'd be okay with this raising a different exception (though for
compatibility it would have to subclass TypeError). I'd also be okay
with having a property on generator objects that let you inspect the
state. There should really be three states: not yet started, started,
finished -- and of course "started and currently executing" but that
one is already exposed via g.gi_running.
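(Editorial note: later Python versions did grow such an introspection hook — inspect.getgeneratorstate(), added in 3.2 — which exposes exactly these states:)

```python
import inspect

def ticker():
    yield 1

g = ticker()
print(inspect.getgeneratorstate(g))  # GEN_CREATED: not yet started
next(g)
print(inspect.getgeneratorstate(g))  # GEN_SUSPENDED: started, paused at a yield
g.close()
print(inspect.getgeneratorstate(g))  # GEN_CLOSED: finished
```

(GEN_RUNNING covers the "started and currently executing" case, corresponding to g.gi_running.)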
Changing the behavior on .send(val) doesn't strike me as a good idea,
because the caller would be missing the first value yielded!
IOW I want to support this use case but not make it the central
driving use case for the API design.
--Guido van Rossum (python.org/~guido)