PEP 288: Generator Attributes

Bengt Richter bokr at oz.net
Wed Dec 4 13:59:50 EST 2002


On Wed, 04 Dec 2002 08:41:39 GMT, "Raymond Hettinger" <vze4rx4y at verizon.net> wrote:

>[Bengt Richter]
>> I'd rather see generators extended in a backwards compatible way to allow
>> you to write the minimal example thus:
>>
>>         def mygen(data):
>>                while True:
>>                    print data
>>                    yield None
>>
>>         g = mygen.iter()       # mygen.__class__.iter() gets called with the function instance
>>         g(1)                   # prints 1 (arg name data is rebound to 1, then g.next() effect is done)
>>         g(2)                   # prints 2 (similarly)
>
>So, the behavior of mygen() is radically different depending on whether it
>is called with iter(mygen()) or mygen.iter().  It looks like the parameter
>string is being asked to pull double duty.

I'm not sure how you mean "double duty," but yes, backwards compatibility
would force a radical difference. (Hmm, since it is in the __future__, would
it be technically possible to drop the old way altogether?)

>How would you code an equivalent to:
>
>def logger(afile, author):
>    while True:
>    print >> afile, __self__.data, ':changed by', author
>        yield None

Well, that's a matter of choosing how you want to pass the data that you
are now passing via attribute manipulations separate from the calls.
I'd probably just pass it as a plain parameter, with the others as
optional parameters. Remember, I am proposing to (re)bind the
parameter names each time the generator is called with parameters,
just before resuming execution on the line after the last yield.

def logger(data, file=None, author=None):
    while data:
        print >> file, data, ':changed by', author
        yield None

g = logger.iter()
g(some_data, file('log_file.txt','a'), 'Author Name')   # actual param bindings replace opt params
g(other_data)                                           # ditto, but opt param bindings are undisturbed
...
g(None) # trigger end -- just an example of one possible way of doing it

>
>> but also allowing something with varying arguments, e.g.,
>>
>>     def mygen(*args, **kwds):
>
>Lesson 1 from a PEP author:  the more you include in
>a proposal, the less likely you are to get any of it.
>
Even if all the pieces are required to make the whole work nicely?

>
>> I.e., in addition to having a .next() method for backwards compatibility,
>> the generator object would have a def __call__(self, *args, **kwargs) method,
>> to make the generator look like a callable proxy for the function.
>
>Lesson 2:  the more clever the proposal, the more likely
>it is to be deemed unpythonic.

It's not meant to be "clever" for its own sake. But if g.next(params) is
easier to accept than g(params), I guess I can live with it ;-)
However, there is the compatibility issue, and I suspect it would be easier
to leave g.next() alone and parameterless, and to get the effect of the
first weird function call (which apparently currently binds the parameters
for subsequent use and returns the generator object) separately. I.e.,
"g = foo.iter(); g(parameters)" would do the same parameter binding and
execution as the old "g = foo(parameters); g.next()". If you then continued
with another g.next(), you would resume after the last yield and could refer
to the parameter-name locals as they were when that last yield was executed.
This would be the same either way. But if instead of g.next() you did
g(new_parameters), the new parameters would be received by g.__call__, and
before resuming foo at the line after the last yield, the generator would
reach into the frame state and rebind the parameter locals as if
g(new_parameters) were a foo(new_parameters) call. Then the resume would be
effected, just like g.next().
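
To make that concrete, here is a rough pure-Python emulation of the
semantics I have in mind -- the IterWrapper class and its bindings dict are
hypothetical stand-ins for the rebindable parameter locals, not how a real
implementation would work:

    class IterWrapper:
        # Hypothetical emulation: a shared dict plays the role of the
        # generator's rebindable parameter locals.
        def __init__(self, func):
            self.bindings = {}
            self.gen = func(self.bindings)  # create the underlying generator
        def next(self):                     # plain resume, no rebinding
            return self.gen.next()
        def __call__(self, **params):       # rebind "parameters", then resume
            self.bindings.update(params)
            return self.gen.next()

    def logger(b):  # b stands in for the rebindable parameter locals
        while True:
            print b['data'], ':changed by', b.get('author')
            yield None

    g = IterWrapper(logger)
    g(data='rev 1', author='Bengt')  # like the proposed g(parameters)
    g(data='rev 2')                  # the author binding is undisturbed
    g.next()                         # resumes with all bindings as they were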

>
>> When the generator's __call__ method was called, it would rebind the
>> associated function's arg locals each time, as if calling with arg
>> unpacking like mygen(*args, **kwargs) -- but then resuming like .next()
>
>See Lesson 1.
But this is part of making it all hang together.
>
>>
>> Possibly g() could be treated as another spelling of g.next(), skipping
>> arg name rebinding.
>
>See Lesson 2
OK, that is an unnecessary trick. All parameters can be made optional to
allow g(), just like an ordinary function.
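
For example (again in terms of the proposed, as-yet-hypothetical .iter() API):

    def mygen(data=None):
        while True:
            print data
            yield None

    g = mygen.iter()  # proposed API, not current Python
    g()               # allowed since all parameters are optional; prints None
    g(42)             # rebinds data; prints 42
    g()               # the rebound value persists; prints 42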

>
>> A default argument should be allowable, and def mygen(x, y=123):
>
>See Lesson 1
Also part of making it all work; if the parameter name binding is
done as for an ordinary function call, combined with the persistence
of locals in the generator state, it should all just follow naturally.
>
>> .. would have x and y rebound as you would expect on the first call, at
>> least. You could argue whether to use the original default for subsequent
>> calls or allow a rebinding of a default arg name to act like a default
>> for a subsequent call, but I think optional args should remain optional
>> for all calls.
>
>See Lesson 2 ;)
Again, I think it would just follow from doing the param rebinding in a
natural way. I'm thinking rebound optional parameters should persist in
the generator state and act as if they had been the default values in the
first call. Also a part of making the whole thing useful.
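
In terms of the proposed (hypothetical) API, that would look like:

    def mygen(x, y=123):
        while True:
            print x, y
            yield None

    g = mygen.iter()  # proposed API, not current Python
    g(1)              # prints 1 123 -- y gets its original default
    g(2, y=456)       # prints 2 456 -- y rebound
    g(3)              # prints 3 456 -- the rebound y persists as the new default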

>
>>
>> Note that a generator could now also be used with map, reduce, and filter
>> as functions as well as sequence sources. E.g., IWT you could write
>> compile as a string filter.
>>
>> For backwards compatibility, mygen.iter() would have to set a flag or
>> whatever so that the first g() call would not call mygen and get a
>> generator, but would just bind the mygen args and "resume" like .next()
>> from the very start instead.
>
>I'm sure there's another lesson here too ;)
>
Orthogonality breeds? ;-)

>
>> Calls to g.next() could be intermixed and would just bypass the rebinding
>> of the mygen arg names.
>
>Really?
The way I'm thinking of it, yes -- though not if g.next() took parameters
and were used in place of g.__call__. Note that if you used g.next() after
g = foo.iter() without either having default parameters for foo or a prior
g(some_params) call to make the bindings, you would have a possible
use-before-set error:
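
(Sketched with the proposed, hypothetical .iter() API:)

    def mygen(x):  # note: no default for x
        while True:
            print x
            yield None

    g = mygen.iter()
    g.next()  # error: x has never been bound -- use-before-set
    g(1)      # OK: binds x, then resumes; prints 1
    g.next()  # OK: resumes with x still bound; prints 1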

>
>> I.e., maybe .iter() could be generalized as a factory method of the
>> function class/type, which might mean that you could set up to capture
>> the frame of an ordinary function and allow yields from nested calls --
>> which would open more multitasking possibilities using generators.
>
>Yes!
>See my cookbook recipe for an example.
>It implements most of the PEP 288 in pure python using a factory function.
>http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/164044
Interesting. But I want the generator to be a controlling wrapper for
a function and its potential tree of nested calls, rather than conceiving
of a particular function as transmogrified by the presence of yield, with
all the rest of the current limitations.

I see your recipe as a clever workaround for (some) current generator
shortcomings, whereas I'd like to dream of a better generator. See PEP 667,
right? ;-/
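
The nested-call part, at least, can be mimicked in today's Python by running
the function in its own thread and handing values back through a queue.
Purely an illustrative sketch (StackGen and yield_ are made-up names, and a
real frame-capturing implementation would look nothing like this):

    import threading, Queue

    class StackGen:
        # Illustrative only: yield_ may be called at any depth of nested
        # calls, because the wrapped function runs in its own thread.
        def __init__(self, func, *args):
            self.out = Queue.Queue()
            self.resume = Queue.Queue()
            t = threading.Thread(target=self._run, args=(func,) + args)
            t.setDaemon(True)
            t.start()
        def _run(self, func, *args):
            self.resume.get()              # don't run until the first next()
            func(self.yield_, *args)
            self.out.put(StopIteration)    # sentinel: function finished
        def yield_(self, value):
            self.out.put(value)
            self.resume.get()              # block until the consumer wants more
        def next(self):
            self.resume.put(None)
            v = self.out.get()
            if v is StopIteration:
                raise StopIteration
            return v

    def inner(yield_, i):
        yield_(i * 10)                     # a "yield" from a nested call

    def outer(yield_):
        for i in range(3):
            inner(yield_, i)

    g = StackGen(outer)
    print g.next(), g.next(), g.next()     # prints: 0 10 20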

>
>> I.e., yield would look down the stack until it found the first call of a
>> __call__ of a generator, and apparently return from that, while the
>> generator kept the whole stack state from the yield down to itself for
>> continuing when called again.
>>
>> Also, it seems like this would make a generator *relate to and control*
>> an associated normal function instead of *being* a weirded-up function.
>> And yield would be decoupled enough that you could probably write
>> exec 'yield' and have it work from a dynamically determined place, so
>> long as there was the frame from a call to a generator's __call__
>> somewhere on the stack.
>
>See PEP 667, proposed mandatory drug testing for pythonistas ;)
Which aspect strikes you that way, the technical or the logistic?

Regards,
Bengt Richter


