PEP 289: Generator Expressions (please comment)
urnerk at qwest.net
Mon Nov 3 08:49:12 CET 2003
So list comprehensions don't just produce iterables, they produce
indexables, i.e. indexing [func(j) for j in range(n)] makes sense
(provided the index is in range).
So presumably a list generator, as a kind of "lazy list comprehension",
would be forced to iterate up to whatever index was asked of it (via
__getitem__) in order to return (genexpr)[n]. Of course, simple
generators don't implement __getitem__.
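That last point is easy to confirm in the interpreter -- a generator
expression is just an iterator, so there's nothing to index:

```python
# A list comprehension builds the whole list up front, so it is indexable.
squares = [j * j for j in range(10)]
assert squares[3] == 9

# A generator expression is only an iterator: it has no __getitem__,
# so squares_lazy[3] would raise a TypeError.
squares_lazy = (j * j for j in range(10))
assert not hasattr(squares_lazy, "__getitem__")
assert next(squares_lazy) == 0  # values arrive one at a time, on demand
```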
So what happens to members of the list that have been evaluated?
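For a plain generator, the answer is that already-yielded values are
simply gone unless the caller saves them -- a small illustration (the
names here are mine):

```python
gen = (j * j for j in range(5))
first = next(gen)   # 0 is produced and handed over
second = next(gen)  # 1; the generator keeps no record of the 0
# To retain evaluated members, the caller has to store them:
seen = [first, second]
seen.extend(gen)    # exhaust whatever remains
assert seen == [0, 1, 4, 9, 16]
```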
I'm wondering if there's any confusion about how a generator *inside*
a list generator would be evaluated. Right now it makes sense to
go [r for r in gen(n)] if gen(n) has a stop condition -- the
comprehension syntax will force gen(n) to the end of its run. Lazy
evaluation would suggest generators with no terminus might be enclosed,
e.g. d = (p for p in allprimes()). After which, d[99] would return the
hundredth prime (d[0] --> 2). So d would now be retained in memory
somewhere, but asking for a higher index would trigger further
iterations? And we can do slices too?
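One way to get the behaviour described -- index-driven evaluation, with
earlier results retained -- is a caching wrapper around the generator.
LazyList and the trial-division allprimes() below are my own sketch of
the idea, not anything PEP 289 itself proposes:

```python
import itertools

def allprimes():
    """Endless primes by trial division -- a stand-in for the
    hypothetical allprimes() above."""
    primes = []
    for n in itertools.count(2):
        if all(n % p for p in primes):
            primes.append(n)
            yield n

class LazyList:
    """Cache values already pulled from a generator, so that
    d[n] forces iteration only as far as index n."""
    def __init__(self, gen):
        self.gen = gen
        self.cache = []
    def __getitem__(self, i):
        while len(self.cache) <= i:
            self.cache.append(next(self.gen))
        return self.cache[i]

d = LazyList(allprimes())
assert d[0] == 2          # first prime
assert d[99] == 541       # hundredth prime; indexes 0..99 now cached
assert len(d.cache) == 100
```

As for slices: a bare generator doesn't support them either, but
itertools.islice(g, start, stop) covers the forward-only case without
any caching.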