PEP 289: Generator Expressions (please comment)

Alex Martelli aleax at aleax.it
Mon Nov 3 03:15:37 EST 2003


kirby urner wrote:

> So list comprehensions don't just produce iterables, they produce
> indexables, i.e. [func(j) for j in range(n)][10] makes sense (if
> n>10).

list comprehensions produce lists.  They are indexable, sliceable,
and support everything else that lists always do, yes.
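For example (just a quick sketch; the names are made up):

squares = [j*j for j in range(20)]   # the whole list is built right away

print(squares[10])    # 100 -- indexing works
print(squares[5:8])   # [25, 36, 49] -- so does slicing
print(len(squares))   # 20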


> So presumably a list generator, as a kind of "lazy list comprehension"

There's no such thing (neither existing nor proposed) as a "list
generator".  There are (going to be) "generator expressions", which
will produce iterators, not lists.  They will not be indexable, etc.

> would be forced to iterate up to whatever index was called on it (via
> __getitem__), in order to return (genexpr)[n].  Of course simple
> generators don't implement __getitem__.

Neither will the results of generator expressions, which will in fact
also be simple generators.
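For instance (a tiny sketch, assuming the PEP goes in as proposed;
the names are mine):

d = (j*j for j in range(20))       # a generator expression, per the PEP

print(hasattr(d, '__getitem__'))   # False -- no indexing support
print(iter(d) is d)                # True  -- it is simply its own iterator

try:
    d[10]
except TypeError:
    print('not subscriptable, as expected')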

> So what happens to members of the list that have been evaluated?
> Cached somehow?

Generator expressions will behave exactly as an equivalent generator
would.  No caching, no indexability, etc.
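In other words (again just a sketch with throwaway names), values are
produced on demand and handed straight over -- nothing is remembered:

g = (n*n for n in range(5))

print(list(g))    # [0, 1, 4, 9, 16] -- this walk consumes the generator
print(list(g))    # []               -- nothing was cached: it's exhausted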


> I'm wondering if there's any confusion about how a generator *inside*
> a list generator would be evaluated i.e. right now it makes sense to
> go [r for r in gen(r)] if gen(r) has a stop condition -- the
> comprehension syntax will force gen(r) to the end of its run.  Lazy

Yes, whatever kind of iterator gen(r) returns, it will be exhausted
by this notation just as it would be by, e.g., list(gen(r)) [which
happens to be exactly equivalent to the list comprehension you wrote].
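For instance (with a toy no-argument generator standing in for your gen):

def gen():
    yield 1
    yield 2
    yield 3

print([r for r in gen()])   # [1, 2, 3]
print(list(gen()))          # [1, 2, 3] -- same result, same full exhaustion

it = gen()
[r for r in it]             # runs it all the way to the end...
print(list(it))             # [] -- ...so nothing is left afterwards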

> evaluation would suggest generators with no terminus might be enclosed
> e.g. d=(p for p in allprimes()).  After which, d[100] would return a

d is not going to be indexable.  I would be interested in understanding
what part of the PEP gave you these misconceptions regarding what we
are proposing, so we can fix the PEP and the future docs.

> hundredth prime (d[0] --> 2).  So d[50] would now be retained in
> memory somewhere, but d[1000] would trigger further iterations?  And
> we can do slices too?

d = (p for p in allprimes())

will produce results exactly equivalent to

d = iter(allprimes())

[type(d) may happen to be different in the two cases, but d's
capabilities will be essentially identical].  More generally, e.g.:

d = (f(p) for p in X if g(p))

will have the same semantics as:

def _aux():
    for p in X:
        if g(p):
            yield f(p)
d = _aux()
del _aux

no more, no less.
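
To make that concrete (a sketch: X, f and g below are arbitrary
stand-ins I've made up, and the first d assumes the PEP is in):

X = range(10)

def f(p):
    return p * p

def g(p):
    return p % 2 == 0

d1 = (f(p) for p in X if g(p))    # the proposed generator expression

def _aux():
    for p in X:
        if g(p):
            yield f(p)
d2 = _aux()
del _aux

print(list(d1))   # [0, 4, 16, 36, 64]
print(list(d2))   # [0, 4, 16, 36, 64] -- identical results, same laziness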


Alex




