[Python-ideas] Introduce collections.Reiterable

Steven D'Aprano steve at pearwood.info
Tue Sep 24 06:21:05 CEST 2013

On Tue, Sep 24, 2013 at 12:42:18PM +0900, Stephen J. Turnbull wrote:
> Steven D'Aprano writes:
>  > A lot of work for virtually no benefit. Besides, who said that infinite 
>  > iterators are common? 
> Infinite, no.  Don't know the length until you're done, common.

Which is why iterators don't require a length. Another way of spelling 
"length is unknown" is "object has no __len__".
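
That convention is easy to check (a minimal illustration; the names are arbitrary):

```python
# A concrete container knows its length; an iterator over it does not.
items = [1, 2, 3]
it = iter(items)

print(hasattr(items, "__len__"))  # True
print(hasattr(it, "__len__"))     # False: no length until exhausted

# len() on the iterator fails for exactly that reason.
try:
    len(it)
except TypeError:
    print("len() is undefined for iterators")
```

(Some iterators do offer a `__length_hint__` for optimisation, but that's a hint, not a promise.)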

> Length nondeterministic and in principle unbounded, common.

Maybe it's the mathematician in me speaking, but I don't think very many 
unbounded iterators are found outside of maths sequences. After all, 
even if you were to iterate over every atom in the universe, that would 
be bounded, and quite small compared to some of the numbers 
mathematicians deal with... :-)

>  > If you care about infinite iterators, you can add your own "isinfinite" 
>  > flag on them. Personally, I wouldn't bother. I just consider this a case 
>  > for programming by contract: unless the function you are calling 
>  > promises to be safe with infinite iterators, you should not use them.
> But finite iterators can cause problems too (eg, Nick's length=1 googol
> range -- even with an attosecond processor, that will take a while to
> exhaust :-).  It would be nice if a program could choose its own value
> of "too big", and process "large finite" and "infinite" lists in the
> same way by taking "as much as possible".

You can already do that, although it requires a bit of manual work 
and preparation. Within Python, you can use itertools.islice, and take 
slices of everything to limit the number of items processed:

process(islice(some_iterator, MAXIMUM))
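
Spelled out as a runnable sketch -- the generator, MAXIMUM and process here are all illustrative stand-ins, not real API:

```python
from itertools import islice

def naturals():
    """An unbounded iterator: 0, 1, 2, ..."""
    n = 0
    while True:
        yield n
        n += 1

MAXIMUM = 5  # the caller's own notion of "too big"

def process(iterable):
    # Stand-in for any consumer that exhausts its input.
    return list(iterable)

# islice caps the unbounded source, so process() terminates safely.
result = process(islice(naturals(), MAXIMUM))
print(result)  # [0, 1, 2, 3, 4]
```

The point is that the cap lives at the call site, so each caller picks its own limit without the iterator needing to know anything about it.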

Or you can use your operating system to manage resource limits, e.g. on 
Linux systems ulimit -v seems to work for me:

py> def big():
...     while True:
...             yield 1
py> list(big())
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
MemoryError

It would be nice if Python allowed you to tune memory consumption within 
Python itself, but failing that, that's what the OS is for.

Mind you, I have repeatedly been bitten by accidentally calling list() on 
a too large iterator. So I'm sympathetic to the view that this is a hard 
problem to solve and Python should help solve it.

