generator slides review

andrea crotti andrea.crotti.0 at gmail.com
Mon Feb 3 23:22:14 CET 2014


2014-02-03 Terry Reedy <tjreedy at udel.edu>:
> On 2/2/2014 5:40 AM, andrea crotti wrote:
>>
> In general, use assert (== AssertionError) to check program logic (should
> never raise). Remember that assert can be optimized away. Use other
> exceptions to check user behavior. So I believe that ValueError is
> appropriate here. I think I also questioned the particular check.
>

Yes, that's right, thanks. Fixed it.

>
> 'Generator functions', which you labeled 'generators', are functions, not
> iterators. The generators they return (and the generators that generator
> expressions evaluate to) are iterators, and more.
>
>>>> type(a for a in 'abc')
> <class 'generator'>
>
> I am not sure whether 'specialized' or 'generalized' is the better term.

Well, it's true that they are functions, but they also behave
differently from ordinary functions.
I've read there was some debate at the time about whether to introduce
a separate keyword for defining generators; in the end they didn't,
but it wouldn't have been wrong imho..
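
Something like this sketch shows the distinction (names are mine, just
for illustration):

```python
def countdown(n):
    # generator function: *calling* it doesn't run the body,
    # it just builds and returns a generator object
    while n > 0:
        yield n
        n -= 1

print(type(countdown))   # <class 'function'>
g = countdown(3)
print(type(g))           # <class 'generator'>
print(list(g))           # [3, 2, 1]
```

So the def statement produces a plain function; only the object it
returns is the iterator.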


>
>> This was mainly to explain how something like
>> for el in [1, 2, 3]:
>>      print(el)
>>
>> can work,
>
>
> But it is not just that it *can* work; it *does* work. All the builtin xyz collection
> classes have a corresponding xyz_iterator class with a __next__ method that
> knows how to sequentially access collection items. We do not normally see or
> think about them, but they are there working for us every time we do 'for
> item in xyz_instance:'
>
>>>> [].__iter__()
> <list_iterator object at 0x00000000035096A0>
>
> In Python one could write the following:
>
> class list_iterator:
>   def __init__(self, baselist):
>     self.baselist = baselist
>     self.index = -1  # see __next__ for why
>
>   def __iter__(self):
>     return self
>   def __next__(self):
>     self.index += 1
>     try:
>       return self.baselist[self.index]
>     except IndexError:
>       raise StopIteration  # signal the for loop to stop

Yes, maybe that's a much better example to show, thank you.
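
Here's a runnable version of that sketch, with StopIteration raised
when the list runs out (a minimal sketch, not exactly your code):

```python
class list_iterator:
    def __init__(self, baselist):
        self.baselist = baselist
        self.index = -1  # incremented before each access, so the first item is 0

    def __iter__(self):
        return self

    def __next__(self):
        self.index += 1
        if self.index >= len(self.baselist):
            raise StopIteration  # this is how the for loop knows to stop
        return self.baselist[self.index]

for item in list_iterator(['a', 'b', 'c']):
    print(item)
```

That's essentially what the hidden builtin xyz_iterator classes do for us.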


>>
>> Yes this is intentionally buggy. The thing is that I wanted to show
>> that sometimes generating things makes it harder to debug, and delays
>> some errors, which are anyway there but would come up immediately in
>> case of a list creation.
>> I could not find a better non artificial example for this, any
>> suggestion is welcome..
>
>
> slide 1
> ---------
> def recip_list(start, stop):
>   lis = []
>   for i in range(start, stop):
>     lis.append(1/i)
>   return lis
>
> for x in recip_list(-100, 3):  # fail here
>   print(x)
>
> <immediate traceback that include the for line>
>
> slide 2
> -------
> def recip_gen(start, stop):
>   for i in range(start, stop):
>     yield 1/i
>
>
> for x in recip_gen(-100, 3):
>   print(x)  # fail here after printing 100 lines
> ...
> <delayed traceback that omits for line with args that caused problem>
>

That's already better. Another thing I just thought of is this (which
has actually happened to me a few times):

def original_gen():
    count = 0
    while count < 10:
        yield count
        count += 1


def consumer():
    gen = original_gen()
    # lis = list(gen)
    for n in gen:
        print(n * 2)

If I uncomment the line with "lis = list(gen)", it won't print
anything anymore, because a generator can only be looped over ONCE.
Maybe that's a better example of a possible drawback? (Well, maybe not
a drawback so much as a common mistake.)
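
A minimal self-contained sketch of that mistake:

```python
def original_gen():
    count = 0
    while count < 10:
        yield count
        count += 1

gen = original_gen()
lis = list(gen)    # this consumes the generator completely
print(lis)         # the full list of 10 values
print(list(gen))   # [] -- already exhausted, nothing left
# if the values are needed twice, keep the list and loop over
# that, or call original_gen() again for a fresh generator
```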


