Recursive generator
Ben C
spamspam at spam.eggs
Wed Feb 13 14:15:52 EST 2008
On 2008-02-12, Paul Hankin <paul.hankin at gmail.com> wrote:
> On Feb 12, 10:17 pm, Ben C <spams... at spam.eggs> wrote:
>> On 2008-02-12, Paul Rubin <> wrote:
>>
>> > Paul Hankin <paul.han... at gmail.com> writes:
>> >> def genDescendants(self):
>> >>     return chain([self], *[child.genDescendants()
>> >>                            for child in self.children])
>>
>> > That is scary. It generates an in-memory list the size of the
>> > whole subtree, at every level. Total memory consumption is maybe
>> > even quadratic, depending on the tree shape, but even if it's
>> > only linear, it's way ugly.
>>
>> This would probably be better (in terms of memory if not beauty):
>>
>> def genDescendants(self):
>>     return chain([self], *(child.genDescendants()
>>                            for child in self.children))
>
> No, it's the same I think.
Yes, I think you're right. The problem is really the *. The generator
expression (child.genDescendants() ...) has to be run to completion to
build the argument tuple passed to chain, so there's no laziness there.
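
To illustrate (a small sketch of my own, not code from the thread): *
unpacking consumes the whole iterable before the called function ever
runs, so every argument is built up front.

    from itertools import chain

    def noisy_gen():
        # Each argument announces itself as the generator is consumed.
        for i in range(3):
            print("building argument", i)
            yield [i]

    # The * exhausts noisy_gen() *before* chain is called, so all three
    # "building argument" lines print immediately.
    c = chain(*noisy_gen())
    print("chain object created")
    print(list(c))   # [0, 1, 2]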
> forgot that a generator using yield isn't the same as the same
> function that returns an equivalent generator, because one is lazy and
> the other isn't. To get the lazy behaviour back, you'd have to
> write....
>
> def genDescendants(self):
>     for g in chain([self], *(child.genDescendants()
>                              for child in self.children)):
>         yield g
Is that lazy? It still seems like (child.genDescendants() ...) would
have to be run all the way to the end before chain could be called.
Unless * is lazy. I don't think it is, but I don't know for sure.
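
For what it's worth, here is a sketch (mine, not from the thread) of a
version that avoids * unpacking altogether and so stays lazy all the way
down. It assumes the same kind of node class with a .children list that
the thread's genDescendants() is written against:

    class Node(object):
        def __init__(self, *children):
            self.children = list(children)

        def genDescendants(self):
            yield self                               # the node itself first
            for child in self.children:
                for node in child.genDescendants():  # recurse lazily, one node at a time
                    yield node

    # Nothing below the root is visited until the generator is advanced.
    tree = Node(Node(Node()), Node())
    gen = tree.genDescendants()
    print(next(gen) is tree)   # True -- only the root has been produced so far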