[Python-Dev] Graph exploration with generators
Shane Holloway (IEEE)
shane.holloway at ieee.org
Mon Aug 18 22:44:09 EDT 2003
Well, I've been holding onto this generators idea until after the Python 2.3 release, since it would be a Python 2.4 or later
feature. That, and I didn't want to delay the release by even the time it takes to read an email. ;)
Over the last few months, I've been using generators more and more to do lazy graph exploration. As a concrete example,
take the following binary tree type structure:
class Tree(object):
    def __init__(self, left=(), right=()):
        self.left = left
        self.right = right

    def __iter__(self):
        for each in self.left:
            yield each
        yield self
        for each in self.right:
            yield each
This works great for iterating over ad-hoc graphs, and since "left" and "right" simply have to be iterable, you can
chain trees to graphs, lists, or anything else iterable. However, quite a bit of speed is lost for each "chained" generator.
I assume this loss is due to restoring each generator's context in the chain. (I would need help verifying this assumption.)
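For concreteness, here is a rough timing sketch of the kind of comparison I have in mind. It reuses the Tree class above;
the build_tree helper, the iter_flat traversal, and the chosen depth are arbitrary choices of mine, so treat the numbers as
illustrative only. It compares the chained-generator traversal against a hand-flattened traversal that uses an explicit
stack, so no intermediate generator contexts have to be resumed:

import timeit

def build_tree(depth):
    # Build a complete binary tree of the given depth out of the Tree
    # class above; the leaves' children are empty tuples, which are
    # trivially iterable.
    if depth == 0:
        return Tree()
    return Tree(build_tree(depth - 1), build_tree(depth - 1))

def iter_flat(node):
    # In-order traversal with an explicit stack instead of nested
    # generators; yields the same items as iterating the Tree directly.
    stack = [(node, False)]
    while stack:
        item, visited = stack.pop()
        if not isinstance(item, Tree):
            for each in item:
                yield each
        elif visited:
            yield item
        else:
            stack.append((item.right, False))
            stack.append((item, True))
            stack.append((item.left, False))

tree = build_tree(12)
print(timeit.Timer('list(tree)',
                   'from __main__ import tree').timeit(10))
print(timeit.Timer('list(iter_flat(tree))',
                   'from __main__ import iter_flat, tree').timeit(10))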
What I would like to propose is a "yield *" type statement, where "yield *collection" would have the equivalent
functionality of "for item in collection: yield item". The above example would then appear as follows:
class TreeWithNewSyntax(object):
    def __init__(self, left=(), right=()):
        self.left = left
        self.right = right

    def __iter__(self):
        yield *self.left
        yield self
        yield *self.right
The syntax in particular might be an abuse of the call(*args) metaphor, but it avoids a new language keyword and conveys
the same general idea. By allowing this syntax, the cost of restoring the contexts of (count - 1) intermediate generators
in a chained solution could be circumvented. I admit that this is a pretty specific optimization area, but generators are
a very natural way to express this type of problem in code. An implementation would speed up these kinds of deeply nested
generator solutions.
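For comparison, here is a sketch of about as far as you can get today without new syntax. TreeWithChain is just an
illustrative name of mine; the delegation through each level is still there, the splicing simply happens inside
itertools.chain rather than in generator bytecode, and I have not measured whether that actually recovers much of the
lost speed:

import itertools

class TreeWithChain(object):
    # Same in-order traversal as Tree above, expressed with
    # itertools.chain instead of explicit for/yield loops.
    def __init__(self, left=(), right=()):
        self.left = left
        self.right = right

    def __iter__(self):
        return itertools.chain(self.left, (self,), self.right)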
I look forward to all your thoughts!
Thanks,
-Shane Holloway