[Twisted-Python] Design Pattern for Iterating Through Lists
I've come almost to the point of making a template for this kind of
operation:

    d = Deferred()
    d.callback(0)

    results = []

    def doItem(void, eggs):
        return whatever(eggs)

    def processResult(result):
        results.append(result)

    for spam in list:
        d.addCallback(doItem, spam['eggs'])
        d.addCallback(processResult)

    d.addCallback(lambda _: results)
    return d

The reason I'm calling back on doItem is that lambda will evaluate its
variables only once for the iteration, causing only the first evaluation
of spam['eggs'] to be passed for each item in the list. Is there a more
readable/efficient way of doing this? Note that I'm not using a
DeferredList because I want everything processed serially, not in
parallel.

-Ken
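[Editorial aside: the closure pitfall Ken describes can be reproduced in plain Python, with no Twisted involved. This is a minimal illustration with invented names, not code from the thread.]

```python
# A plain lambda captures the loop variable itself, not its current
# value, so every callback built this way sees the final iteration's
# value when it is eventually called.
items = [{'eggs': 1}, {'eggs': 2}, {'eggs': 3}]

late = [lambda: spam['eggs'] for spam in items]
print([f() for f in late])   # [3, 3, 3] -- all see the last item

# Binding the value as a default argument freezes it per iteration,
# which is the trick both Ken's doItem and Jp's elem=elem use:
bound = [lambda spam=spam: spam['eggs'] for spam in items]
print([f() for f in bound])  # [1, 2, 3]
```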
On Mon, 14 Mar 2005 10:37:15 -0600, Ken Kinder <ken@kenkinder.com> wrote:
> I've come almost to the point of making a template for this kind of
> operation:
>
>     d = Deferred()
>     d.callback(0)
>
>     results = []
>
>     def doItem(void, eggs):
>         return whatever(eggs)
>
>     def processResult(result):
>         results.append(result)
>
>     for spam in list:
>         d.addCallback(doItem, spam['eggs'])
>         d.addCallback(processResult)
>
>     d.addCallback(lambda _: results)
>     return d
>
> The reason I'm calling back on doItem is that lambda will evaluate its
> variables only once for the iteration, causing only the first
> evaluation of spam['eggs'] to be passed for each item in the list. Is
> there a more readable/efficient way of doing this? Note that I'm not
> using a DeferredList because I want everything processed serially, not
> in parallel.
Actually, I think you meant "last evaluation" rather than first. One
solution is just to wrap the above into a function:

    def serially(processor, items):
        results = []
        d = defer.Deferred()
        for elem in items:
            d.addCallback(lambda ign, elem=elem: processor(elem))
            d.addCallback(results.append)
        d.addCallback(lambda ign: results)
        d.callback(None)
        return d

Now you can call serially whenever you need this operation and forget
how unpleasant the implementation is :)

Another approach involves building up the Deferred chain gradually
rather than all at once:

    def serially(processor, items):
        results = []
        d = defer.Deferred()
        toBeProcessed = iter(items)
        def doItem(ignored):
            for elem in toBeProcessed:
                defer.maybeDeferred(
                    processor, elem).addCallback(
                    results.append).addCallback(
                    doItem)
                break
            else:
                d.callback(results)
        doItem(None)
        return d

It's a bit more code, and I'm still not sure how I feel about that abuse
of for loops (but is handling StopIteration manually any better? dunno),
but it avoids building up a large callback chain inside the Deferred,
which can be a good thing - a large number of synchronous results will
probably cause Python to raise a RuntimeError in your version or my
first version, since Deferred processes its callbacks recursively.

Hope this helps,

Jp
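[Editorial aside: Jp's second approach can be sketched without Twisted as a purely synchronous analogue. The names here are invented for illustration; in real Twisted code, defer.maybeDeferred is what lets processor return either a plain value or a Deferred.]

```python
# Synchronous analogue of the "build the chain gradually" pattern:
# pull exactly one item per call from a shared iterator, and schedule
# the next step from the result handler rather than chaining every
# callback up front.
def serially(processor, items, done):
    results = []
    toBeProcessed = iter(items)

    def doItem(_ignored):
        for elem in toBeProcessed:
            # Process one element, then re-enter doItem for the next.
            results.append(processor(elem))
            doItem(None)
            break
        else:
            # Iterator exhausted: deliver the accumulated results.
            done(results)

    doItem(None)

collected = []
serially(lambda x: x * 2, [1, 2, 3], collected.extend)
print(collected)  # [2, 4, 6]
```

With a synchronous processor this analogue still recurses one Python frame per item, which is exactly the RuntimeError caveat Jp raises; the benefit shows up when processing is genuinely asynchronous, since each doItem invocation then starts from a fresh stack.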
participants (2)
- Ken Kinder
- Jp Calderone