iterating over a generator while sending values

Hi,

I often use generators when I want to bake some smarts into my producers in a typical producer and consumer pattern. Twice in the past year or so, on different projects, after a few iterations on my producer/consumer code ("iterations", ha!) I would eventually find I wanted my consumer to send some simple feedback to the producer - when using generators, send works great for this!

I normally start here:

    q = producer()

    def consume(q):
        for item in q:
            success = handle(item)

But then ended up here:

    def consume(q):
        success = None
        while True:
            try:
                item = q.send(success)
            except StopIteration:
                break
            success = handle(item)

But this always feels clunky; almost "out of order" - I've normally got some logging and other bits in there. I'm just trying to iterate over the queue; the StopIteration just feels like noise. It seems I always start out with a simple iteration and then end up refactoring in the send call once I realize I need to add something extra into my producer. I want to just say:

    def consume(q):
        for item in q:
            success = handle(item)
            q.send(success)  # oh how I wish I could just add this here!

But I can just "tack on" the send because it consumes too, so I have to rephrase the "simple" iteration into a "while True try except StopIteration break" - yuk.

I normally just end up adding a little helper function, which I've named "sending", and use it like this:

    def consume(q):
        success = None
        for item in sending(q, lambda: success):
            success = handle(item)

^ much closer to the original iterative code! Here's the implementation if you didn't already guess:

    # this function is missing from stdlib for some reason...
    def sending(g, sender):
        """
        Iterate over g with send

        :param g: the iterable
        :param sender: the callable returning the value to send
        """
        yield g.next()
        while True:
            yield g.send(sender())

This will raise StopIteration as soon as g is exhausted, but since the caller is already consuming "sending" with a for loop it all works ok in the end.
Does anyone else have a different way of iterating over a generator while sending values? Does everyone just already have a similar function stuck in their bag? Does this happen to other folks frequently enough that a generalizable solution could be added to the stdlib? Maybe it's too simple to be worth anyone's time, or I'm missing something that already works better...

Thanks!

-clayg
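Worth noting for readers on current Python: the helper above is Python 2 (`g.next()`), and under PEP 479 (Python 3.7+) a StopIteration that escapes from `g.send()` inside a generator is converted into a RuntimeError, so a direct translation must catch it explicitly. A minimal Python 3 sketch, with a toy producer added purely for illustration:

```python
def sending(g, sender):
    """Iterate over generator g, sending sender() before each next item.

    Python 3 version of the helper above: next(g) instead of g.next(),
    plus an explicit except clause because PEP 479 (Python 3.7+) turns
    a StopIteration escaping a generator into a RuntimeError.
    """
    try:
        yield next(g)
        while True:
            yield g.send(sender())
    except StopIteration:
        return

def producer(log):
    # toy producer: yields 0..2 and records the feedback it receives
    for i in range(3):
        feedback = yield i
        log.append(feedback)

log = []
success = None
for item in sending(producer(log), lambda: success):
    success = (item % 2 == 0)  # stand-in for handle(item)

print(log)  # → [True, False, True]
```

The `lambda: success` closure is what lets the helper read the consumer's most recent result without restructuring the loop.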

On 3/27/2014 4:16 PM, Clay Gerrard wrote:
Hi,
I often use generators when I want to bake some smarts into my producers in a typical producer and consumer pattern. Twice in the past year or so on different projects after a few iterations on my producer/consumer code ("iterations", ha!) I would eventually find I wanted my consumer to send some simple feedback to the producer - when using generators send works great for this!
I normally start here:
q = producer()
    def consume(q):
        for item in q:
            success = handle(item)
But then ended up here:
    def consume(q):
        success = None
        while True:
            try:
                item = q.send(success)
            except StopIteration:
                break
            success = handle(item)
But this always feels clunky; almost "out of order" - I've normally got some logging and other bits in there. I'm just trying to iterate over the queue, the StopIteration just feels like noise.
For Python, the normal mode of iteration is 'pull' mode -- the consumer pulls from the producer. I believe the send method was mostly designed for inversion of control with 'push' mode -- the producer pushes to the consumer with .send. The consumer might yield feedback to the producer. Sending feedback in pull mode is a secondary use.

The iterator protocol was designed for for statements: the initial call to iter(), multiple calls to next(), and catching of StopIteration are all handled internally. Any time you want to do anything non-standard, other than stop before the iterator is exhausted, you have to deal with iter, next, and StopIteration yourself. StopIteration is an essential part of the protocol, not noise. An example:

    def min_f(iterable):
        it = iter(iterable)
        try:
            val = next(it)
        except StopIteration:
            raise ValueError("an empty iterable has no minimum")
        for ob in it:
            if ob < val:
                val = ob
        return val
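The 'push' mode Terry describes can be sketched as a coroutine consumer: the driver pushes items in with .send() and reads feedback out of the same call. A toy example (the even/odd "success" feedback and the names are purely illustrative):

```python
def consumer(log):
    """Push-mode consumer: items arrive via .send(), feedback goes out via yield."""
    feedback = None
    while True:
        item = yield feedback        # receive the next item, hand back feedback
        feedback = (item % 2 == 0)   # pretend this means "handled successfully"
        log.append(item)

log = []
c = consumer(log)
next(c)                              # prime the coroutine to its first yield
feedback = [c.send(i) for i in range(3)]
print(feedback)                      # → [True, False, True]
```

Here the roles are inverted relative to Clay's code: .send() both delivers the item and returns the consumer's feedback, with no StopIteration handling needed by the driver.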
It seems I always started out with a simple iteration and then end up refactoring in the send function once I realize I need to add something extra into my producer:
I want to just say:
    def consume(q):
        for item in q:
            success = handle(item)
            q.send(success)  # oh how I wish I could just add this here!
But I can just "tack on" the send because it consumes too,
I think you meant "can't", but the problem with .send is that it returns the next item prematurely in addition to receiving something. That is for the design reason given above. You could just 'tack on' if you wrote an iterator class with .__iter__, .__next__, and .send methods, where .send only received (and returned an ignorable None).
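A sketch of the class Terry describes, under the assumption that .send() only stores the feedback and returns an ignorable None, so the wished-for "tack it on" loop works unchanged (the class and producer names here are illustrative, not from the thread):

```python
class SendableIterator:
    """Wraps a generator; .send() merely records feedback and returns None."""
    def __init__(self, gen):
        self._gen = gen
        self._feedback = None

    def __iter__(self):
        return self

    def __next__(self):
        # forward any stored feedback to the underlying generator
        feedback, self._feedback = self._feedback, None
        if feedback is None:
            return next(self._gen)
        return self._gen.send(feedback)

    def send(self, value):
        self._feedback = value   # receive only; the next item comes from __next__
        return None

def producer(log):
    # toy producer that records whatever feedback it is sent
    for i in range(3):
        log.append((yield i))

log = []
q = SendableIterator(producer(log))
for item in q:
    q.send(item * 10)   # "tacked on" inside a plain for loop

print(log)  # → [0, 10, 20]
```

The trade-off is one item of latency: feedback sent during iteration N reaches the producer when item N+1 is pulled.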
so I have to rephrase the "simple" iteration into a "while True try except StopIteration break" - yuk.
Catching StopIteration is part of explicit use of the iterator protocol, as in min_f above. In this case, one could make everything implicit again by using functools.reduce, which in its two-parameter form abstracts away the code pattern used in min_f.

    from functools import reduce

    def min_r(iterable):
        return reduce(lambda v, o: v if v <= o else o, iterable)

    s = [5, 6, 4, 6, 3]
    print(min_f(s), min_r(s))

3 3
I normally just end up adding a little helper function, which I've named "sending" and use it like this:
    def consume(q):
        success = None
        for item in sending(q, lambda: success):
            success = handle(item)
# this function is missing from stdlib for some reason...
Because it is short, special purpose, and pretty easy to write.
    def sending(g, sender):
        """
        Iterate over g with send

        :param g: the iterable
        :param sender: the callable returning the value to send
        """
        yield g.next()
        while True:
            yield g.send(sender())
Writing an iterator or generator that wraps an iterator or, as in this case, specifically a generator, is common. Some common patterns that work for iterators as input are included in itertools. More are given in the recipes section of the doc. The criterion for being in the module is that the abstraction be either common or useful in composition with others in the module. A specialized generator-input-only iterator would not fit. It is possible that a 'gentools' module would be useful, but it should start on PyPI.
This will raise StopIteration as soon as g is exhausted,
as per the protocol
but since the caller is already consuming "sending" with a for loop it all works ok in the end.
It does not matter whether the caller uses your generator in a for loop or an explicit iter-while-next-except-StopIteration construct. -- Terry Jan Reedy

Clay Gerrard wrote:
    def consume(q):
        success = None
        while True:
            try:
                item = q.send(success)
            except StopIteration:
                break
            success = handle(item)
My first thought would be to ask whether you really need to do things this way round. Instead of the consumer pulling things from the producer, can you have the producer push them to the consumer? Then the response is just a function return value.

Assuming you do need to pull, my solution would be something like this (not tested):

    class Feedbackerator:
        def __init__(self, base):
            self.base = base
            self.response = None

        def __next__(self):
            return self.base.send(self.response)

    def consume(q):
        f = Feedbackerator(q)
        for item in f:
            f.response = handle(item)

-- Greg

On 28 March 2014 09:17, Greg Ewing <greg.ewing@canterbury.ac.nz> wrote:
Clay Gerrard wrote:
    def consume(q):
        success = None
        while True:
            try:
                item = q.send(success)
            except StopIteration:
                break
            success = handle(item)
Assuming you do need to pull, my solution would be something like this (not tested):

    class Feedbackerator:
        def __init__(self, base):
            self.base = base
            self.response = None

        def __next__(self):
            return self.base.send(self.response)

    def consume(q):
        f = Feedbackerator(q)
        for item in f:
            f.response = handle(item)
Ooo, nice. Minor tweaks to make it work in py2, and not require you to set the response every iteration:

    class Feedbackerator(object):
        def __init__(self, iterator):
            self.iter = iter(iterator)
            self.resp = None

        def next(self):
            r, self.resp = self.resp, None
            return self.iter.send(r)

        def __iter__(self):
            return self

    def consume(q):
        qi = Feedbackerator(q)
        for item in qi:
            res = handle(item)
            if needs_feedback(res):
                qi.resp = feedback_for(res)

Cheers,
aj
--
Anthony Towns <aj@erisian.com.au>
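For completeness, aj's tweak translates directly to Python 3: rename next to __next__. Since __next__ is a plain method rather than a generator, the StopIteration escaping from .send() still ends the for loop cleanly even under PEP 479. A sketch, with a toy producer added for demonstration:

```python
class Feedbackerator:
    def __init__(self, iterator):
        self.iter = iter(iterator)
        self.resp = None

    def __next__(self):
        # hand any stored response to the generator, clearing it first
        r, self.resp = self.resp, None
        return self.iter.send(r)   # send(None) on a fresh generator == next()

    def __iter__(self):
        return self

def producer(log):
    # toy producer: records only non-None feedback it receives
    for i in range(3):
        fb = yield i
        if fb is not None:
            log.append(fb)

log = []
qi = Feedbackerator(producer(log))
for item in qi:
    if item == 1:
        qi.resp = "saw one"

print(log)  # → ['saw one']
```

As aj notes, clearing resp each time means feedback is only sent for the iterations that set it, which keeps the common no-feedback path identical to plain iteration.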
participants (4)
- Anthony Towns
- Clay Gerrard
- Greg Ewing
- Terry Reedy