After seeing yet another person asking how to do this on #python (and having needed to do it in the past myself), I'm wondering why itertools doesn't have a function to break an iterator up into N-sized chunks.
Existing possible solutions include both the "clever" but somewhat unreadable...
batched_iter = zip(*[iter(input_iter)]*n)
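For example, note that the one-liner yields tuples and silently drops any leftover items that don't fill a complete chunk:

>>> list(zip(*[iter(range(7))]*3))
[(0, 1, 2), (3, 4, 5)]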
...and the long-form...
def batch(input_iter, n):
    input_iter = iter(input_iter)
    while True:
        try:
            # Like the zip() trick, this drops any final partial chunk.
            yield [next(input_iter) for _ in range(n)]
        except StopIteration:
            return
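...which behaves the same way on the tail end, just with lists instead of tuples:

>>> list(batch(range(7), 3))
[[0, 1, 2], [3, 4, 5]]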
There doesn't seem, however, to be one clear "right" way to do this. Every time I come up against this task, I go back to itertools expecting one of the grouping functions there to cover it, but they don't.
It seems like it would be a natural fit for itertools, and it would simplify tasks like processing file formats that use a consistent number of lines per entry.
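For instance, given a hypothetical records.txt where each entry is exactly three lines (name, address, phone - the format and field names here are just made up for illustration), iterating over entries could read as naturally as:

# Hypothetical file: each record is exactly three lines.
with open("records.txt") as f:
    for name, address, phone in batch(f, 3):
        print(name.strip(), phone.strip())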
~Amber