[CentralOH] Generator and DRY
Jeff Rush
jeff at taupro.com
Wed Jul 23 02:26:26 CEST 2008
Mark Erbaugh wrote:
> Here is part of a method I wrote that provides a generator:
>
> def read(self, id, size=None):
>     i = gen_data.next()
>     try:
>         while True:
>             while len(buffer) >= read_size:
>                 yield(''.join(buffer[:read_size]))
>                 del buffer[:read_size]
>             i = gen_data.next()
>     except StopIteration:
>         pass
>     while len(buffer) >= read_size:
>         yield(''.join(buffer[:read_size]))
>         del buffer[:read_size]
>     if len(buffer):
>         yield(''.join(buffer))
>
>
> The purpose of this code is to take data from another generator
> (gen_data) and deliver it in chunks of the specified size (except the
> last). Code not shown handles possible decompression (zlib) of the
> data.
>
> However, since the embedded function contains a yield() call, it now
> becomes a generator itself, and the re-factored code doesn't work
> properly.
>
> Is there a way to create an embedded function where the yield() call is
> still tied to the parent function/method?
I don't have a quick answer to your interesting problem of controlling the
"yield" space of nested functions (I'll research it), but in case it is useful
in some obscure way, here is a simple expression for chunking any file-like
source:
for chunk in iter(lambda: f.read(block_size), ''):
    ...
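To make that concrete, here is a minimal self-contained sketch of the two-argument
form of iter(callable, sentinel); the file object and block size are made up for
the demo:

```python
import io

# iter(callable, sentinel) calls the callable repeatedly and stops as
# soon as it returns the sentinel value ('' here, i.e. end of file).
f = io.StringIO('abcdefghij')   # stand-in for any file-like object
block_size = 4
chunks = [chunk for chunk in iter(lambda: f.read(block_size), '')]
print(chunks)  # -> ['abcd', 'efgh', 'ij']
```

Note the last chunk is simply whatever is left, which matches the "except the
last" behavior you described.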
It doesn't handle your decompression or careful use of memory for huge chunks,
but perhaps refactoring your code, say around gen_data.next(), into something
more like gen_data.read(chunk_size) and layering it with iter() might shed
light on another approach. Just a passing thought... I think iter() is cool.
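On the nested-yield question itself: a generator cannot yield on behalf of an
enclosing function, but the standard workaround is for the outer generator to
loop over the inner one and re-yield each item (Python 3.3's PEP 380 later
added "yield from" for exactly this). A minimal sketch, with hypothetical
names standing in for your method:

```python
def chunker(source, chunk_size):
    # Hypothetical inner helper: buffers pieces from `source` and yields
    # fixed-size chunks, flushing any remainder at the end.
    buf = []
    for piece in source:
        buf.extend(piece)
        while len(buf) >= chunk_size:
            yield ''.join(buf[:chunk_size])
            del buf[:chunk_size]
    if buf:
        yield ''.join(buf)

def read(source, chunk_size):
    # The outer function stays a generator by explicitly re-yielding
    # everything the helper produces, so callers see one generator.
    for chunk in chunker(source, chunk_size):
        yield chunk

print(list(read(iter(['abc', 'defg', 'h']), 3)))  # -> ['abc', 'def', 'gh']
```

The explicit re-yield loop is a bit noisy, but it keeps the chunking logic in
one factored-out place while the public method remains the generator.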
-Jeff