reading file objects in chunks
martin at marcher.name
Mon Nov 12 17:47:29 CET 2007
I'm looking for something that will give me an iterator over a
file-(like) object. I have large files that consist of only a single
line, made up of fixed-length records: the record length is 26 bytes,
with dataA being the first 10 bytes and dataB the remaining 16 bytes.
I've written my parsing code, but I can't find anything that will let
me read those files efficiently (I guess I'm just overcomplicating
this). I'd like to have something like:
f = file("datafile.dat", buffering=26)
for chunk in f.read_in_chunks():
    # ... handle one 26-byte record ...
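
(read_in_chunks() isn't a real file method, of course -- it's just the
interface I'm imagining. As a plain generator, a sketch of it might be:)

def read_in_chunks(f, chunksize=26):
    # made-up helper, not part of the file API
    while True:
        chunk = f.read(chunksize)
        if len(chunk) < chunksize:
            break  # EOF, or a trailing partial record
        yield chunk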
Iterating the file object itself (iter(f)) looked promising at first,
but somehow it doesn't do "the right thing"(tm): file iteration yields
lines, and my whole file is a single line. itertools doesn't quite seem
to be what I want either. Maybe I just need coffee, but right now I'm
in the dark.
I'd really like something nicer than:

chunksize = 26
f = file("datafile.dat", buffering=chunksize)
chunk = f.read(chunksize)
while len(chunk) == chunksize:
    # ... handle one record ...
    chunk = f.read(chunksize)

I just don't feel comfortable with it, for some reason I can't explain...
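
(The closest built-in spelling I can see is the two-argument form of
iter() with a sentinel. A sketch -- though I'm not sure it's actually
nicer, and unlike the while loop it will also yield a trailing short
record:)

from functools import partial

chunksize = 26
f = open("datafile.dat", "rb")
# iter(callable, sentinel) keeps calling f.read(chunksize)
# until it returns the sentinel '' at EOF
for chunk in iter(partial(f.read, chunksize), ''):
    # a short trailing record would still need its own length check
    print(repr(chunk))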