reading file objects in chunks
Martin Marcher
martin at marcher.name
Mon Nov 12 11:47:29 EST 2007
Hi,
I'm looking for something that will give me an iterator over a
file(-like) object. I have large files that contain only a single
line, made up of fixed-length records: each record is 26 bytes, with
dataA being the first 10 bytes and dataB the remaining 16 bytes.
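To be concrete, splitting one record into its two fields would look
something like this (the names come from my own parsing code, nothing
official):

import struct

RECORD = struct.Struct("10s16s")  # dataA: 10 bytes, dataB: 16 bytes

def parse_record(record):
    # unpack one 26-byte record into its two fixed-length fields
    dataA, dataB = RECORD.unpack(record)
    return dataA, dataB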
Now I've written my parsing code, but I can't find anything that lets
me read those files efficiently (I guess I'm just overcomplicating
things). I'd like to have something like:
f = file("datafile.dat", buffering=26)
for chunk in f.read_in_chunks():
    compute_data(chunk)
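If nothing like that exists, I suppose I could write the generator
myself. An untested sketch (read_in_chunks is my made-up name, not a
real file method):

def read_in_chunks(f, chunksize=26):
    # yield successive chunksize-byte records from the file object f
    while True:
        chunk = f.read(chunksize)
        if len(chunk) < chunksize:
            break  # EOF, or a short trailing record I'd rather drop
        yield chunk

f = open("datafile.dat", "rb")
for chunk in read_in_chunks(f):
    compute_data(chunk)

but it feels like the stdlib should already offer this somewhere.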
iter(f) looked promising at first, but somehow it doesn't do "the
right thing"(tm), and itertools doesn't quite seem to be what I want
either.
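The closest I've found is the two-argument form of the builtin
iter(), which keeps calling a function until it returns the sentinel.
Untested, and I'm not sure it's considered good style:

f = open("datafile.dat", "rb")
# f.read() returns an empty string at EOF, which ends the loop
for chunk in iter(lambda: f.read(26), ""):
    compute_data(chunk)

It would also still hand me a short trailing chunk if the file size
isn't a multiple of 26 bytes, so it doesn't quite solve my problem.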
Maybe I just need coffee but right now I'm in the dark.
I'd really like something nicer than
chunksize = 26
f = file("datafile.dat", buffering=chunksize)
chunk = f.read(chunksize)
while len(chunk) == chunksize:
    compute_data(chunk)
    chunk = f.read(chunksize)  # reassign, or this loops forever
I just don't feel comfortable with it for some reason I can't explain...
thanks
martin
--
http://noneisyours.marcher.name
http://feeds.feedburner.com/NoneIsYours