[Tutor] reading very large files

Vikram K kpguy1975 at gmail.com
Tue May 17 19:20:42 CEST 2011


I wish to read a large data file (around 1.8 MB) and manipulate its contents. Even just reading and printing the first 500 lines of this file causes a problem. I wrote:

fin = open('gene-GS00471-DNA_B01_1101_37-ASM.tsv')
count = 0
for i in fin.readlines():
    print i
    count += 1
    if count >= 500:
        break

and got this error msg:

Traceback (most recent call last):
  File
"H:\genome_4_omics_study\GS000003696-DID\GS00471-DNA_B01_1101_37-ASM\GS00471-DNA_B01\ASM\gene-GS00471-DNA_B01_1101_37-ASM.tsv\test.py",
line 3, in <module>
    for i in fin.readlines():
MemoryError

-------
Is there a way to stop Python from slurping the whole file's contents at once?
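
Yes: the `MemoryError` comes from `readlines()`, which builds a list of every line in the file before the loop starts. Iterating over the file object itself (or over `itertools.islice` of it) reads one line at a time, so memory use stays constant no matter how large the file is. A minimal sketch (the function name `head` and the sample filename are illustrative, not from the original post):

```python
from itertools import islice

def head(path, n=500):
    """Return the first n lines of a file without loading it all into memory."""
    with open(path) as fin:
        # The file object is a lazy iterator over lines; islice stops
        # after n lines, so at most one line is held in memory at a time.
        return list(islice(fin, n))
```

Equivalently, the original loop works unchanged if `for i in fin.readlines():` is replaced with `for i in fin:`, since the file object yields lines lazily.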

