Speed up loading and free memory...
otto_kruse at hotmail.com
Thu Mar 11 11:30:04 CET 2004
> I need advice on this: all files are text files. One is an index
> (6 MB) to another big data file (20 MB). I cannot put them in a database and
> must work from the raw files.
> I've tried to design an OO interface to these files; it took 10
> minutes just to load and build all the wrapper objects (really simple
> ones, without the full logic behind them), and it took 446 MB of memory
> (checked with the 'top' command while loading). If I use tuples instead
> of objects, it still needs about 2 minutes and 240 MB...
> Since I do not need all the data in memory at once, but only the index
> and then access to some of the data, do you have any clue on how I can
> handle this? Loading only the index takes less than 1 minute and only
> 60 MB of memory, which is acceptable, but if I read the data directly
> from disk it is desperately slow...
> Thanks a lot,
How do you access and read the .txt files? Could you give the commands?
For example, if you use readlines(), the whole text file gets loaded into
memory; use readline() instead, so that lines are read one by one.
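A minimal sketch of the difference (the file name and contents here are made up for illustration):

```python
# Create a small hypothetical index file just for this demonstration.
with open("index.txt", "w") as f:
    f.write("key1 1000\nkey2 2000\nkey3 3000\n")

# readlines() builds a list of every line at once -- memory grows
# with the size of the file.
with open("index.txt") as f:
    all_lines = f.readlines()

# readline() returns one line per call, so only the current line
# needs to be held in memory.
lines_one_by_one = []
with open("index.txt") as f:
    while True:
        line = f.readline()
        if not line:          # empty string signals end of file
            break
        lines_one_by_one.append(line)

# Both approaches see exactly the same content.
assert all_lines == lines_one_by_one
```

Iterating directly over the file object (`for line in f:`) reads line by line in the same memory-friendly way.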