Speed up loading and free memory...

benjamin schollnick bscholln at mac.com
Thu Mar 11 07:25:43 EST 2004


> I need advice on this: all my files are text files; one is an index
> (6 MB) into another big data file (20 MB). I cannot put them in a
> database and must work from the raw files.
> 
> I tried to design an OO interface to these files; it took 10 minutes
> just to load and build the wrapper objects (really simple ones, without
> the full logic behind them), and it took 446 MB of memory (checked with
> the 'top' command while loading). If I use tuples instead of objects it
> still needs about 2 minutes and 240 MB.
> 
> Since I do not need all the data in memory at once, only the index and
> then access to some of the data, do you have any clue on how I can
> handle this? Loading only the index takes less than 1 minute and only
> 60 MB of memory, which is acceptable, but if I read the data directly
> from disk it is desperately slow.

You're probably not going to like this...

But there are only three ways to deal with something like this:

1) Read everything into memory
2) Read only as much as needed, and load the rest from disk as needed.
3) Store the data in a proper database
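Option 2 is usually the sweet spot for your sizes.  As a rough sketch
(the index format here is an assumption on my part: one line per record,
"key, byte offset, record length" separated by tabs), you keep only a
small dict in memory and seek into the data file on demand:

```python
def load_index(index_path):
    """Read only the small index into memory: key -> (offset, length).

    Assumes a hypothetical index file whose lines look like
    "key<TAB>byte_offset<TAB>length" pointing into the data file.
    """
    index = {}
    with open(index_path) as f:
        for line in f:
            key, offset, length = line.rstrip("\n").split("\t")
            index[key] = (int(offset), int(length))
    return index

def fetch_record(data_path, index, key):
    """Seek straight to one record on disk instead of loading everything."""
    offset, length = index[key]
    with open(data_path, "rb") as f:
        f.seek(offset)
        return f.read(length)
```

A single seek-and-read per lookup is cheap; the slowness you saw reading
"directly from disk" usually comes from scanning the file, not from
seeking to a known offset.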

Now, I've seen this solved before; I admit it was with a different
language, and it feels like nearly a lifetime ago...

But your problem is not unsolvable.  It would help, though, if we had
more details on the index file and the main data file.

Maybe a different method of handling this would solve your issue.

For example, does the data all have to be stored in a single file?
Would using multiple indexes, and multiple storage files, improve your
performance?
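To make that concrete (purely a hypothetical scheme, since I don't know
your file layout): you could partition records into N index/data file
pairs by a stable hash of the key, so each per-bucket index is small
enough to load quickly on its own:

```python
import zlib

NUM_BUCKETS = 16  # an assumed, tunable bucket count

def bucket_for(key, num_buckets=NUM_BUCKETS):
    """Pick which index/data file pair a key lives in.

    zlib.crc32 gives a hash that is stable across runs and machines,
    unlike Python's built-in hash() for strings.
    """
    return zlib.crc32(key.encode("utf-8")) % num_buckets

def bucket_paths(key, base="data"):
    """Derive the per-bucket file names (the naming scheme is made up)."""
    b = bucket_for(key)
    return ("%s.%02d.idx" % (base, b), "%s.%02d.dat" % (base, b))
```

With 16 buckets, each index would average well under 1 MB, so a lookup
only ever loads a small slice of the whole.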

But those are questions I cannot answer...

      - Benjamin
