[Tutor] Re: memory problem (II part)
Guillermo Fernandez Castellanos
guillermo.fernandez at epfl.ch
Tue Jan 20 08:50:06 EST 2004
Hi,
hcohen2 wrote:
> Guille,
> I am perplexed, what product are you using and how heavily are you
> pounding on it?
I'm using SQLite. I have a database with a table of about 150,000 rows
(and another of 300,000), where the table has 18 columns, some holding
only a date, but another a full log entry. The table will be simplified
soon, but there will still be a lot of information to query.
>I see that the pysqlite code is using weak references to keep track of all
>the cursors associated with a connection:
>
> http://www.python.org/doc/lib/module-weakref.html
>
>so the cursors should get reclaimed eventually when the garbage collector
>kicks in.
>
>
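To make the weak-reference bookkeeping above concrete, here is a minimal sketch of the idea (the `Connection` and `Cursor` classes below are hypothetical stand-ins, not pysqlite's actual classes): the connection holds only weak references, so a cursor becomes collectable as soon as the caller drops its last strong reference.

```python
import gc
import weakref


class Cursor:
    """Hypothetical stand-in for a database cursor."""
    pass


class Connection:
    """Tracks its cursors with weak references, as pysqlite does."""

    def __init__(self):
        self._cursor_refs = []

    def cursor(self):
        cur = Cursor()
        # Keep only a weak reference: this does not prevent the
        # cursor from being garbage-collected.
        self._cursor_refs.append(weakref.ref(cur))
        return cur

    def live_cursors(self):
        # Dead weak references dereference to None.
        return [r() for r in self._cursor_refs if r() is not None]


conn = Connection()
cur = conn.cursor()
print(len(conn.live_cursors()))  # the cursor is still alive

del cur          # drop the only strong reference
gc.collect()     # make sure the collector has actually run
print(len(conn.live_cursors()))  # the weak reference is now dead
```

The point of the pattern is that the connection can still enumerate its open cursors (e.g. to close them all) without keeping every cursor, and its fetched results, alive forever.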
Maybe there's a way of forcing the garbage collector to do its job...
I tried calling gc.collect() in the postcmd method of the cmd package,
but it does not seem to help much.
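For the record, this is roughly what that attempt looks like; `QueryShell` and its commands are hypothetical, but `cmd.Cmd.postcmd()` really is called after every command, so it is a reasonable hook for a forced collection:

```python
import cmd
import gc


class QueryShell(cmd.Cmd):
    """Hypothetical interactive shell that forces a garbage
    collection after every command via the postcmd() hook."""

    prompt = "query> "

    def do_hello(self, arg):
        """Placeholder command standing in for a real query."""
        print("hello", arg)

    def do_quit(self, arg):
        """Returning True from a do_* method stops cmdloop()."""
        return True

    def postcmd(self, stop, line):
        # Runs after each command; gc.collect() returns the number
        # of unreachable objects it found.
        gc.collect()
        return stop  # must pass the stop flag through unchanged
```

One caveat: gc.collect() only reclaims objects that are actually unreachable. If the cursors (or the C extension underneath) still hold strong references to the fetched rows, forcing a collection cannot free them, which would match the behaviour described above.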
>Instead of closing the connections, what happens if you call each cursor's
>close() method after you do a fetch? Doing a close() on the connection
>may be working because it calls close() on all open cursors. Let's verify
>that it's the cursors that are holding onto database results in memory.
>
I also tried that, and close each cursor right after I open it (I mean...
after the query ;-) with no results either. What does seem to free memory
is closing the database.
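For anyone reading along, the close-the-cursor-after-each-fetch pattern looks like this. The sketch uses the standard-library sqlite3 module and a small in-memory table rather than the pysqlite release and the real 150,000-row table from this thread:

```python
import sqlite3

# Small in-memory database standing in for the real log table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE log (id INTEGER, msg TEXT)")
conn.executemany("INSERT INTO log VALUES (?, ?)",
                 [(i, "entry %d" % i) for i in range(1000)])


def fetch_all_rows(connection, query):
    """Open a cursor, fetch everything, and close the cursor
    immediately so it cannot keep the result set alive."""
    cur = connection.cursor()
    try:
        cur.execute(query)
        return cur.fetchall()
    finally:
        cur.close()  # closed even if execute() raises


rows = fetch_all_rows(conn, "SELECT * FROM log")
print(len(rows))

# In the situation described above, only this actually freed memory:
conn.close()
```

If closing cursors makes no difference but closing the connection does, that suggests the connection object (or the underlying C library) is what holds the memory, not the individual cursors.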
Funny thing as well... memory is managed differently on Windows and
Linux. On Linux, once we make a big request, we can make further big
requests and the amount of RAM used will never go beyond that point
(i.e. around 250 MB above the system's normal use). RAM use on Windows
is different though, as it seems to be incremental: each time I start a
new request, the equivalent amount of RAM is used again. That means it
starts at 250 MB, then 500, then... without reaching a maximum.
Thanks for your help,
Guille