[Python-Dev] Re: [Python-checkins] CVS: python/dist/src/Modules gcmodule.c,2.9,2.10
Fri, 1 Sep 2000 09:48:21 -0600
On Fri, Sep 01, 2000 at 10:24:46AM -0400, Jeremy Hylton wrote:
> Even people who do have problems with cyclic garbage don't necessarily
> need a collection every 100 allocations. (Is my understanding of what
> the threshold measures correct?)
It collects after every net threshold0 container allocations, i.e.
allocations minus deallocations. If you create and delete 1000 container
objects in a loop then no collection would occur.
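A quick sketch of that counter, assuming today's gc module interface
(the mail predates it, so the API names here are an assumption):

```python
import gc

gc.collect()                      # start from a clean slate
defaults = gc.get_threshold()     # (threshold0, threshold1, threshold2)
gc.set_threshold(100)             # collect after ~100 net container allocations

before = gc.get_count()[0]
objs = [{} for _ in range(10)]    # ten new container objects, all kept alive
after = gc.get_count()[0]
grew = after - before             # counter moved by the *net* allocations
del objs                          # deallocating decrements it again

gc.set_threshold(*defaults)       # restore the defaults
```

The create-and-delete loop above never accumulates net allocations, which
is why it never triggers a collection.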
> But the difference in total memory consumption with the threshold at
> 100 vs. 1000 vs. 5000 is not all that noticable, a few MB.
The last time I did benchmarks with PyBench and pystone I found that the
difference between threshold0 = 100 and threshold0 = 0 (i.e. infinity)
was small. Remember that the collector only counts container objects.
Creating a thousand dicts with lots of non-container objects inside them
could easily cause an out-of-memory situation, since only the thousand
dicts count toward the threshold.
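To illustrate the container-only counting (again assuming the modern gc
module as a stand-in for the collector described here):

```python
import gc

gc.collect()
start = gc.get_count()[0]
# One dict holding a thousand large strings: ints and strings are not
# container objects, so they never touch the collector's counter.
d = {i: "x" * 10_000 for i in range(1000)}
tracked_growth = gc.get_count()[0] - start
# tracked_growth is tiny -- roughly just the dict itself -- even though
# the strings dominate the actual memory consumption.
```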
Because of the generational collection, usually only about threshold0
objects are examined during a collection. Thus, setting threshold0 low
has the effect of quickly moving objects into the older generations.
Collection is quick because only a few objects are examined.
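The promotion behaviour can be seen directly, assuming a recent gc
module where gc.collect(generation) and gc.get_objects(generation=...)
exist (they are an assumption relative to this mail):

```python
import gc

gc.collect()
survivor = [{}]                  # a container that stays reachable
gc.collect(0)                    # a generation-0 pass promotes survivors
in_gen0 = any(o is survivor for o in gc.get_objects(generation=0))
# survivor now lives in an older generation, so subsequent
# generation-0 passes no longer examine it at all.
```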
A portable way to find the total allocated memory would be nice.
Perhaps Vladimir's malloc will help us here. Alternatively we could
modify PyCore_MALLOC to keep track of it in a global variable. I think
collecting based on an increase in the total allocated memory would work
better. What do you think?
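A hypothetical sketch of that heuristic, using sys.getallocatedblocks()
(a CPython-specific counter of live allocator blocks) in place of the
proposed PyCore_MALLOC bookkeeping -- the class and names here are
purely illustrative, not an actual or proposed API:

```python
import gc
import sys

class MemoryTriggeredGC:
    """Collect only when total live allocations grow past a baseline."""

    def __init__(self, growth_factor=1.25):
        self.growth_factor = growth_factor
        self.baseline = sys.getallocatedblocks()

    def maybe_collect(self):
        now = sys.getallocatedblocks()
        if now > self.baseline * self.growth_factor:
            gc.collect()
            # re-measure after the collection to set the new baseline
            self.baseline = sys.getallocatedblocks()
            return True
        return False
```

Unlike a pure allocation count, this would notice the thousand-dicts
case above, since the untracked strings still show up as allocated
memory.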
More benchmarks should be done too. Your compiler would probably be a
good candidate. I won't have time today but maybe tonight.