[pypy-dev] Pypy garbage collection

Bengt Richter bokr at oz.net
Tue Mar 18 11:41:07 CET 2014

PMJI, but I wonder if some of these objects could come from
trivial re-instantiations instead of re-use of mutable
objects, e.g., fishing out one attribute and combining it
with a new value as init values for an (unnecessarily) new object.

     obj = ObjClass(obj.someattr, chgval)

when

     obj.chg = chgval

would have done the job without creating garbage. I suspect this
pattern can happen more subtly than the above, especially if __new__
is defined to do something tricky with old instances.
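To make the contrast concrete, here is a minimal sketch of the two
patterns (ObjClass and its attributes are hypothetical, invented for
illustration):

```python
class ObjClass:
    def __init__(self, someattr, chg):
        self.someattr = someattr
        self.chg = chg

obj = ObjClass("keep", 0)

# Wasteful: each update allocates a brand-new ObjClass instance,
# turning the previous one into garbage for the GC to collect.
for chgval in range(3):
    obj = ObjClass(obj.someattr, chgval)

# Garbage-free: mutate the attribute on the existing instance.
for chgval in range(3):
    obj.chg = chgval

assert obj.someattr == "keep" and obj.chg == 2
```

The second loop produces no garbage at all, while the first produces
one dead object per iteration.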

Also, creating a new object can be a tempting way to feel sure about
its complete state, without having to write a custom (re)init method.
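A custom (re)init method removes that temptation by making the
"known complete state" explicit without allocating; a hypothetical
sketch:

```python
class Node:
    def __init__(self, value=0, visited=False):
        self.reinit(value, visited)

    def reinit(self, value=0, visited=False):
        # Reset *every* attribute, leaving the instance in the same
        # known complete state a fresh __init__ would produce.
        self.value = value
        self.visited = visited
        return self

n = Node(42, True)
n.reinit()  # reuse the same object instead of allocating Node()
assert (n.value, n.visited) == (0, False)
```

Having __init__ delegate to reinit guarantees the two can never
drift apart, so reusing an instance is exactly as safe as creating
a new one.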
On 03/18/2014 10:37 AM Martin Koch wrote:
> Thanks, Carl.
> This bit of code certainly exhibits the surprising property that some runs
> unpredictably stall for a very long time. Further, it seems that this stall
> time can be made arbitrarily large by increasing the number of nodes
> generated (== more data in the old generation == more stuff to traverse if
> lots of garbage is generated and survives the young generation?). As a user
> of an incremental garbage collector, I would expect that there are pauses
> due to GC, but that these are predictable and small.
> I tried running
> PYPY_GC_NURSERY=2000M pypy ./mem.py 10000000
> but that seemed to have no effect.
> I'm looking forward to the results of the Software Transactional Memory,
> btw :)
> /Martin
> On Tue, Mar 18, 2014 at 9:47 AM, Carl Friedrich Bolz<cfbolz at gmx.de>  wrote:
>> On 17/03/14 20:04, Martin Koch wrote:
>>> Well, it would appear that we have the problem because we're generating
>>> a lot of garbage in the young generation, just like we're doing in the
>>> example we've been studying here.
>> No, I think it's because you're generating a lot of garbage in the *old*
>> generation. Meaning objects which survive one minor collection but then
>> die.
>>> I'm unsure how we can avoid that in
>>> our real implementation. Can we force gc of the young generation? Either
>>> by gc.collect() or implicitly somehow (does the gc e.g. kick in across
>>> function calls?).
>> That would make matters worse, because increasing the frequency of
>> minor collects means *more* objects get moved to the old generation
>> (where they cause problems). So indeed, maybe in your case making the
>> new generation bigger might help. This can be done using
>> PYPY_GC_NURSERY, I think (nursery is the space reserved for young
>> objects). The risk is that minor collections become unreasonably slow.
>> Anyway, if the example code you gave us also shows the problem I think
>> we should eventually look into it. It's not really fair to say "but
>> you're allocating too much!" to explain why the GC takes a lot of time.
>> Cheers,
>> Carl Friedrich
>> _______________________________________________
>> pypy-dev mailing list
>> pypy-dev at python.org
>> https://mail.python.org/mailman/listinfo/pypy-dev