On Thu, 2004-10-28 at 04:09, Gregory P. Smith wrote:
> That said, here's a workaround for avoiding permanent huge memory consumption in known workloads:
>
> fork() before doing the part that consumes a ton of memory. Afterwards, return the results, post huge memory consumption, via pipe to the waiting parent process and exit the child so the parent can continue on not consuming 700 MB.
Right: this is certainly an effective workaround, but only for very specific, memory-consuming tasks. Python should be better than to inflict this kind of hack on programmers; we shouldn't have to worry about an implementation detail of the interpreter. I don't believe Jython has this problem, but it has been a while since I looked at it, so I could be wrong.
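For anyone who does need the workaround in the meantime, here is a minimal sketch of the fork-and-pipe pattern Gregory describes. The helper name `run_in_child` and the use of pickle for the return value are my own choices, not anything from his message, and this is POSIX-only since it relies on os.fork():

```python
import os
import pickle

def run_in_child(func, *args):
    """Run func(*args) in a forked child so any huge allocations it
    makes die with the child instead of staying resident in the parent."""
    read_fd, write_fd = os.pipe()
    pid = os.fork()
    if pid == 0:
        # Child: do the memory-hungry work, ship the result back, exit.
        os.close(read_fd)
        result = func(*args)
        with os.fdopen(write_fd, "wb") as pipe:
            pickle.dump(result, pipe)
        os._exit(0)  # skip normal interpreter shutdown in the child
    # Parent: close our copy of the write end, read the pickled result.
    os.close(write_fd)
    with os.fdopen(read_fd, "rb") as pipe:
        result = pickle.load(pipe)
    os.waitpid(pid, 0)  # reap the child
    return result

# Example: build a big intermediate structure in the child; only the
# small summary value crosses the pipe back to the parent.
total = run_in_child(lambda: sum(range(10_000_000)))
```

The parent's memory footprint never grows, because the list-building (or whatever the real workload is) happens entirely in the child's address space.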
I'm still planning on working on this issue: I just need to find the time.