Upper memory limit

Kragen Sitaker kragen at pobox.com
Wed May 15 18:09:14 EDT 2002


Siegfried Gonzi <siegfried.gonzi at kfunigraz.ac.at> writes:
> I have a calculation (an external C function included with the help
> of SWIG; an external Fortran function included with the help of F2PY;
> some preprocessed Lisp files; a dozen binary satellite data files).
> One run of the code takes about 1 hour. The execution time is not the
> problem! Memory is not a problem either, but: after 1 hour it has
> consumed 200 MB of RAM. If I start the calculation in the DOS shell I
> can see that Python gives memory back to the OS (I got the advice to
> start the calculation from the DOS line), but this is only the case
> for 2 runs. If I start the calculation a third time I see Python
> wreak havoc.

I have had a somewhat similar problem.  If I create a dict containing
lots of small Numeric arrays, write the small arrays to a file, delete
the dict, and load the file as a large Numeric array, I get a
MemoryError.  If I exit and restart Python between deleting the dict
and loading the file, it works fine.
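
For concreteness, here is a rough sketch of the failing pattern (the
sizes, filename, and typecode are invented for illustration; it
assumes Numeric's zeros/tostring/fromstring, and my real code
differs):

    import Numeric

    # Build a dict of many small Numeric arrays (~80 MB total here).
    small = {}
    for i in range(100000):
        small[i] = Numeric.zeros((100,), 'd')

    # Spill them to a file in order.
    f = open('arrays.dat', 'wb')
    for i in range(100000):
        f.write(small[i].tostring())
    f.close()

    del small  # every small array's refcount drops to zero here

    # Read the whole file back as one big contiguous array.  This is
    # where I get the MemoryError -- unless I restart Python first.
    data = open('arrays.dat', 'rb').read()
    big = Numeric.fromstring(data, 'd')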

My hypothesis was that Python's allocator (well, somebody's
allocator!) was leaving free memory fragmented, so that no large
contiguous block was available after the creation and deletion of the
large dict.  That could be because little bits of stuff allocated
while the dict was being built were left behind (though I couldn't
figure out what they'd be), or because adjacent free blocks weren't
being coalesced.
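
If that hypothesis is right, you ought to be able to provoke the same
thing without Numeric at all.  Something like this sketch (pure
illustration; the numbers are made up and matter only on a
memory-constrained machine, e.g. a 32-bit address space) leaves the
heap full of small holes and then asks for one big block:

    # Carve up ~200 MB into small blocks, then free every other one.
    # The freed space totals ~100 MB, but it's scattered in 1 KB holes,
    # and the surviving blocks pin those holes apart.
    blocks = [' ' * 1000 for i in range(200000)]
    for i in range(0, len(blocks), 2):
        blocks[i] = None

    # One contiguous 50 MB request.  Plenty of memory is free, but if
    # no single hole is big enough, this raises MemoryError on a
    # fragmented heap.
    big = ' ' * (50 * 1024 * 1024)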

> b) Is there, semantically speaking, a difference between automatic
> memory management and garbage collection?

No.

> c) Does Python's garbage collector simply collect and dispose of all
> the memory chunks at the end of the calculation (and hence only
> once), or does Python collect garbage and free old memory as it goes?

When an object's reference count drops to zero, Python deallocates it
immediately, using (normally) the system allocator; it doesn't wait
until the end of the calculation.
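
You can watch the immediacy of this with a toy __del__ example
(nothing from your code; just an illustration):

    import sys

    class Noisy:
        def __del__(self):
            # Runs the instant the last reference goes away -- Python
            # doesn't wait until the end of the program.
            print('freed!')

    obj = Noisy()
    print(sys.getrefcount(obj))  # 2: 'obj' plus getrefcount's argument
    alias = obj
    del obj        # still alive; 'alias' keeps the refcount above zero
    del alias      # refcount hits zero: prints 'freed!' right here
    print('done')  # the object was already gone before this line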
