Something evil happens when large hashes are destroyed

guuge guuge at localhost.localhost
Mon Nov 19 16:23:18 CET 2001

On Sun, 18 Nov 2001 11:37:19 -0600, Skip Montanaro <skip at> wrote:
>     guuge> I was trying to sort about 100,000 items by splitting them into
>     guuge> groups (using a python dictionary type) and then successively
>     guuge> splitting these groups until they were small enough to use a
>     guuge> brute force method.
>     guuge> My test program, which used only two or three keys to create the
>     guuge> first split, worked fine.  When I tried the real thing, using 256
>     guuge> keys, the program slowed to a crawl.  The python interpreter took
>     guuge> forever to destroy the dictionaries.
> I believe this topic has been discussed recently.  When a dictionary is
> deleted, the keys are traversed in their "natural order", that is, as they
> are laid out in the dict.  However, the values stored in the dict are
> scattered all over the place, so lots of small chunks of memory are freed.
> Your underlying malloc library is probably spending lots of time trying to
> coalesce small chunks of freed memory into bigger chunks.
> One workaround seems to be to compile Python with pymalloc enabled.  
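The splitting approach described in the quoted message might look roughly like this. This is a reconstruction, not the original poster's code: the function name, the bucket threshold, and the use of byte strings are all assumptions for illustration.

```python
# Rough sketch of the described approach: distribute items into up to
# 256 buckets keyed on one byte, recurse on each bucket until the
# groups are small, then fall back to a brute-force sort.
def bucket_sort(items, depth=0, threshold=32):
    # Stop recursing once the group is small, or once every item's
    # key has been exhausted (guards against duplicate-heavy input).
    if len(items) <= threshold or depth >= max(map(len, items), default=0):
        return sorted(items)  # brute-force fallback on small groups
    buckets = {}  # dictionary used to split the group, as described
    for item in items:
        # Items shorter than the current depth go in a "-1" bucket,
        # which sorts before any real byte value (0..255).
        key = item[depth] if depth < len(item) else -1
        buckets.setdefault(key, []).append(item)
    result = []
    for key in sorted(buckets):
        result.extend(bucket_sort(buckets[key], depth + 1, threshold))
    return result
```

With many items, each recursion level creates and later destroys one dictionary per group, which is where the slow teardown was being felt.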

Pymalloc did the trick :)
All destroy times are now a few seconds.  Thanks.
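For context: in the Python of that era (2.1/2.2), pymalloc was a compile-time option rather than the default, so enabling it meant rebuilding the interpreter. A sketch of the build, assuming the `--with-pymalloc` configure flag from that period:

```shell
# Rebuild Python with the small-object allocator (pymalloc) enabled.
# (Flag name assumed from the 2.1/2.2 build configuration; pymalloc
# later became the default.)
./configure --with-pymalloc
make
```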

> Another is to use os._exit to exit your program (assuming the storage 
> for your big hashes is getting reclaimed at program exit).
> On the other hand, are you sure you're not just running out of memory?

The machine has 140 megabytes.
The program never used more than about 20 or 30.
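The exit-without-cleanup workaround quoted above relies on `os._exit` (the function lives in the `os` module; `sys.exit`, by contrast, raises `SystemExit` and triggers normal interpreter teardown). A minimal sketch, with the function name invented for illustration:

```python
import os
import sys

def finish_without_cleanup():
    # Illustrative only: build a large dict, then exit without letting
    # the interpreter tear it down object by object.
    big = {}
    for i in range(100000):
        big[i] = str(i)
    sys.stdout.flush()  # os._exit() skips buffered-output flushing
    os._exit(0)         # exit immediately: no destructors, no atexit handlers
```

Because `os._exit` bypasses all cleanup, any buffered output must be flushed explicitly before calling it, as above.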

> -- 
> Skip Montanaro (skip at -

More information about the Python-list mailing list