getting MemoryError with dicts; suspect memory fragmentation

Emin Martinian emin.shopper at gmail.com
Fri Jun 4 07:00:00 EDT 2010


On Thu, Jun 3, 2010 at 10:00 PM, dmtr <dchichkov at gmail.com> wrote:
> I'm still unconvinced that it is a memory fragmentation problem. It's
> very rare.

You could be right; I'm not an expert on Python memory management.
But if it isn't memory fragmentation, why is it that I can create
lists which use up 600 more MB, yet if I try to create a dict that
needs only a couple more MB it dies? My guess is that Python dicts
want a contiguous chunk of memory for their hash table. Is there a
reason you think memory fragmentation isn't the problem? What else
could it be?
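
As a rough, CPython-specific illustration of what I mean (just a
sketch, not my actual program): sys.getsizeof() shows that a dict's
hash table grows in large jumps, so past a certain size a single
insert asks the allocator for one big contiguous block:

    import sys

    d = {}
    last = sys.getsizeof(d)
    for i in range(1000000):
        d[i] = None
        size = sys.getsizeof(d)
        if size != last:
            # Each jump means CPython dropped the old table and
            # allocated a new, larger contiguous block for it.
            print("%9d entries -> table now %12d bytes" % (len(d), size))
            last = size

If the heap is fragmented, many smallish list allocations can each
land in a separate hole, but that one big block for the dict's table
may have nowhere to fit. That would match what I'm seeing.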

> Can you give more concrete example that one can actually try to
> execute? Like:
>
> python -c "list([list([0]*xxx)+list([1]*xxx)+list([2]*xxx)
> +list([3]*xxx) for xxx in range(100000)])" &

Well, the whole point is that this is a long-running process doing
lots of allocation and deallocation, which I think is what fragments
the memory. Consequently, I can't boil it down to a one-liner like
that; the best I can do is sketch the allocation pattern, below.
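
Something like this (purely illustrative; running it alone does not
reproduce the MemoryError):

    import random

    pools = []
    for step in range(100000):
        # Allocate a list of unpredictable size...
        pools.append([0] * random.randint(10, 10000))
        # ...and free an older one, leaving differently sized
        # holes scattered through the heap.
        if len(pools) > 500:
            pools.pop(random.randrange(len(pools)))

The real process does this sort of thing over a much longer run
before the dict allocation finally fails.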

Thanks,
-Emin


