O(n^2) is bad - can it be fixed?
Courageous
jkraska1 at san.rr.com
Fri May 25 12:09:36 EDT 2001
> because millions of lists with a few things are ...
The use of the word "millions" is hyperbole, yes? I just can't
imagine that adding a 32-bit number to every list in Python is all
that big a deal; even a million lists would only cost about 4 MB more.
Do you have actual numbers here?
>We could, e.g., always round the list size up to the next higher power of 2.
>That way the capacity is implicit. We're unlikely to, though.
Is this a memory usage issue?
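To make "implicit capacity" concrete: if the allocated block is always
the next power of two at or above the length, the capacity never needs
to be stored, because it can be recomputed from the length on demand.
A toy illustration in Python (the real list object is C, and these
names are mine, not CPython's):

def next_pow2(n):
    """Smallest power of two >= max(n, 1)."""
    p = 1
    while p < n:
        p <<= 1
    return p

def implicit_capacity(length):
    # Capacity is a pure function of the length, so no per-list
    # capacity field is needed.
    return next_pow2(length)

# A 5-element list would sit in an 8-slot block; growing it past 8
# elements would force a reallocation to a 16-slot block.
assert implicit_capacity(5) == 8
assert implicit_capacity(9) == 16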
>Except that it rarely happens at all -- don't overblow this. About once each
>16 months *somebody* gets excited about this, then loses interest as they
>discover it never hurts much except in the artificial cases they've
>constructed to provoke it.
Indeed.
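(The usual provoking case is just a tight build-one-list-by-append
timing loop, something like the sketch below; whether it actually
hurts depends heavily on the platform's realloc behavior.)

import time

def build_by_append(n):
    # The "artificial" case: grow a single list one append at a time.
    result = []
    for i in range(n):          # xrange() on older Pythons
        result.append(i)
    return result

start = time.time()
build_by_append(1000000)
print("1M appends took %.2f seconds" % (time.time() - start))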
> If building
>mega-element lists via one append at a time is too slow for your app, fine,
>don't do that.
And there are alternatives; I can't be the only one to have written a list
replacement.
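One way such a replacement can work is to over-allocate geometrically,
so a long run of appends only triggers a handful of reallocations. A
rough sketch (not necessarily what mine or anyone else's looks like;
the class and method names here are made up):

class GrowList:
    """Toy list replacement: doubles its backing storage when full,
    so building it one append at a time costs O(n) amortized work."""

    def __init__(self):
        self._data = [None] * 8     # preallocated slots
        self._used = 0              # slots actually filled

    def append(self, item):
        if self._used == len(self._data):
            # Grow in one big step instead of a little at a time.
            self._data.extend([None] * len(self._data))
        self._data[self._used] = item
        self._used += 1

    def tolist(self):
        return self._data[:self._used]

# Usage: build a large list one element at a time.
g = GrowList()
for i in range(100000):
    g.append(i)
result = g.tolist()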
>As above, not a chance.
Fine, but I'm having a hard time believing that it's that big a problem. The
other day I modded my Python distro to do postmortems on all Python
dictionaries after a run of the pystone suite; this forced me to keep them
from ever being destroyed so I could check their collision statistics and
the like (they were very good). Anyway, the point: there really weren't _that_
many dictionaries in use, and this includes all the ones used internally in
Python proper. I can't imagine that there'd be all that many lists in use
in a typical Python run, either.
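For anyone who would rather count than guess, the gc module can
enumerate every container object the collector knows about, so a
quick-and-dirty census (rough, not rigorous) looks something like:

import gc

def count_lists_and_dicts():
    # Rough census of live lists and dicts tracked by the collector.
    n_lists = n_dicts = 0
    for obj in gc.get_objects():
        if type(obj) is list:
            n_lists += 1
        elif type(obj) is dict:
            n_dicts += 1
    return n_lists, n_dicts

n_lists, n_dicts = count_lists_and_dicts()
print("live lists: %d, live dicts: %d" % (n_lists, n_dicts))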
>Also not a chance, but it becomes possible if we enable Vladimir's PyMalloc
>and fiddle it accordingly.
Sure.
C//