O(n^2) is bad - can it be fixed?

Andrew Dalke dalke at acm.org
Sat May 26 09:19:12 CEST 2001


Isaac To Kar Keung wrote:
>I see this as a fundamental design problem: it ties the conceptual list to
>physical memory allocation.  The user should have their own right to say
>that the list is to step between 397 and 398 elements (or whatever number
>they are in mind), and the performance of the library shouldn't be in
>completely different orders (e.g., from constant time, linear space to
>linear time, quadratic space) in such cases.  This would make Python
>completely useless.

Since Python does this and people use it, that means Python cannot
be "completely useless."

It also isn't a fundamental design problem.  If you want to create
your own list which supports scaling growth, you can write your
own list-like class, even in Python.

For example, look at last year's version of this thread, titled
"The REALLY bad thing about Python lists..", where I expand on
a comment from Moshe and sketch an implementation of a ScalableList
which roughly doubles the list size when the list is full.
It has a linear performance hit because of function call overhead,
but if the lists are long enough to incur O(n**2) expansion
penalties, the function call overhead won't be a problem.
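(The original sketch isn't reproduced here; a minimal version of the
doubling idea, assuming a backing list preallocated with placeholder
slots and a separate logical length, might look like this:

```python
# Hypothetical sketch of a ScalableList, not Dalke's original code.
# Keep a preallocated backing list and roughly double it when full,
# so each append costs amortized O(1) rather than risking per-append
# reallocation.

class ScalableList:
    def __init__(self):
        self._data = [None] * 8   # preallocated storage slots
        self._size = 0            # number of slots actually in use

    def append(self, item):
        if self._size == len(self._data):
            # Full: double the backing storage in one step.
            self._data.extend([None] * len(self._data))
        self._data[self._size] = item
        self._size += 1

    def __len__(self):
        return self._size

    def __getitem__(self, i):
        if not 0 <= i < self._size:
            raise IndexError(i)
        return self._data[i]
```

Appending n items touches the backing list only O(log n) times for
growth, which is the point of the doubling strategy.)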

                    Andrew
                    dalke at acm.org
P.S.
Why does the python-list archive at
   http://mail.python.org/pipermail/python-list/
skip 2000-March through to 2000-September, and have astonishingly
small messages for February and October?


More information about the Python-list mailing list