O(n^2) is bad - can it be fixed?

Isaac To Kar Keung kkto at csis.hku.hk
Tue May 22 06:34:10 EDT 2001


>>>>> "Alex" == Alex Martelli <aleaxit at yahoo.com> writes:

    Alex> Not quite -- Python's sources are pretty clear about that:-).  A
    Alex> list DOES allocate a few extra entries when it grows, but it does
    Alex> NOT grow geometrically.  So IN THEORY appending should be O(N^2)
    Alex> for lists as well -- but in practice it seems N never gets big
    Alex> enough to actually measure that effect.

The relevant code:

#define ROUNDUP(n, PyTryBlock) \
        ((((n)+(PyTryBlock)-1)/(PyTryBlock))*(PyTryBlock))

static int
roundupsize(int n)
{
        if (n < 500)
                return ROUNDUP(n, 10);
        else
                return ROUNDUP(n, 100);
}

That means that when the size is below 500, a list's allocated size will
always be a multiple of 10.  When the size is larger, it is always a multiple
of 100.  Of course, if the object is already of the right size, the system
resize() does nothing.
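The rounding logic can be paraphrased in Python (just a sketch of the C
macro above, not the interpreter's actual implementation):

```python
def roundupsize(n):
    # Round n up to the next multiple of the block size,
    # mirroring the ROUNDUP macro: ((n + block - 1) / block) * block.
    block = 10 if n < 500 else 100
    return ((n + block - 1) // block) * block

assert roundupsize(7) == 10     # small lists grow in steps of 10
assert roundupsize(500) == 500  # already a multiple, unchanged
assert roundupsize(501) == 600  # large lists grow in steps of 100
```

Note the growth step is a constant (10 or 100), not a fraction of n, which
is why appending is O(N^2) in theory: growing to size N takes about N/100
reallocations, each copying up to N elements.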

Seems like magic to me... I ran the following program and didn't see *any*
slowdown, even when each of the lists grew as large as 8M elements, eating up
60M of memory...

  a = []
  b = []
  i = 0
  while 1:
    i = i + 1
    if i % 100000 == 0:
      print i
    a.append(0)
    b.append(0)

(Is it the system realloc which actually does the geometric scaling, or is
geometric scaling really unnecessary?)
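One way to probe this empirically is to time appends in fixed-size chunks
and see whether later chunks take longer than earlier ones.  A minimal
sketch (using the modern `time.perf_counter`; the function name and chunk
size here are my own choices, not from the original post):

```python
import time

def time_append_chunks(total, chunk=100000):
    # Append `total` zeros to a list, recording the wall-clock time
    # taken by each consecutive block of `chunk` appends.
    a = []
    times = []
    start = time.perf_counter()
    for i in range(1, total + 1):
        a.append(0)
        if i % chunk == 0:
            now = time.perf_counter()
            times.append(now - start)
            start = now
    return times

# If total cost were O(N^2), per-chunk times would grow roughly linearly
# with list size; if amortized O(1), they should stay roughly flat.
chunk_times = time_append_chunks(1000000)
```

On a modern CPython this stays flat, since list over-allocation was later
changed to grow proportionally to the list size.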

Regards,
Isaac.

More information about the Python-list mailing list