omer at no-log.org
Mon Jun 30 16:06:42 CEST 2008
On Monday 30 June 2008 15:13:30, Larry Bates wrote:
> Peter Otten wrote:
> > Ampedesign wrote:
> >> If I happen to have a list that contains over 50,000 items, will the
> >> size of the list severely impact the performance of appending to the
> >> list?
> > No.
> > $ python -m timeit -n20000 -s"items = []" "items.append(42)"
> > 20000 loops, best of 3: 0.554 usec per loop
> > $ python -m timeit -n20000 -s"items = [42]*10**6" "items.append(42)"
> > 20000 loops, best of 3: 0.529 usec per loop
> > http://wiki.python.org/moin/TimeComplexity
> > Peter
> So it's actually faster to append to a long list than to an empty one? That
> certainly would not have been intuitively obvious, now would it?
That test only demonstrates that it's faster to append to a 1-million-item
list than to an empty one (and this on a particular platform with a particular
Python version). Different sizes may give different results. I guess this is
because of some internal optimisations (items are probably allocated in
chunks, so sometimes append() involves a realloc, sometimes not).
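You can actually watch that chunked allocation happen. The following sketch (a CPython implementation detail; the exact byte counts vary by version and platform) prints the list's allocated size before each append: it stays flat for several appends, then jumps when the list reallocates a bigger chunk:

```python
import sys

# Track how much memory CPython has allocated for the list
# before each append. Plateaus = spare capacity, jumps = realloc.
sizes = []
items = []
for _ in range(20):
    sizes.append(sys.getsizeof(items))
    items.append(None)

print(sizes)  # flat runs separated by occasional jumps
```

Most appends land on a plateau, which is why they cost no realloc at all; only the occasional append pays for a new, larger chunk.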
So the only thing you should remember is that list.append() has an amortized
complexity of O(1), and thus should be considered a constant-time operation
for any length. Just be aware of the note:
  These operations rely on the "Amortized" part of "Amortized Worst Case".
Individual actions may take surprisingly long, depending on the history of
the container.
Also note that 50,000 items is a lot for a human being, not for a modern
computer.
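If you'd rather not trust the command line, the same comparison can be done from the timeit module directly (numbers are illustrative only; they depend on your machine and Python version):

```python
import timeit

# Compare the cost of one append on an empty list vs. a
# one-million-item list, mirroring the shell commands above.
for setup in ("items = []", "items = [42] * 10**6"):
    t = timeit.timeit("items.append(42)", setup=setup, number=20000)
    print(f"{setup:22s} -> {t / 20000 * 1e6:.3f} usec per append")
```

Whatever the absolute figures, the two timings should come out in the same ballpark, which is all O(1) amortized promises.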