[Tutor] Complexity of list operations

Danny Yoo dyoo at cs.wpi.edu
Wed Oct 24 02:17:33 CEST 2007

```
>> Use list when you want a sequence of items indexed by sequential
>> integers. Lists
>> - preserve order
>> - are fast for indexed lookup
>> - are relatively slow (O(n)) for adding, deleting and searching (though
>> for small lists this should not be a consideration)
>
> Are you sure they take O(n) for adding? I would have thought it would
> take O(1).
```

One thing to be careful of: Python "lists" are not linked lists: they
really are contiguous, array-like structures that auto-expand.  That means
an insert can be expensive, depending on where in the list we do it.

The worst-case scenario for Python's list operations (insert, delete)
happens when we apply them at the very head of the list.  Inserting an
element at the front forces us to shift all the other elements up by one
place: they've got to make room for their new neighbor!  By the same
reasoning, a delete from the front is also a linear-time operation, because
it forces all the remaining elements to shift down and fill in the gap.
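A quick sketch of the difference (the helper names here are made up for
illustration, not from the thread):

```python
def push_front(n):
    """Build a list with n inserts at the head: O(n^2) total."""
    items = []
    for i in range(n):
        items.insert(0, i)   # every existing element shifts right by one
    return items

def push_back(n):
    """Build a list with n appends at the tail: amortized O(n) total."""
    items = []
    for i in range(n):
        items.append(i)      # no shifting needed
    return items

# Front-inserting produces the same elements in reverse order.
assert push_front(5) == [4, 3, 2, 1, 0]
assert push_back(5) == [0, 1, 2, 3, 4]
```

Both functions build a five-element list, but push_front pays a shifting
cost on every insert that push_back avoids.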

(For the math heads out there: if it takes some constant 'k' time to move
an element, and we start with an empty list and spam it with N inserts at
the front, then the total time for those N operations is
0 + 1*k + 2*k + ... + (N-1)*k.  That's a "Gauss sum", and it collapses to
(N*(N-1)/2)*k.  So in this worst-case scenario, it takes time quadratic in
N to insert N elements!)
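We can check that Gauss sum numerically.  This sketch just counts element
shifts rather than timing anything:

```python
def total_shifts(n):
    """Each front-insert into a list of length i shifts i elements,
    so n front-inserts into an empty list cost 0 + 1 + ... + (n-1)."""
    return sum(range(n))

N = 1000
# Matches the closed form N*(N-1)/2 from the Gauss sum.
assert total_shifts(N) == N * (N - 1) // 2
```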

That being said, if all our operations are applied only at the end of the
list, those two operations look much better: we can argue for amortized
O(1) behavior under that particular restriction.  But I think Kent was
considering the worst-case behavior, since it's an upper bound on how bad
things can get.
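As an aside (my addition, not part of the original exchange): if fast
operations at the front matter, the standard library's collections.deque
provides O(1) appends and pops at both ends:

```python
from collections import deque

d = deque()
for i in range(5):
    d.appendleft(i)    # O(1) at the head, unlike list.insert(0, i)

# Same reversed ordering as repeated front-inserts on a list.
assert list(d) == [4, 3, 2, 1, 0]
```

The trade-off is that deque gives up the O(1) indexed lookup in the middle
that lists provide, so it suits queue-like access patterns rather than
random access.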