__delitem__ affecting performance

Karl H. no at spam.com
Thu Oct 19 10:51:28 CEST 2006


Hi,

I was performing some timing tests on a class that inherits from the 
built-in list, and got some curious results:

import timeit

class MyList(list):
    def __init__(self):
        list.__init__(self)
        self[:] = [0, 0, 0]

    def __delitem__(self, index):
        print 'deleting'

ml = MyList()

def test():
    global ml
    ml[0] += 0
    ml[1] += 0
    ml[2] += 0

t = timeit.Timer("test()", "from __main__ import test")
print t.timeit()

>> 4.11651382676

import timeit

class MyList(list):
    def __init__(self):
        list.__init__(self)
        self[:] = [0, 0, 0]

ml = MyList()

def test():
    global ml
    ml[0] += 0
    ml[1] += 0
    ml[2] += 0

t = timeit.Timer("test()", "from __main__ import test")
print t.timeit()

>> 2.23268591383

Does anybody know why defining __delitem__ causes the code to run
slower? It is never called, so I don't see why it should affect
performance. Overriding other sequence methods, such as __delslice__,
does not exhibit this behavior.
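My best guess so far (unverified) is that CPython routes item assignment
and item deletion through the same internal slot, so overriding
__delitem__ would knock ml[i] += 0 (which performs a set) off list's
fast C path too. If that's right, overriding __setitem__ should show a
similar slowdown, while leaving both alone should not. A quick sketch to
probe this (the class names are just for illustration):

```python
import timeit

# Hypothesis (unconfirmed): item assignment and item deletion share one
# C-level slot in CPython, so overriding either should slow down the
# set performed by ml[i] += 0.

class Plain(list):
    pass

class WithDel(list):
    def __delitem__(self, index):
        list.__delitem__(self, index)   # defer to the built-in behavior

class WithSet(list):
    def __setitem__(self, index, value):
        list.__setitem__(self, index, value)

def bump(ml):
    # Each += does a getitem followed by a setitem.
    ml[0] += 0
    ml[1] += 0
    ml[2] += 0

for cls in (Plain, WithDel, WithSet):
    ml = cls([0, 0, 0])
    t = timeit.timeit(lambda: bump(ml), number=100000)
    print('%-8s %.4f' % (cls.__name__, t))
```

If assignment and deletion really do share a slot, WithDel and WithSet
should time similarly, and both should come out slower than Plain.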

The speed difference doesn't really bother me, but I am curious.

I used Python 2.4 for this test.

-Karl



More information about the Python-list mailing list