Ben Wing wrote:
i do see your point. i was trying to remember what i ended up doing when i ran into this issue before, and in fact i ended up just making a new list and appending all the elements i didn't want to delete. i see you'd get N^2 behavior if you deleted lots of elements from a list, but you get the same thing currently if you call `del list[i]' a lot; it's not obvious that iter.delete() actually "papers over" the cost any more than `del list[i]' does.
No, it wouldn't paper over the cost: an iter.delete() on a list would be just as quadratic as repeated `del list[i]'. OTOH, copying into a new list and then performing slice-assignment is O(N). So performance-wise, it is (surprisingly) asymptotically better to create a new list.
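For concreteness, a rough sketch of the two approaches (the predicate and names here are just for illustration):

# Deleting in place is O(N^2): each del shifts all remaining elements.
items = list(range(1000))
i = 0
while i < len(items):
    if items[i] % 2 == 0:
        del items[i]
    else:
        i += 1

# Building a new list and slice-assigning it back is O(N) overall, and
# any other references to the same list object see the change as well.
items = list(range(1000))
items[:] = [x for x in items if x % 2 != 0]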
however, for hash tables there's no reason to use iter.delete(). you should just be able to write `del hash[x]'. is this disallowed currently?
Try for yourself:

py> x={1:2,3:4}
py> for k in x:
...     if k < 2:
...         del x[k]
...
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
RuntimeError: dictionary changed size during iteration
if so, it seems like something that should be fixed.
It's difficult to fix: deleting an item may cause a resizing/rehashing of the entire dictionary, hence the dictionary is locked while being iterated over. I *think* that one could support deletion (but not addition); this would need further investigation.

Regards,
Martin
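P.S. One workaround for now (not a change to the dictionary implementation itself) is to iterate over a snapshot of the keys, so the real dictionary is not being iterated over while you delete from it:

py> x = {1: 2, 3: 4}
py> for k in list(x.keys()):
...     if k < 2:
...         del x[k]
...
py> x
{3: 4}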