ark at research.att.com
Mon Aug 12 17:47:39 CEST 2002
>> I'd like to take a step back for a moment. I originally asked a
>> simple question:
Michael> Which did get answered in amongst all the noise, yes?
>> If I write an expression of the form s[i:j] == x, can I count on
>> the implementation optimizing it by avoiding a copy of s[i:j], so
>> that I can be assured of not having to think about finding more
>> efficient alternatives?
Michael> If you have a question that contains the phrase "can I count
Michael> on the implementation optimizing ..." then the answer is
Michael> almost certainly "no".
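For what it's worth, there is a way to do the comparison without relying on the optimizer: str.startswith accepts optional start and end arguments, so the slice never has to be materialized. A minimal sketch (the helper name slice_equals is mine, and it assumes the usual in-range, non-negative bounds; negative indices would need extra handling):

```python
def slice_equals(s, i, j, x):
    """Equivalent to s[i:j] == x for 0 <= i <= j <= len(s), without copying."""
    # startswith(prefix, start, end) compares in place; checking the
    # lengths first guarantees we match the whole slice, not a prefix.
    return (j - i) == len(x) and s.startswith(x, i, j)

s = "hello world"
print(slice_equals(s, 6, 11, "world"))  # True, no temporary slice created
print(s[6:11] == "world")               # same answer, but copies s[6:11]
```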
That's what I suspected, but I wanted to be certain. There are,
after all, some optimizations that people count on all the time.
    x = []
    for i in range(n):
        x.append(i)
Yes, I know that I could have written this as
x = range(n)
but in practice, one would append the results of a more complicated
computation. The point I'm trying to make here is that people write
code like this all the time and assume that the execution time is
going to be O(n) rather than O(n^2). As it happens, whether the
execution time is O(n) or O(n^2) depends on just how lists are
implemented, but I believe that I can count on them to be implemented
in such a way that the loop above will not go quadratic on me.
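One can actually watch the implementation detail that keeps the loop linear: CPython over-allocates list storage geometrically, so most appends do not trigger a resize. A small sketch (this leans on sys.getsizeof reporting the list object's allocated size, a CPython-specific behavior, which is of course exactly the kind of thing one is trusting here):

```python
import sys

n = 1000
x = []
sizes = []
for i in range(n):
    x.append(i)
    sizes.append(sys.getsizeof(x))

# Count how often the allocated size actually changed: with geometric
# over-allocation this is typically far fewer than n times, which is
# what makes n appends amortized O(n) overall rather than O(n^2).
resizes = sum(1 for a, b in zip(sizes, sizes[1:]) if a != b)
print(resizes)
print(x == list(range(n)))  # the loop builds the same list as range(n)
```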
Andrew Koenig, ark at research.att.com, http://www.research.att.com/info/ark