Loop performance disappearance
Remco Gerlich
scarblac-spamtrap at pino.selwerd.nl
Wed Mar 15 09:17:13 EST 2000
Mikael Johansson wrote in comp.lang.python:
> I was just wondering what the reason is for the huge performance decrease
> in loop execution when the number of steps exceeds some critical value.
> To give an example:
>
> for i in range(loops):
>     pass
>
> If loops=500 000 (space added for clarity) the "program" executes in ~2
> secs on my machine. But if loops is set to 5 000 000, the execution time
> rises to substantially more than tenfold (I terminated it after one
> minute). However, the CPU load is quite small; most of the time is
> spent swapping to disk like crazy!
That's because range(loops) returns a list. It actually has to construct a
list of that size and then walk through it. Five million entries don't fit
in your memory, so the machine starts swapping.
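You can see that at the interpreter prompt (Python 2.x; range builds the
whole list before your loop even starts):

    >>> type(range(10))
    <type 'list'>
    >>> range(5)
    [0, 1, 2, 3, 4]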
> Any ideas of the reason for this, work-arounds?
Use xrange instead of range. xrange doesn't return a list, but an object
that generates the values on demand, simulating a list. It's slightly
slower, but it doesn't need the memory.
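For example, something along these lines (a quick sketch using your
5 000 000 loop count) should run in roughly constant memory:

    # xrange yields one integer at a time instead of building
    # the whole list up front.
    for i in xrange(5000000):
        pass

The loop body never notices the difference, since xrange can be iterated
over just like a real list.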
--
Remco Gerlich, scarblac at pino.selwerd.nl
Murphy's Rules, "Which is why you get 'em so cheap":
In SPI's Universe, the sword is prohibited from use at any combat
range.