[Python-ideas] Keep free list of popular iterator objects
solipsis at pitrou.net
Sun Sep 15 14:09:53 CEST 2013
On Sun, 15 Sep 2013 13:52:39 +0200
"M.-A. Lemburg" <mal at egenix.com> wrote:
> > A best-case 15% improvement on a
> > trivial microbenchmark probably means a 0% improvement on real-world
> > workloads. Furthermore, using specialized freelists will increase
> > memory fragmentation and prevent the main allocator from returning
> > memory to the system.
> Keeping e.g. a hundred such objects in a free list shouldn't
> really affect the memory load of the Python interpreter.
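For readers following along, the free-list pattern being debated looks roughly like this in Python terms. This is purely an illustrative sketch — the actual proposal concerns CPython's C-level iterator types, and every name here (`FreeListIter`, `make_iter`, `release_iter`) is hypothetical:

```python
# Sketch of the free-list idea: recycle a small pool of dead iterator
# objects instead of asking the allocator for a fresh one each time.
# All names are hypothetical; the real proposal is about C-level CPython.
MAX_FREE = 100          # cap on pooled objects ("a hundred such objects")
_free_list = []

class FreeListIter:
    def __init__(self, seq):
        self._seq = seq
        self._idx = 0

    def __iter__(self):
        return self

    def __next__(self):
        if self._seq is None or self._idx >= len(self._seq):
            raise StopIteration
        value = self._seq[self._idx]
        self._idx += 1
        return value

def make_iter(seq):
    """Reuse a pooled iterator if one is available, else allocate."""
    if _free_list:
        it = _free_list.pop()
        it._seq, it._idx = seq, 0   # re-initialize the recycled shell
        return it
    return FreeListIter(seq)

def release_iter(it):
    """Return a finished iterator to the pool, up to MAX_FREE entries."""
    if len(_free_list) < MAX_FREE:
        it._seq = None              # drop references, keep the object alive
        _free_list.append(it)
```

The point of contention below is precisely that the pooled shells stay alive between uses.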
Well, it can. The object allocator uses 256KB arenas, so if each of
the hundred objects in the free list keeps a different arena alive, we
are talking about a 25 MB fragmentation overhead.
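The arithmetic behind that figure:

```python
# Worst-case arithmetic from the paragraph above: a hundred pooled
# objects, each being the sole live object in a distinct 256 KiB arena.
ARENA_SIZE = 256 * 1024                         # bytes per obmalloc arena
N_OBJECTS = 100
overhead_mb = N_OBJECTS * ARENA_SIZE / (1024 * 1024)
print(overhead_mb)                              # 25.0 (MiB kept alive)
```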
Yes, that's a worst-case (and unrealistic for common workloads)
overhead, but the 15% improvement is a best-case (and very unrealistic
for common workloads) performance gain :-)
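For reference, the kind of trivial microbenchmark that produces such a best-case number can be sketched like this (my own sketch, not the benchmark from the thread). It spends most of its time allocating an iterator and making a handful of __next__ calls — exactly the path a free list speeds up, and exactly why the gain may not transfer to real code:

```python
# Time many short iterations over a small list: iterator allocation
# dominates, so this flatters any iterator free-list optimization.
import timeit

setup = "data = list(range(10))"
stmt = "it = iter(data)\nfor _ in it: pass"
elapsed = timeit.timeit(stmt, setup=setup, number=100_000)
print(f"100k short iterations: {elapsed:.4f}s")
```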
> A 15% improvement isn't a lot, but such small improvements
> add up if they are consistent and the net result is an overall
> performance improvement.
I've grown skeptical that such small improvements actually "add up" to
something significant. Performance differences between CPython versions
can generally be attributed to one or two important changes (hopefully
improvements :-)) such as e.g. PEP 393 or the method lookup cache.
Anyway, if there's a non-trivial benchmark that can measure the
real-world potential of this optimization, it would help the discussion.