[Python-ideas] re.compile_lazy - on first use compiled regexes

Stefan Behnel stefan_ml at behnel.de
Sun Mar 24 08:48:26 CET 2013

Eli Bendersky, 24.03.2013 04:39:
> How about examining what the size of that re cache is, and how much memory
> it typically occupies. Perhaps this cache can be changed to fit more
> regexes?

The problem is that there is no "default workload" to measure against. If a
stupid dispatch engine automatically generates tons of regexes to compare,
say, URLs against, you can make the cache as large as you want and it won't
help. I doubt that's a serious use case, though - combining all of
them into a single regex (and then actually pre-compiling that statically
instead of relying on the cache) would be way smarter.
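To illustrate the "combine and pre-compile" idea, here is a minimal sketch
of a dispatch table folded into a single module-level pattern. The route
names and URL patterns are made up for the example; the point is that one
statically compiled regex with named groups replaces hundreds of separate
patterns churning the cache:

```python
import re

# Hypothetical routes a dispatch engine might otherwise compile one by one.
ROUTES = {
    "user": r"/users/\d+",
    "post": r"/posts/\d+/comments",
    "home": r"/",
}

# One pattern, compiled once at import time - it never touches the re cache
# again. Each alternative is a named group so the match tells us the route.
DISPATCH = re.compile(
    "|".join("(?P<%s>%s)" % (name, pat) for name, pat in ROUTES.items())
)

def route(url):
    m = DISPATCH.fullmatch(url)
    return m.lastgroup if m else None
```

Note that with alternation the first matching alternative wins, so more
specific patterns should come before more general ones.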

Apart from extreme cases like the above, a cache size of 512 sounds
*plenty* large, more like a "we don't know, so make it large" kind of
choice. Maybe measuring could actually help in making it *smaller*, but I
guess that's simply not worth it. If the space is not used, it won't grow
that large anyway.
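For anyone who wants to poke at the cache being discussed: CPython exposes
it as the private attributes re._MAXCACHE and re._cache, plus the public
re.purge(). These underscore names are implementation details and may
change between versions, so this is only a sketch for inspection:

```python
import re

# _MAXCACHE is the cache-size limit under discussion; _cache is the dict
# of compiled patterns. Both are CPython internals, not public API.
print(re._MAXCACHE)

re.purge()                    # public API: empty the compiled-pattern cache
re.match(r"\d+", "123")       # implicit compile populates the cache
print(len(re._cache))         # at least the pattern we just used
```

When the cache fills up past _MAXCACHE, old entries are evicted, which is
exactly why a pattern-generating workload gets no benefit from it.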

