[issue19859] functools.lru_cache keeps objects alive forever

Raymond Hettinger report at bugs.python.org
Wed Dec 4 10:32:47 CET 2013


Raymond Hettinger added the comment:

> Limiting the cache size is also not a solution in the 
> practical example with request that I linked to in the
> previous comment, because we can't know in advance how
> many times per request the function is going to be called, 
> picking an arbitrary number feels wrong and may lead to 
> unexpected behaviors

This suggests that you don't really want an LRU cache, which is specifically designed to limit the cache size by expelling the least recently used entry.

At its heart, the cache decorator is all about mapping fixed inputs to fixed outputs.  The memory conservation comes from the replacement strategy and an option to clear the cache entirely.
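A minimal sketch of the two mechanisms described above (the function name and maxsize here are illustrative, not from the issue):

```python
from functools import lru_cache

@lru_cache(maxsize=2)  # keep at most two distinct results
def square(n):
    return n * n

square(1)
square(2)
square(3)  # evicts the entry for 1, the least recently used

# The replacement strategy bounds memory: currsize never exceeds maxsize.
print(square.cache_info().currsize)  # 2

# The other escape hatch: clear the cache entirely.
square.cache_clear()
print(square.cache_info().currsize)  # 0
```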

The reason that my answer and Serhiy's answer don't fit your needs is that it isn't clear what you really want to do.  I think you should move this discussion to StackOverflow so others can help you tease out your actual needs and suggest appropriate solutions.  Ideally, you should start with real use cases rather than focusing on hacking up the LRU cache implementation.

----------

_______________________________________
Python tracker <report at bugs.python.org>
<http://bugs.python.org/issue19859>
_______________________________________
