Usually when I use lru_cache I don't want items randomly disappearing from the cache. Just because a key isn't referenced anywhere else doesn't mean that I want it to be automatically evicted. That is the whole point. Take this example:

from functools import lru_cache

@lru_cache
def expensive_function(a, b, c):
    return a ** b ** c

# No reference kept, so the weakref key gets evicted.
print(expensive_function(5000, 5000, 5000))
# Calls expensive_function all over again.
print(expensive_function(5000, 5000, 5000))

Doesn't your idea take away half the point of lru_cache, or am I not understanding your suggestion?


October 15, 2020 1:49 PM, "Ram Rachum" <ram@rachum.com> wrote:
Hi everyone,
For many years, I've used a `cache` decorator that I built for caching Python functions. Then `functools.lru_cache` was implemented, which is much more standard. However, as far as I know, it holds normal (strong) references to its keys rather than weak references. This means it can leak memory by keeping objects alive when nothing else references them. This often makes me reluctant to use it.
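(For readers following along, here is a minimal demonstration of the retention behavior being described; `Payload` and `process` are made-up names for illustration:)

```python
import gc
import weakref
from functools import lru_cache

class Payload:
    """A stand-in for an expensive-to-recreate object."""

@lru_cache(maxsize=None)
def process(obj):
    return id(obj)

p = Payload()
ref = weakref.ref(p)
process(p)

# Drop our only reference and force a collection.
del p
gc.collect()

# The cache still holds a strong reference to the argument,
# so the object is never collected:
print(ref() is not None)  # True
```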
What do you think about supporting weak references for keys in lru_cache?
If I remember correctly, the main difficulty was that not all keys are of a type that can be weakreffed. If I remember correctly again, I've solved this by including logic that attempts a weakref when possible, and degrades to a strong ref. What do you think about that?
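(As a rough illustration of the "weakref when possible, strong ref otherwise" logic being proposed, here is one way it could look. `weak_cache` is a hypothetical name, not part of functools, and this sketch handles positional arguments only:)

```python
import weakref
from functools import wraps

def weak_cache(func):
    """Sketch: key the cache on weak references where the argument
    type supports them, and fall back to strong references otherwise."""
    cache = {}

    @wraps(func)
    def wrapper(*args):
        key_parts = []
        for arg in args:
            try:
                # Weak refs to live, equal objects compare and hash equal,
                # so repeated calls with the same argument hit the cache.
                key_parts.append(weakref.ref(arg))
            except TypeError:
                # ints, strs, tuples, etc. can't be weakly referenced;
                # hold them strongly, as lru_cache does today.
                key_parts.append(arg)
        key = tuple(key_parts)
        if key in cache:
            return cache[key]
        result = func(*args)
        cache[key] = result
        # Evict the entry once any weakly referenced argument dies.
        for part in key_parts:
            if isinstance(part, weakref.ref):
                weakref.finalize(part(), cache.pop, key, None)
        return result

    return wrapper

@weak_cache
def double(x):
    return 2 * x

class Point:
    def __init__(self, x):
        self.x = x

@weak_cache
def get_x(p):
    return p.x

print(double(21))        # int key: strong-ref fallback
print(get_x(Point(7)))   # instance key: weak reference
```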
Thanks,
Ram.