
On Thu, Dec 31, 2020 at 8:43 AM James Oldfield <james.oldfield@cantab.net> wrote:
> My first suggestion is to replace the odd cache logic with a
> straightforward classic LRU cache. I'm hoping this is uncontroversial, but I can imagine there might be programs that somehow depend on the current behaviour. But my suspicion is that many more suffer and just no one looked hard enough to notice.
LRU has a worst-case scenario too, and implementing LRU requires extra memory. For example, functools.lru_cache() uses a doubly-linked list internally. There are LRU-like algorithms with less bookkeeping, such as clock (second chance), but every algorithm has a worst case anyway.
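
For illustration, this is what LRU's worst case looks like with functools.lru_cache (the function and access pattern here are just an example I made up):

    from functools import lru_cache

    @lru_cache(maxsize=2)
    def square(n):
        return n * n

    # Cycling over more keys than maxsize is LRU's worst case: every call
    # misses, because the key that is about to be reused was just evicted.
    for n in [1, 2, 3, 1, 2, 3]:
        square(n)
    print(square.cache_info())   # hits=0, misses=6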
Another option is a random eviction algorithm: it chooses a random entry to evict.
* Frequently used keys have a high chance of a cache hit, like LFU (least frequently used).
* When the frequently used keys change over time, the old frequently used keys are eventually evicted. (It behaves like least recently/frequently used?)
* Minimal overhead for managing keys.
So I think a random eviction algorithm is better than LRU.
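
To make the idea concrete, here is a rough sketch of a random-eviction cache. The class name and API are hypothetical, not anything that exists in functools today:

    import random

    class RandomEvictionCache:
        """Toy cache that evicts a uniformly random entry when full."""

        def __init__(self, maxsize=128):
            self.maxsize = maxsize
            self._data = {}    # key -> value
            self._keys = []    # keys in insertion order, for O(1) random choice

        def get(self, key, default=None):
            return self._data.get(key, default)

        def put(self, key, value):
            if key not in self._data and len(self._data) >= self.maxsize:
                # Pick a random victim; swap-remove keeps the key list O(1).
                i = random.randrange(len(self._keys))
                victim = self._keys[i]
                self._keys[i] = self._keys[-1]
                self._keys.pop()
                del self._data[victim]
            if key not in self._data:
                self._keys.append(key)
            self._data[key] = value

Note that unlike lru_cache there is no linked list and no reordering on every hit; a hit is just a dict lookup, which is where the "minimal overhead" point comes from.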
Off topic: Can we add random_cache to functools, like lru_cache?
Regards,