LRU cache
avi.e.gross at gmail.com
Tue Feb 14 20:40:13 EST 2023
Chris,
That is a nice decorator solution with some extra features.
We don't know whether the OP needs a more general-purpose cache, one that
can be accessed from multiple points and shared across several functions.
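If a standalone, shareable cache is what's wanted, here is a minimal sketch
built on collections.OrderedDict (the class name LRUCache and its maxsize
parameter are illustrative choices, not anything from the thread):

```python
from collections import OrderedDict

class LRUCache:
    """A minimal LRU cache object that can be shared across functions.

    When the cache grows past maxsize, the least recently used
    entry is evicted.
    """

    def __init__(self, maxsize=1000):
        self.maxsize = maxsize
        self._data = OrderedDict()

    def get(self, key, default=None):
        try:
            # Touching a key marks it as most recently used.
            self._data.move_to_end(key)
            return self._data[key]
        except KeyError:
            return default

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.maxsize:
            # Pop the oldest (least recently used) entry.
            self._data.popitem(last=False)
```

Because the object is passed around explicitly, several functions can read
from and write to the same cache, which the decorator approach doesn't give
you directly.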
-----Original Message-----
From: Python-list <python-list-bounces+avi.e.gross=gmail.com at python.org> On
Behalf Of Chris Angelico
Sent: Tuesday, February 14, 2023 5:46 PM
To: python-list at python.org
Subject: Re: LRU cache
On Wed, 15 Feb 2023 at 09:37, Dino <dino at no.spam.ar> wrote:
>
>
> Here's my problem today. I am using a dict() to implement a quick and
> dirty in-memory cache.
>
> I stop adding elements once I reach 1000 entries (a totally
> arbitrary number), but I would like something slightly more
> sophisticated to free up space for newer and potentially more
> relevant entries.
>
> I am thinking of the Least Recently Used principle, but how to
> implement that is not obvious. Before I embark on reinventing the
> wheel, is there a tool, library or smart trick that will allow me to
> remove elements with LRU logic?
>
Check out functools.lru_cache :)
ChrisA
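A quick sketch of the decorator in use (the fib function is just an
illustrative example, not anything from the thread):

```python
from functools import lru_cache

@lru_cache(maxsize=1000)
def fib(n):
    """Naive recursion, fast because repeated calls hit the cache."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(100))
print(fib.cache_info())  # hit/miss/maxsize/currsize statistics
```

When the cache holds maxsize entries, lru_cache evicts the least recently
used one automatically, which is exactly the behaviour asked about.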
--
https://mail.python.org/mailman/listinfo/python-list