One Python 2.1 idea
Alex Martelli
aleaxit at yahoo.com
Tue Dec 26 13:43:52 EST 2000
<rturpin at my-deja.com> wrote in message news:92ab9u$dou$1 at nnrp1.deja.com...
> In article <929n9c013m3 at news1.newsguy.com>,
> "Alex Martelli" <aleaxit at yahoo.com> wrote:
> > No implicit caching.
>
> Has there been some discussion about caching? If done
> well, the vast majority of references would reach
> their object's attribute or method for the cost of a
> couple of integer compares and array fetches. This
Yep -- caching is good. No doubt about that.
> should be an order of magnitude faster than string
> lookup in a dictionary. This especially benefits
> method invocation, since that costs a failed lookup
> in the instance dictionary, and a successful lookup
> in the class dictionary. That's almost enough to
> scare me away from methods!
Looking up built-in functions costs two dictionary lookups,
too (a failed one in globals, then a successful one in
builtins). But for methods it can be more than two (the
lookup may have to walk the base classes).
> Most work in a Python program involves object methods
> or attributes. (At least, this is the case in my
> programs. ;-) Increasing the speed of reference to
> these should have appreciable benefit, across the
> board. Yeah, it would make the code a bit more
> complex to maintain. But this isn't a special case --
> it is a general mechanism central to how Python works.
It does seem likely that caching lookups is where the
most across-the-board performance might be gained,
yes. The trick is finding a way to invalidate the
cache every time some dynamic manipulation means
the lookups need to happen afresh.
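Just to sketch one way the bookkeeping could go (a toy, with names I've
made up, and it only caches class-level lookups -- it ignores shadowing by
the instance's own __dict__): key the cache on a per-class version counter,
and have every dynamic change bump the counter, so stale entries simply
stop matching instead of having to be hunted down and purged:

    _cache = {}   # (class, name, version) -> attribute value

    def cached_class_lookup(obj, name):
        cls = obj.__class__
        version = getattr(cls, '_version', 0)
        key = (cls, name, version)
        try:
            return _cache[key]            # fast path: one dict hit on a small tuple key
        except KeyError:
            value = getattr(cls, name)    # slow path: the usual walk of the class and its bases
            _cache[key] = value
            return value

    def rebind_class_attr(cls, name, value):
        # every dynamic manipulation of the class must invalidate the cache;
        # here that's done by bumping the class's version counter, so stale
        # entries stop matching (they linger, but are never hit again)
        setattr(cls, name, value)
        cls._version = getattr(cls, '_version', 0) + 1

    class Widget:
        def render(self):
            return "plain"

    w = Widget()
    print(cached_class_lookup(w, 'render')(w))               # "plain", now cached
    rebind_class_attr(Widget, 'render', lambda self: "fancy")
    print(cached_class_lookup(w, 'render')(w))               # "fancy" -- the bump kept the cache honest

The nice property is that invalidation is constant-time per change; the
price is that dead entries sit in the cache until something else clears
them out.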
Alex