[Python-ideas] incremental hashing in __hash__
Stephen J. Turnbull
turnbull.stephen.fw at u.tsukuba.ac.jp
Fri Jan 6 02:07:06 EST 2017
Neil Girdhar writes:
> I don't understand this. How is providing a default method in an
> abstract base class a pessimization? If it happens to be slower
> than the code in the current methods, it can still be overridden.
How many people will bother to override it before it has bitten them?
How many will not even notice until they've lost business to slow
response times? If there is no default, the missing __hash__ is much
more obvious. Note that if there is a default, the collections are
"large", and equality comparisons are "rare", the default could impose
substantial overhead.
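To make the cost concrete, here is a minimal sketch (illustrative only,
not code from any proposal; the class names are made up) of an O(n)
default hash on a frozen-collection base class, and the kind of caching
override a user typically only writes after the default has bitten them:

    # Hypothetical base class; FrozenSequence and CachedFrozenSequence
    # are illustrative names, not from the stdlib or this thread.
    class FrozenSequence:
        def __init__(self, items):
            self._items = tuple(items)

        def __eq__(self, other):
            if not isinstance(other, FrozenSequence):
                return NotImplemented
            return self._items == other._items

        def __hash__(self):
            # The "default": walks the whole collection, O(n) per call.
            return hash(self._items)

    class CachedFrozenSequence(FrozenSequence):
        def __hash__(self):
            # The override: pay the O(n) cost once, then reuse it.
            try:
                return self._hash
            except AttributeError:
                self._hash = super().__hash__()
                return self._hash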
> > BTW, it occurs to me that now that dictionaries are versioned, in some
> > cases it *may* make sense to hash dictionaries even though they are
> > mutable, although the "hash" would need to somehow account for the
> > version changing. Seems messy but maybe someone has an idea?
> I think it's important to keep in mind that dictionaries are not versioned
> in Python. They happen to be versioned in CPython as an unexposed
> implementation detail. I don't think that such details should have any
> bearing on potential changes to Python.
AFAIK the use of the hash member for equality checking is an
implementation detail too, although the language reference does
mention that set, frozenset and dict are "hashed collections". The
basic requirements on hashes are that (1) objects that compare equal
must hash to the same value, and (2) the hash bucket must not change
over an object's lifetime (this is the "messy" aspect that probably
kills the idea -- you'd need to fix up all hashed collections that
contain the object as a key).
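As a concrete illustration of both requirements (my sketch, not code
from the thread), here is what happens when a key's hash changes while
it sits in a hashed collection:

    # (1) Objects that compare equal must hash to the same value:
    assert 1 == 1.0 and hash(1) == hash(1.0)

    # (2) A class that deliberately violates the second requirement by
    # hashing mutable state:
    class BadKey:
        def __init__(self, value):
            self.value = value

        def __eq__(self, other):
            return isinstance(other, BadKey) and self.value == other.value

        def __hash__(self):
            return hash(self.value)

    k = BadKey(1)
    s = {k}
    k.value = 2                      # hash changes in place
    print(k in s)                    # False: lookup probes the wrong bucket
    print(any(x is k for x in s))    # True: the object is still stored there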