[Python-3000] Adaptation [was:Re: Iterators for dict keys, values, and items == annoying :)]
Alex Martelli
aleaxit at gmail.com
Sun Apr 2 04:45:11 CEST 2006
On Apr 1, 2006, at 2:40 PM, Brett Cannon wrote:
> Woohoo! I get it, finally! Some comments below, but I suddenly feel
> a little less stupid since I get the whole process now! =)
Well, I guess this may be the first time I've presented the thing
properly, after all these many years -- in which case, the blame
definitely falls mostly on me ("mostly" only because SOME people got
it despite my inability to present it properly;-)...
> I am going to assume an optimization is possible where if an object
> meets a protocol it doesn't need it's contract explicitly stated. Is
OTOH, the blame for spelling "its" as "it's" -- assuming that's what
you're doing -- IS all on YOUR shoulders, Brett!!!
> that reasonable? The reason I ask is I could see an explosion of
> registration calls for objects trying to cover every protocol they
> match and then new protocols that have been defined doing for the
> objects, etc., and ending up still with some protocols missed since it
> seems to require some knowledge of what types will work.
Many optimizations are definitely possible, but I'm not sure which
ones are worthwhile.
> Take our __index__ example. I might want to use an object that I
> think can be used for indexing a sequence but I don't know about any
> specific protocols required and neither the __index__ protocol creator
> nor the object designer knew of each other and thus didn't bother with
> registering. Is it still going to work, or am I going to get an
> exception saying that the object didn't register for some protocol I
> wasn't aware of?
Well, *SOME*body must be aware of the type/protocol compliance: it
could be the protocol's author, the type's author, the application's
author, or a 4th party whose adaptations-module the application
imports -- but it cannot just happen "like black magic". Any one of
the four parties may perform the registration (or whatever other
action, equivalent e.g. to registering the identity function but
faster, gets picked as an optimization) -- but defaulting to
"everything satisfies every protocol" is not a real option.
Consider indexing: we want someseq[x] to fail (with a TypeError or
thereabouts) if x is an instance of float, decimal.Decimal, gmpy.mpf,
gmpy.mpq, str, unicode, ... -- a huge host of types that DO supply an
__int__ (so that int(x) would work), but whose __int__ has
"significant loss of information" and thus does NOT meet the
pragmatics of the protocol "being suitable as a sequence index".
A Python beginner might easily think that a float "can be used for
indexing a sequence" (with an automatic truncation to int), or even
that a string can (with an implicit str->int translation) -- but such
a beginner would be dead wrong. If you defaulted adaptation to
"everything satisfies every protocol unless otherwise stated", you'd
end up with a LOT of things which "happen to work"... but in fact can
silently fail, e.g. because a float used to index a sequence is
sometimes computed as 6.9999, truncated to 6, rather than rounded to
7 as the poor beginner expected. This being Python, *NOT* Perl, I
most definitely think we do NOT want to go there.
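To make the pitfall concrete, here's a quick sketch of mine (nothing to do
with any proposed machinery -- just plain floats and a list) of the kind of
silent failure a "truncate the float" default would invite:

seq = list('abcdefghij')

x = (0.1 + 0.7) * 10    # 7.999999999999999 on a typical IEEE-754 build
print(int(x))           # 7 -- truncation, not the 8 the beginner expected
print(seq[int(x)])      # 'h': silently the wrong element, no complaint
# seq[x] itself should keep raising TypeError rather than "work" this way

Far better to get a loud TypeError at seq[x] than the wrong element with
no exception at all.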
If you think that, most often, types will be written in full
knowledge of many existing protocols that they want to support, one
optimization might be to explicitly mark the types with an optional
__protocols__ attribute which is a set of directly supported
protocols. Instinctively I'm not enthusiastic about this optimization
-- I'm not sure it would buy you all that much. Rather, we might
want to supply an (e.g.) functools.identity built-in function, and
specialcase it when supplied as "the adapter" in registration (such a
functools.identity would have other uses too -- sure, 'lambda x: x'
is easy to write, but having a singleton identity-function might
allow other optimizations even quite apart from adaptation).
Semantically, registration suffices -- how best to optimize frequent
cases, I'm not certain.
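Just to pin down what "registration suffices" means, here's a minimal
sketch -- every name in it (register_adapter, adapt, identity, the registry
dict) is invented for illustration, none of this exists yet:

_registry = {}    # maps (type, protocol) -> adapter callable

def identity(x):
    return x

def register_adapter(klass, protocol, adapter):
    _registry[(klass, protocol)] = adapter

def adapt(obj, protocol):
    for klass in type(obj).__mro__:        # adaptation inherited by default
        adapter = _registry.get((klass, protocol))
        if adapter is not None:
            if adapter is identity:        # the special-cased fast path
                return obj
            return adapter(obj)
    raise TypeError('cannot adapt %r to %r' % (obj, protocol))

Everything else discussed below (decorators, __protocols__, and so on) is
just sugar and optimization on top of these few calls.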
> Also, if defaults are not implied, then a good way to handle
> registration of classes will need to be developed. This might be
> another place where class decorators come in handy over metaclasses
> since if inheritance comes into play then registering every subclass
> would be overkill.
One way to implement __protocols__ would be to have the metaclass
deal with it, of course -- with or without inheritance (I do believe
that it would be handier if inheritance of adaptation was in force by
default, but that's debatable, of course). But perhaps class
decorators are even nicer - I'm on the fence on the subject of class
decorators in general, particularly given the issue of whether they
should come before the class statement (like function decorators do)
or inside it (with @class or whatever) - I can see excellent
arguments on both sides.
> If we can make the default case for when an object implements a
> protocol dead-simple (if not automatic) in terms of registering or
> doing the right thing, then I can see this being really helpful.
Automatic is way too much, because you'd then have to state all the
protocols an object does NOT satisfy, which would be many, MANY more
than those it does meet. But surely, even without class decorators,
using:
class blah(bloh):
    ... body of class omitted ...
satisfies_protocols(blah, 'zip', 'zap', 'zop', 'balup')

isn't TOO complicated -- with

from functools import identity

def satisfies_protocols(klass, *protocols):
    for protocol in protocols:
        register_adapter(klass, protocol, identity)
    return klass  # currently innocuous, supports future decorator-like use;-)
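(And if class decorators do materialize, the same function is trivially
wrapped into one -- again just a sketch, assuming the register_adapter
machinery above:

def satisfies(*protocols):
    def decorate(klass):
        return satisfies_protocols(klass, *protocols)
    return decorate

so that the registration could sit right on top of the class statement as
@satisfies('zip', 'zap', 'zop', 'balup').)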
Similarly, if we supported protocol-satisfaction-inheritance by
default, a simple cannot_satisfy_protocols function could be used to
REMOVE from blah any protocols that are supported by bloh but blah
itself cannot support, maybe with a function cannot_support to be
registered as the adapter.
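Again a sketch with invented names, building on the register_adapter above:

def cannot_support(obj):
    raise TypeError('%r explicitly does not support this protocol' % (obj,))

def cannot_satisfy_protocols(klass, *protocols):
    for protocol in protocols:
        register_adapter(klass, protocol, cannot_support)
    return klass

so a subclass can opt back out of whatever its base class registered for.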
I would suggest not focusing too exclusively on these black-and-white
cases, where a protocol is supported "as is" or is definitely
unsupported. Adaptation is at its best where protocols are ALMOST
supported, and just need some little tweaking/wrapping to become
fully supported. For example, renaming methods or prebinding some
arguments, etc. These cases are unlikely to matter when somebody is
coding a new type intended to support existing protocols X, Y and Z
(it's then simplest for them to just code the new type according to
those protocols' constraints!) -- but they're going to be common
where adaptation shines.... for removal of impedance mismatches among
separately developed types and protocols. With Python's power, it's
child's play to make support for such cases into a simple callable,
too (left as an exercise for the reader, since pasta's just been
dropped and dinner's impending;-).
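For instance, a quick sketch of that sort of adapter (all names here --
LegacyLogger, the 'writer' protocol -- are invented purely for
illustration):

import functools

class LegacyLogger:
    def write_line(self, text, level):
        print('[%s] %s' % (level, text))

class LoggerAsWriter:
    # adapts LegacyLogger to a hypothetical 'writer' protocol: renames
    # write_line to write and prebinds the level argument
    def __init__(self, logger, level='INFO'):
        self._emit = functools.partial(logger.write_line, level=level)
    def write(self, text):
        self._emit(text)

# with the machinery sketched earlier, one would simply register:
# register_adapter(LegacyLogger, 'writer', LoggerAsWriter)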
Alex