
At 03:07 AM 1/20/05 -0800, Guido van Rossum wrote:
> Phillip worries that solving this with interfaces would cause a proliferation of "partial sequence" interfaces representing the needs of various libraries. Part of his proposal comes down to having a way to declare that some class C implements some interface I, even if C doesn't implement all of I's methods (as long as it implements at least one). I like having this ability, but I think this fits in the existing proposals for declaring interface conformance: there's no reason why C couldn't have a __conform__ method that claims it conforms to I even if it doesn't implement all methods. Or if you don't want to modify C, you can do the same thing using the external adapter registry.
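As a minimal sketch of the quoted suggestion, here is a class whose __conform__ claims conformance to an interface it only partly implements, under a simplified PEP 246-style adapt(); the interface I, class C, and adapt() helper are all illustrative, not a real API:

```python
class I:
    """A file-like interface; C below implements only part of it."""
    def read(self): ...
    def readlines(self): ...

class C:
    def read(self):
        return "data"

    def __conform__(self, protocol):
        # Claim conformance to I even though readlines() is missing.
        return self if protocol is I else None

def adapt(obj, protocol):
    """Simplified PEP 246-style adapt(): ask the object itself first."""
    conform = getattr(type(obj), "__conform__", None)
    if conform is not None:
        result = conform(obj, protocol)
        if result is not None:
            return result
    raise TypeError("can't adapt %r" % (obj,))

f = adapt(C(), I)
assert f.read() == "data"
```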
There are some additional things that it does in this area:

1. Avoids namespace collisions when an object has a method with the same name as one in an interface, but which doesn't do the same thing. (A common criticism of duck typing by static typing advocates; i.e., how do you know that 'read()' has the same semantics as what this routine expects?)

2. Provides a way to say that you conform, without writing a custom __conform__ method.

3. Uses the same syntax for declaring conformance as for adaptation.

4. Allows *external* (third-party) code to declare a type's conformance, which is important for integrating existing code with code that uses type declarations.
> I'd also like to explore ways of creating partial interfaces on the fly. For example, if we need only the read() and readlines() methods of the file protocol, maybe we could declare that as follows::
>
>     def foo(f: file['read', 'readlines']): ...
FYI, this is similar to the suggestion from Samuele Pedroni that led to PyProtocols having a protocols.protocolForType(file, ['read','readlines']) capability that implements this idea. However, the problem with implementing it by actually having distinct protocols is that declaring as few as seven methods results in 127 different protocol objects with conformance relationships to manage. In practice, I've also personally never used this feature, and probably never would unless it had meaning for type declarations. Also, your proposal as shown would be tedious for the declarer compared to just saying 'file' and letting the chips fall where they may.
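The 127 figure is just the count of non-empty method subsets, each of which would need its own protocol object; a quick check (the method names are illustrative):

```python
from itertools import combinations

# Every non-empty subset of seven methods would need a distinct
# partial-interface protocol: 2**7 - 1 = 127 of them.
methods = ['read', 'readline', 'readlines', 'write', 'seek', 'tell', 'close']
subsets = [combo for r in range(1, len(methods) + 1)
           for combo in combinations(methods, r)]
assert len(subsets) == 2 ** len(methods) - 1 == 127
```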
> Now on to the other problems Phillip is trying to solve with his proposal. He says, sometimes there's a class that has the functionality that you need, but it's packaged differently. I'm not happy with his proposal for solving this by declaring various adapting functions one at a time, and I'd much rather see this done without adding new machinery or declarations: when you're using adaptation, just write an adapter class and register it; without adaptation, you can still write the adapter class and explicitly instantiate it.
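A minimal sketch of the "just write an adapter class and explicitly instantiate it" approach described above; the class names here are invented for illustration:

```python
class StringSource:
    """Has the needed functionality, but packaged differently."""
    def __init__(self, text):
        self.text = text
    def get_text(self):
        return self.text

class FileLikeAdapter:
    """Hand-written adapter: presents StringSource via a read() method."""
    def __init__(self, adaptee):
        self.adaptee = adaptee
    def read(self):
        return self.adaptee.get_text()

def consume(f):
    """Some routine that only needs the read() part of the file protocol."""
    return f.read().upper()

# Without an adaptation framework, the adapter is instantiated explicitly:
assert consume(FileLikeAdapter(StringSource("hello"))) == "HELLO"
```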
In the common case (at least for my code) an adapter class has only one or two methods, but the additional code and declarations needed to make it an adapter can increase the code size by 20-50%. Using @like directly on an adapting method would result in a more compact expression in the common case.
> I have to admit that I totally lost track of the proposal when it started to talk about JetPacks. I believe that this is trying to deal with stateful adapters. I hope that Phillip can write something up about these separately from all the other issues, maybe then it's clearer.
Yes, it was for per-object ("as a") adapter state, rather than per-adapter ("has a") state. The PEP didn't try to tackle "has a" adapters at all.
> Phillip's proposal reduces the amount of boilerplate in this class somewhat (mostly the constructor and the __getattr__() method),
Actually, it wouldn't implement the __getattr__; a major point of the proposal is that when adapting to an interface, you get *only* the attributes from the interface, and of those only the ones that the adaptee has implementations for. So, arbitrary __getattr__ doesn't pass down to the adapted item.
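A sketch of that point: a dynamically built adapter exposing *only* the interface's methods, and of those only the ones the adaptee actually implements, with no blanket __getattr__ forwarding. The interface and helper below are illustrative, not the proposal's actual machinery:

```python
class IFile:
    """A two-method interface used as the adaptation target."""
    def read(self): ...
    def readlines(self): ...

def limited_adapter(obj, interface):
    """Build an adapter exposing only interface methods obj implements."""
    ns = {}
    for name in vars(interface):
        if name.startswith('_'):
            continue
        impl = getattr(obj, name, None)
        if impl is not None:
            ns[name] = staticmethod(impl)   # bound method of the adaptee
    return type('Adapter', (), ns)()

class Partial:
    def read(self):
        return "data"
    def unrelated(self):
        return "hidden"

a = limited_adapter(Partial(), IFile)
assert a.read() == "data"
assert not hasattr(a, 'readlines')   # in the interface, not on the adaptee
assert not hasattr(a, 'unrelated')   # on the adaptee, not in the interface
```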
> but apart from that it doesn't really seem to do a lot except let you put pieces of the adapter in different places, which doesn't strike me as such a great idea.
The use case for that is that you are writing a package which extends an interface IA to create interface IB, and there already exist numerous adapters to IA. As long as IB's additional methods can be defined in terms of IA, you can extend all of those adapters at one stroke. In other words, external abstract operations are exactly equivalent to stateless, lossless, interface-to-interface adapters applied transitively. But the point of the proposal was to avoid having to explain to somebody what all those terms mean, while making it easier to do such an adaptation correctly and succinctly.

One problem with using concrete adapter classes for full interfaces rather than partial interfaces is that it leads to situations like Alex's adapter diamond examples, because you end up with not-so-good adapters and few common adaptation targets. The idea of operation conformance and interface-as-namespace is to make it easier to have fewer interfaces and therefore fewer adapter diamonds. And, equally important, if you have only partial conformance you don't have to worry about claiming to have more information/ability than you actually have, which was the source of the problem in one class of Alex's examples. If you substitute per-operation adapters in Alex's PersonName example, the issue disappears, because there isn't an adapter claiming to supply a middle name that it doesn't have; that operation or attribute simply doesn't appear on the dynamic adapter class in that case.

By the way, this concept is also exactly equivalent to single-dispatched generic functions in a language like Dylan. In Dylan, a protocol consists of a set of abstract generic functions, not unlike the no-op methods in a Python interface. However, instead of adapting objects or declaring their conformance, you declare how those methods are implemented for a particular subject type, and that declaration does not have to be in the subject type's class, or in the class where the method is defined.
And when you invoke the operation, you do the moral equivalent of 'file.read(file_like_object, bytes)', rather than 'file_like_object.read(bytes)', and the right implementation is looked up by the concrete type of 'file_like_object'. Of course, that's not a very Pythonic style, so the idea of this PEP was to swap it around: the type declaration of 'file' automatically turns 'filelike.read(bytes)' into 'file_interface.read(filelike, bytes)' internally.

Pickling, copying, and such in the stdlib are already generic functions of this kind: you have a dictionary of type->implementation for each of these operations. The table is explicit and the lookup is explicit, and adaptation doesn't come into it, but this is basically the same as what you'd do in Dylan by having the moral equivalent of a 'picklable' protocol with a 'pickle(ob, stream)' generic function, and implementations declared elsewhere.

So, the concept of registering implementations of an operation in an interface for a given concrete type (which can happen from third-party code!) certainly isn't without precedent in Python. Once you look at it through that lens, everything in the proposal that doesn't deal with stateful adaptation is just a straightforward way to flip from 'operation(ob, ...)' to 'ob.operation(...)', where the original 'operation()' is a type-registered operation like 'pickle', but created automatically for existing operations like file.read. So if it "does too much", it's only because that one concept of a type-dispatched function in Python provides for many possibilities. :)
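A minimal sketch of such a type-registered operation, in the spirit of the pickle/copy dispatch tables described above; the registry, operation, and class names are invented for illustration:

```python
# Explicit type -> implementation table, one per operation, as with the
# stdlib's pickle/copy registries.
registry = {}

def register(typ, impl):
    """Registration can come from third-party code, far from either side."""
    registry[typ] = impl

def read(obj, *args):
    """Generic 'read' operation: dispatch on the concrete type of obj,
    the moral equivalent of Dylan's single-dispatched generic function."""
    return registry[type(obj)](obj, *args)

class Buffer:
    """A concrete type that never heard of the 'read' operation."""
    def __init__(self, data):
        self.data = data

# Third-party declaration of how 'read' works for Buffer:
register(Buffer, lambda self: self.data)

# 'read(ob)' here is what a type declaration would flip into 'ob.read()'.
assert read(Buffer("abc")) == "abc"
```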