Re: PEP 246 and Protocols (Was: Sneaky 'super' instances)
At 03:42 PM 6/12/03 +0100, Moore, Paul wrote:
From: Phillip J. Eby [mailto:pje@telecommunity.com]
Open protocols solve the chicken and egg problem by allowing one to make declarations about third-party objects.
OK, I've tried to put together a simple example of "what I expect" and I find I can't. I want to continue to write code which "just assumes" that it gets the types it needs - I don't want to explicitly state that I need specific interface types - that feels like type declaration and Java. Your IFile example reinforces that, both in terms of its naming convention, and in the assumption that there *is* a single, usable, "file" protocol. I know it was my example, but I pointed out later in the same message that I really sometimes wanted seekable readline-only files, and other times block read/write (potentially unseekable) files.
Expecting library writers to declare interface "classes" for every subtle variation of requirements seems impractical. Expecting the requirements to be *documented* is fine - it's having a concrete class which encapsulates them that I don't see happening - no-one would ever look to see if there was already a "standard" interface which said "I have readline, seek, and tell" - they'd just write a new one. There goes any hope of reuse. (This may be what you see as a "chicken and egg" problem - if so, I see it as a "hopelessly unrealistic expectations" problem instead, because it's never going to happen...)
Even if it's as simple as saying this:

    class ReadlineSeekAndTell(Interface):
        """You must have readline, seek, and tell if you pass this to me"""
        advise(protocolIsSubsetOf=[IFile])

Or, even more briefly, using Samuele's way:

    RST = subproto(IFile, ['readline', 'seek', 'tell'])
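To make the "subset protocol" idea concrete, here is a toy illustration -- not the real PyProtocols or Samuele's API. A protocol is modeled as nothing more than a set of required method names, and `subproto` carves out the few methods a particular routine actually needs; `provides` is an invented duck-typed conformance check:

```python
# Toy model: a protocol is just a set of required method names.
IFile = frozenset({'read', 'readline', 'seek', 'tell', 'write', 'close'})

def subproto(base, wanted):
    """Declare a protocol that is a subset of `base`."""
    wanted = frozenset(wanted)
    if not wanted <= base:
        raise ValueError('not a subset of the base protocol')
    return wanted

RST = subproto(IFile, ['readline', 'seek', 'tell'])

def provides(obj, proto):
    # duck-typed conformance: every required method must be callable
    return all(callable(getattr(obj, name, None)) for name in proto)

import io
print(provides(io.StringIO('a\nb\n'), RST))   # True
print(provides('just a string', RST))         # False
```

The point of the toy is that a "seekable readline-only file" protocol costs one line to declare, so the proliferation Paul worries about need not be expensive.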
On the other hand, expecting callers to write stuff to adapt existing classes to the requirements of library routines is (IMHO) a non-issue. I think that's what PEP 246 was getting at in the statement that "The typical Python programmer is an integrator" which you quote. It's common to write
    class wrap_source:
        def __init__(self, source):
            self.source = source
        def read(self, n=1024):
            return self.source.get_data(n)
    lib_fn(wrap_source(my_source))
So PEP 246 is trying to make writing that sort of boilerplate easier (in my view). The *caller* should be calling adapt(), not the callee.
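As a concrete reference point, the lookup order PEP 246 gives adapt() can be sketched in plain Python. This is a simplified toy of the PEP's negotiation (it omits the registry and Liskov-violation handling), and `Source`, `IReadable`, and `wrap_source` are invented here for illustration:

```python
_marker = object()

def adapt(obj, protocol, default=_marker):
    # 1. The object may already satisfy the protocol outright.
    if isinstance(protocol, type) and isinstance(obj, protocol):
        return obj
    # 2. Ask the object: "can you conform to this protocol?"
    conform = getattr(type(obj), '__conform__', None)
    if conform is not None:
        result = conform(obj, protocol)
        if result is not None:
            return result
    # 3. Ask the protocol: "can you adapt this object?"
    adapt_hook = getattr(protocol, '__adapt__', None)
    if adapt_hook is not None:
        result = adapt_hook(obj)
        if result is not None:
            return result
    if default is _marker:
        raise TypeError("can't adapt %r to %r" % (obj, protocol))
    return default

class Source:
    """Provides get_data(), but not the read() a library wants."""
    def get_data(self, n):
        return 'x' * n

class wrap_source:
    def __init__(self, source):
        self.source = source
    def read(self, n=1024):
        return self.source.get_data(n)

class IReadable:
    """The protocol itself knows how to wrap get_data()-style sources."""
    @staticmethod
    def __adapt__(obj):
        if hasattr(obj, 'read'):
            return obj
        if hasattr(obj, 'get_data'):
            return wrap_source(obj)

print(adapt(Source(), IReadable).read(4))   # xxxx
```

With this in place the caller's boilerplate collapses to `adapt(my_source, IReadable)`, and the wrapping knowledge lives with the protocol rather than at every call site.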
That depends. It's common in both Zope and Twisted for framework code to do the Zope or Twisted equivalents of 'adapt()'. But yes, it is also common to state that some function in effect requires an already adapted object.
<snip> Which is appropriate is basically down to which came first, string or file. But both suffer from the problem of putting all the knowledge in one location (and using a type switch as well).
Right. That's what the point of the open protocols system is. The problem is that in every Python interface system I know of (except PyProtocols), you can't declare that package A, protocol X, implies package B, protocol Y, unless you are the *author* of package B's protocol Y. But if package A's author used an open protocol to define protocol X, then you can use PyProtocols today to say that it implies package B's protocol Y, as long as protocol Y can be adapted to the declaration API. Translation: anybody who defines protocols for their packages today using PyProtocols, will be able to have third parties define adapters to interfaces provided by Zope or Twisted, or that are defined using Zope or Twisted interfaces.
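The "open protocol" property Phillip describes -- a third party declaring that package A's protocol X implies package B's protocol Y -- can be modeled with a small toy registry. Everything here (`OpenProtocol`, the package names) is invented to illustrate the idea and is not the PyProtocols API:

```python
class OpenProtocol:
    """A protocol object that owns its own adapter registry."""
    def __init__(self, name):
        self.name = name
        self.adapters = {}    # concrete type -> adapter factory
        self.implied = []     # protocols this protocol implies

    def declare_implication(self, target):
        # Third-party glue code may call this without being the author
        # of either protocol: "providing me implies providing target".
        self.implied.append(target)
        for cls, factory in self.adapters.items():
            target.register(cls, factory)

    def register(self, cls, factory):
        self.adapters[cls] = factory
        # propagate to implied protocols (no cycle handling in this toy)
        for target in self.implied:
            target.register(cls, factory)

    def adapt(self, obj):
        factory = self.adapters.get(type(obj))
        if factory is None:
            raise TypeError("can't adapt %r to %s" % (obj, self.name))
        return factory(obj)

# Package A defines X, package B defines Y, and a *third party* links them:
X = OpenProtocol('pkg_a.X')
Y = OpenProtocol('pkg_b.Y')
X.declare_implication(Y)             # "anything providing X provides Y"
X.register(str, lambda s: s.upper())
print(Y.adapt('hi'))                 # HI -- the adapter declared for X serves Y
```

The key design point is that the registry lives on the protocol object itself, so no single package has to be the gatekeeper for all declarations.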
The third option, which is the "external registry" allows a *user* of the string and file "libraries" (bear with me here...) to say how he wants to make strings look like files:
<snip> So I see PEP 246 as more or less entirely a class based mechanism. It has very little to do with constraining (or even documenting) the types of function parameters.
PEP 246 specifically said it wasn't tackling that case, however, and I personally don't feel that a singleton of such broad scope is practical; it seems more appropriate to make protocols responsible for their own adapter registries, which is what PyProtocols (and to a lesser extent Twisted) do.
Of course, a library writer can define interface classes, and the adaptation mechanism will allow concrete classes to be made to "conform" to those interfaces, but it's not necessary. And given the general apathy of the Python community towards interfaces (at least as far as I see) I don't imagine this being a very popular use pattern.
When designing PyProtocols I did some research on the type-sig, and other places... I got the distinct impression that lots of Pythonistas like interfaces, as long as they're abstract base classes. PyProtocols allows that. But, a protocol can also be a pure symbol, e.g.:

    import protocols

    myInterface = protocols.Protocol()

Voila. There's your interface. Let the user register what they will. Sure, it doesn't *document* anything that isn't stated by its name, but that's your users' problem. ;) At least they can import 'yourpackage.myInterface' and register adapters now.
And describing a PEP 246 style facility in terms of interfaces could be a big turn-off. (This applies strongly to your PyProtocols code - I looked for something like the pep246.register function I suggested above, but I couldn't manage to wade through all the IThis and IThat stuff to find it, if it was there...)
Unfortunately, I biased the reference documentation towards explaining the architecture and what you need to do to integrate with the framework itself, even though 90% of the actual intended audience of PyProtocols will never need to do anything like that. In a sense, you could say that the current reference manual is much more of an embedding and extending manual, instead of being the Python tutorial.
Once again, I apologise if I've missed your point. But I hope I've explained what I see as the point of PEP 246, and where your proposal is going off in a different direction.
I understand what you're saying, although I'm not sure how you interpreted PEP 246 that way, when it explicitly states that it *doesn't* cover case "e". And I didn't diverge from PEP 246 in that respect. What I effectively said is, "well, let's make it so that case "e" is irrelevant, because there are plenty of __adapt__-capable protocols, and because you can register the relationships between them." Thus, the solution to case "e" is effectively: Don't use built-in types as protocols!
But the fact remains, that neither PEP 246 nor PyProtocols has any need to be in the core, or even in the standard library (yet). Wide usage could change that. Usage in something that is destined for the standard library could, too, but that one could go the other way (the "something" gets rejected because it depends on PEP246/PyProtocols).
Right, that was why I was asking.
PS This is *way* off-topic for python-dev by now. I suggest that we leave things here. I don't have anything else worth adding...
I suppose we could always resurrect the types-sig.... Although, as Tim Peters has frequently suggested, nothing ends discussion more quickly than having an implementation available. :)
Paul,

The goal of PEP 246 is to create a loose coupling between various framework components to facilitate the 'adapter' pattern as described in the GOF pattern book. It is also motivated by Microsoft's QueryInterface, which when used intelligently is extremely powerful.

The primary audience of PEP 246 are component/library developers who wish for their components to interoperate between frameworks. Unless you are writing components and wish for them to work across, say, Zope, Twisted, and Webware, using a single code base in a manageable manner, then you may not understand the PEP. In particular, PEP 246 will be successful if your average programmer does not even know about adapt! He simply plugs component X into framework Y and it works... automagically.

For example, I have a 'Flow' manager (a generator based cooperative multitasking library). Currently it is dependent upon Twisted; however, there are really only two protocols I depend upon: (a) wrapping an exception for later delivery, and (b) some sort of event loop for scheduling delayed execution. With PEP 246 I'd write an interface flow.IFailure and flow.IReactor, and provide clean ways for it to work with Twisted. Then, if some other framework wanted to "re-use" my component, it'd be easy: they'd simply add a __conform__ to their objects, and __adapt__ my objects. This is far more powerful than requiring my code or their code to directly know about each other and the interface mechanism that each of them uses.

I hope this helps,

Clark
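A hedged sketch of the Flow scenario just described. All names here (IReactor, ForeignReactor, Adapter) are invented for illustration; the point is only that a foreign framework's object can conform via __conform__ without either code base importing the other's interface machinery:

```python
class IReactor:
    """flow's protocol: something with call_later(delay, fn)."""

class ForeignReactor:
    """Another framework's event loop; it spells the method callLater."""
    def callLater(self, delay, fn):
        fn()                      # run immediately in this toy

    def __conform__(self, protocol):
        # The foreign framework opts in to flow's protocol here,
        # without flow ever importing the foreign framework.
        if protocol is IReactor:
            foreign = self
            class Adapter:
                def call_later(self, delay, fn):
                    return foreign.callLater(delay, fn)
            return Adapter()

events = []
reactor = ForeignReactor().__conform__(IReactor)   # what adapt() would do
reactor.call_later(0, lambda: events.append('ran'))
print(events)   # ['ran']
```

In a full PEP 246 setting the last lines would read `adapt(ForeignReactor(), IReactor)`; the direct `__conform__` call just makes the negotiation visible.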
Clark C. Evans wrote:
The goal of PEP 246 is to create a loose coupling between various framework components to facilitate the 'adapter' pattern as described in the GOF pattern book.
The adapter pattern is a much simpler idea: writing a wrapper and possibly using multiple inheritance to link an implementation to a foreign interface. PEP 246 is a higher-level protocol for seeking and applying pre-existing wrappers. Of course you know that, but the summary above tends to equate 246 with the GOF adapter pattern.
The primary audience of PEP 246 are component/library developers who wish for their components to interoperate between frameworks.
This is why I don't think adapt() should be a builtin. adapt.adapt() is not any harder to call than random.random() or glob.glob().
Unless you are writing components and wish for them to work across, say, Zope, Twisted, and Webware, using a single code base in a manageable manner, then you may not understand the PEP. In particular, PEP 246 will be successful if your average programmer does not even know about adapt! He simply plugs component X into framework Y and it works... automagically.
Yep. All it takes is one user to contribute code to support a protocol. My expectation is that the adapter code will (for the most part) be contributed by a sophisticated component user rather than the original supplier of the component. After all, if the supplier thought the foreign protocol was important, they would have supplied it in the first place. The nice thing about 246 is that the adapter code can be contributed without altering the original component code (OCP: open for extension, closed for modification).

I don't see any significant downside to 246. The code is simple enough. It is useful in at least some cases and provides some support for interoperability. I don't think it makes writing adapters any easier -- it does nothing to mitigate the problem of widely differing protocols with different underlying assumptions. Getting complete, bug-free, general-purpose adapters will still remain a hard problem. C'est la vie.

So, my question is whether there is any reason not to adopt 246 right away? AFAICT, there is no competing proposal and nothing that would be harmed by adoption. What's all the fuss about?

Raymond Hettinger
At 12:50 AM 6/13/03 -0400, Raymond Hettinger wrote:
So, my question is whether there is any reason not to adopt 246 right away? AFAICT, there is no competing proposal and nothing that would be harmed by adoption. What's all the fuss about?
Guido has said (and I agree) that if Python includes adapt(), then the Python standard library should use it. However, using it requires that the standard library have some standard way (at least for it) of defining what a "protocol" is. And that's where the holdup is, because Guido has not yet decided what kind of interfaces or protocols Python should have, and he won't until he's had enough experience with some of the approaches. He's said he doesn't think Zope-style interfaces are going to be the way he wants to go, and that something more ABC-like (Abstract Base Class) would be preferable.

PyProtocols was my attempt to show that a PEP 246 mechanism can actually be pretty agnostic about what kind of interface objects are used, just like the '+' operator is agnostic about what object types it's used with. However, as I've realized from this thread, PyProtocols still doesn't actually solve the real issue for Guido: even though PyProtocols doesn't care about what kind of interfaces are used, Guido *still* has to decide what kind the standard library should use, and I've only *added* another approach for him to evaluate!

Ah well. :) On the bright side, I think PyProtocols can alleviate *one* of his concerns, which was that having a Python-included interface type would make other interface types (e.g. Zope interfaces) "second-class citizens". That is, I've demonstrated that it is possible to have a "protocol protocol", thus allowing different protocol or interface types to exist, even if they have no implementation in common (e.g. Twisted, Zope, and PyProtocols).

At this point, though, I don't see any reason to rush PEP 246 into the standard library, since there is now a packaged distribution that will presumably become a de facto standard for anybody who wants to use PEP 246, as it's the only PEP 246 implementation that you can use "out of the box" without having to write __conform__ or __adapt__ methods.
The only reason I even brought up the question here, was in relation to the adaptation-based documentation toolkit that I intend to write.
On Fri, Jun 13, 2003 at 12:50:53AM -0400, Raymond Hettinger wrote:
| PEP 246 is a higher level protocol for seeking and applying
| pre-existing wrappers.

Right.

| > The primary audience of PEP 246 are component/library developers
| > who wish for their components to interoperate between frameworks.
|
| This is why I don't think adapt() should be a builtin.
| adapt.adapt() is not any harder to call than random.random()
| or glob.glob().

Makes sense, although it would be nice if it were in the standard library so that people could start using it. There really don't seem to be any challengers; and without it in the standard library it will be harder to facilitate adoption. For example, Twisted relies only upon code in Twisted and in Python itself, and no other third-party modules save pyOpenSSL. While PyProtocols may be out there, it may be harder to encourage Twisted to include it as part of their library without assigning copyright to Glyph. (Or am I mistaken?)

| Yep. All it takes is one user to contribute code to support a protocol.
| My expectation is that the adapter code will (for the most part) be
| contributed by a sophisticated component user rather than the original
| supplier of the component. After all, if the supplier thought the
| foreign protocol was important, they would have supplied it in
| the first place.

Perhaps. Although I think people in Twisted would include adapters for Zope or Webware and vice versa.

| The nice thing about 246 is that the adapter code
| can be contributed without altering the original component code
| (OCP: open for extension, closed for modification).

Exactly.

| I don't see any significant downside to 246. The code is simple
| enough. It is useful in at least some cases and provides some
| support for interoperability. I don't think it makes writing
| adapters any easier -- it does nothing to mitigate the problem
| of widely differing protocols with different underlying assumptions.
| Getting complete, bug-free general-purpose adapters will still
| remain a hard problem. C'est la vie.

Yes.

| So, my question is whether there is any reason not to adopt 246
| right away? AFAICT, there is no competing proposal and nothing
| that would be harmed by adoption. What's all the fuss about?

On Fri, Jun 13, 2003 at 08:19:59AM -0400, Phillip J. Eby wrote:
| Guido has said (and I agree) that if Python includes adapt(), then the
| Python standard library should use it.

When adapt() emerged, iterators were being formulated, and what motivated me to write up the PEP is that I thought that what the function iter() provides (adapting an object to an iterator protocol) needed to be more generic. There are plenty of places within Python where it could be applied.

| PyProtocols was my attempt to show that a PEP 246 mechanism can
| actually be pretty agnostic about what kind of interface objects
| are used, just like the '+' operator is agnostic about what
| object types it's used with.

Yes, but it also removes some requirements on an interface mechanism. For instance, most of the interface strategies involve some sort of list of interfaces, which could be changed easily without altering the inheritance hierarchy. This is needed to be more 'dynamic' than what inheritance could tell you. With adapt(), this requirement is less important, since someone could always implement __conform__ or __adapt__ to get the dynamic behavior they want. The risk of these other, non-inheritance mechanisms is that while they may be more dynamic, they may be brittle (or ugly, IMHO).

| Ah well. :) On the bright side, I think PyProtocols can alleviate
| *one* of his concerns, which was that having a Python-included interface
| type would make other interface types (e.g. Zope interfaces) "second-class
| citizens".
| That is, I've demonstrated that it is possible to have a
| "protocol protocol", thus allowing different protocol or interface
| types to exist, even if they have no implementation in common (e.g.
| Twisted, Zope, and PyProtocols).

Guido could start with a 'common denominator' between what Twisted and Zope do. That is, he could define a protocol as a class/object which uses inheritance, but not specify the other mechanisms by which one implements the interface, leaving this out in the open. For example, he could create an interfaces.py which looks like...

    # inside interfaces.py

    class Iterable(object):
        def next():
            """ returns the next value in the iteration,
                or raises StopIteration to finish up """

Clearly someone could inherit from interfaces.Iterable to signal that their object is iterable. However, they could also use other mechanisms to signal that they are iterable, namely an __iter__ method... and in this case,

    def iter(obj):
        return adapt(obj, Iterable)

Best,

Clark
Could this approach work:

- use regular class inheritance for static requirements
- use adapt() for dynamic or custom needs (as shown below)
- let specific use cases further refine the requirements

This whole 'Interface' issue has been in the works for over three years now, and potentially great interoperability between frameworks and components is being lost. For example, why not just have an 'interfaces.py' in the standard library? The interface for iterators could be something like...
    # inside interfaces.py

    class Iterable(object):
        def next():
            """ returns the next value in the iteration,
                or raises StopIteration to finish up """

        def __adapt__(cls, obj):
            try:
                return obj.__iter__()
            except AttributeError:
                pass

Then, iter() could be defined something like...

    def iter(obj):
        return adapt(obj, Iterable)

Best,

Clark
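Clark's interfaces.py sketch can be made runnable end-to-end with a minimal adapt(). The two-step lookup and the name `my_iter` are illustrative simplifications, not the PEP's full algorithm:

```python
class Iterable(object):
    """Protocol: either inherit from this, or expose __iter__."""
    @classmethod
    def __adapt__(cls, obj):
        try:
            return obj.__iter__()   # fall back to the iterator protocol
        except AttributeError:
            return None

def adapt(obj, protocol):
    # minimal negotiation: here we only ask the protocol side
    result = protocol.__adapt__(obj)
    if result is None:
        raise TypeError("can't adapt %r to %r" % (obj, protocol))
    return result

def my_iter(obj):
    # the stdlib iter() recast as adaptation, per the sketch above
    return adapt(obj, Iterable)

print(list(my_iter([1, 2, 3])))   # [1, 2, 3]
```

Objects without `__iter__` (an int, say) fall through to the TypeError, which mirrors what the built-in iter() does today.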
At 09:44 PM 6/13/03 +0000, Clark C. Evans wrote:
Could this approach work:
- use regular class inheritance for static requirements
- use adapt() for dynamic or custom needs (as shown below)
- let specific use cases further refine the requirements
This whole 'Interface' issue has been in the works for over three years now, and potentially great interoperability between frameworks and components is being lost. For example, why not just have an 'interfaces.py' in the standard library? The interface for iterators could be something like...
This is moot. Guido has made it clear that until he's had time to think over all the issues, *no* interface implementation is getting in the standard library.

Meanwhile, PyProtocols is available under the PSF license, so any framework that wants to bundle it in their distribution is free to do so. If it becomes popular, I suppose there will need to be a versioning mechanism so a framework doesn't overwrite a newer version with an older one. That's the only issue that would even be marginally resolved by inclusion in the standard library, apart from the implied "blessing".

However, I don't know that the "blessing" is even required at this point. If PyProtocols (perhaps with the enhancements suggested by Samuele and others here) doesn't catch on, I'm not sure what else *could* make PEP 246 popular. Or to put it another way, if it doesn't catch on by its own merits, why decorate the standard library with interfaces to support it?

This is why one of my requirements for PyProtocols was that it absolutely had to support declaring adapters for built-in types, and why another one was that it should be possible for two people to independently declare equivalent or overlapping protocols, and then have a third party glue them together. PEP 246 without an "open protocols" protocol doesn't support this latter requirement. That, I believe, is the heart of the "chicken and egg" problem for PEP 246 as a standalone protocol. It's sort of like having 'iter()' but not having 'next()', or maybe the other way around. :)
On Thursday 12 June 2003 06:48 pm, Phillip J. Eby wrote: ...
effectively said is, "well, let's make it so that case "e" is irrelevant, because there are plenty of __adapt__-capable protocols, and because you can register the relationships between them." Thus, the solution to case "e" is effectively:
Don't use built-in types as protocols!
Speaking as PEP 246's co-author and original champion of "case [e]", I think you've hit the nail on the head. Yes, singling out protocols as the "new kid on the block" and giving each protocol the responsibility of handling the registry of its adapters definitely strikes me as the right architecture, better than the vaguely defined one I had in mind (too vague to put in the PEP), which as you say / imply elsewhere would necessarily end up with a major case of chronic singletonitis. I'm not sure about some other details of PyProtocols (guess I need to study the whole package very, very carefully) but this general architectural idea gets a wholehearted +1 from me. Alex
participants (4)

- Alex Martelli
- Clark C. Evans
- Phillip J. Eby
- Raymond Hettinger