
At 05:56 PM 2/26/04 -0500, Glyph Lefkowitz wrote:
On Thu, 2004-02-26 at 14:59, Phillip J. Eby wrote:
Yep, just subclass InterfaceClass and add a __call__ method that calls back to adapt() and you'd be all set. One other benefit to PyProtocols is that you can track what interfaces an *instance* supports, independent of what its class supports.
Would you consider adding this to PyProtocols directly?
I have considered it. Unfortunately, it runs counter to one PyProtocols use case I'd like to support: the ability to use an abstract base class as a protocol. That is, somebody should be able to instantiate non-abstract subclasses of a protocol. While I don't personally use this except for some stuff in peak.util.fmtparse, I've had some previous indications that Guido may favor an ABC (Abstract Base Class) style for interfaces when they eventually land in Python. So, I'd like to avoid making it impossible to instantiate interface subclasses. Again, this would be easily solved by a Twisted-specific subclass of protocols.InterfaceClass, and I don't see doing it as necessarily a bad thing for either Twisted or PyProtocols, although it may be that it should be considered simply a transitional, backward-compatibility thing.
If we're going to maintain a component system as part of Twisted, I would like to at least get the benefit of having full control over the component system within Twisted. I don't want to have some people using PyProtocols and others using PyProtocols+TwistedHacks.
I'm not sure I follow you here. Private extensions to PyProtocols' base classes are certainly permitted and encouraged, to provide additional features needed by particular frameworks, as long as the core interfaces are respected (e.g. 'IOpenProtocol'). PyProtocols itself offers several specialized protocol implementations, including Protocol, InterfaceClass, Variation, URIProtocol, SequenceProtocol, and so on.
There are a lot of features I'd add to Twisted's component system if I had time, such as:
- implicit context-dependent location of closest running t.a.service services by interface
- interface-based context (moshez's context trick)
- automatic generation of interfaces from any class
- IComponentized
- context / interface based log separations
Actually, if I understand correctly, these mostly sound like things outside PyProtocols' scope. peak.binding and peak.config implement some of this stuff by defining various interfaces they want, and using PyProtocols to adapt things to those interfaces. But that's entirely independent of PyProtocols itself. In other words, PyProtocols isn't tightly coupled to a component architecture, but is instead a convenient base for building component architectures.
And of course, integrating foom's string-based components would be great too. There is a lot of friction even to add something like this to Twisted. I imagine that adding something like this to PyProtocols, with potentially more projects out there depending on the exact specifics of all its semantics, would be even worse.
The other alternative is to add a bunch of specific hacks to PyProtocols that get loaded only when Twisted gets loaded, which could potentially introduce compatibility problems with other PyProtocols-using code, which would sort of invalidate the whole point of using a common components system in the first place.
Again, if I understand correctly, these are services you would build atop or using PyProtocols, and wouldn't need to extend PyProtocols for. Let's take a specific example: you mentioned locating nearby services by interface. peak.config does this with two interfaces, IConfigSource and IConfigKey:

    class IConfigSource(Interface):
        """Something that can be queried for configuration data"""

        def _getConfigData(forObj, configKey):
            """Return a value of 'configKey' for 'forObj' or 'NOT_FOUND'

            Note that 'configKey' is an 'IConfigKey' instance and may
            therefore be a 'PropertyName' or an 'Interface' object."""

    class IConfigKey(Interface):
        """Configuration data key, used for 'config.lookup()' et al

        Configuration keys may be polymorphic at registration or lookup
        time.  IOW, when looking up a configuration key, you can search
        multiple values that would imply the key being looked for.  And,
        when registering a value for a configuration key, the key can
        supply alternate keys that it should be registered under.  Thus,
        an 'IConfigKey' is never itself directly used as a key, only the
        values supplied by its 'registrationKeys()' and 'lookupKeys()'
        methods are used.  (However, those values must themselves be
        adaptable to 'IConfigKey', and they must be usable as dictionary
        keys.)
        """

        def registrationKeys(depth=0):
            """Iterate over (key,depth) pairs to be used when registering"""

        def lookupKeys():
            """Iterate over keys that should be used for lookup"""

Now, some of the things we want to use as configuration keys are interfaces. So, we use PyProtocols to declare adapters that implement IConfigKey for all the interface types we work with (PyProtocols interfaces, Zope interfaces, and Twisted interfaces). And of course we declare our "placeful" components as implementing IConfigSource. Now, the API to look something up amounts to iterating over parent components, adapting them to IConfigSource, and passing them the needed configuration key, after having first adapted it to IConfigKey.
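The lookup walk just described can be sketched in plain Python (this is NOT the real peak.config API; `find_config`, `Component`, and the `parent` attribute are hypothetical stand-ins, and the adapt-to-IConfigSource / adapt-to-IConfigKey steps are elided for brevity):

```python
NOT_FOUND = object()  # sentinel: "this source has no value for the key"

class Component:
    """A toy 'placeful' component holding config data and a parent link."""
    def __init__(self, parent=None, **config):
        self.parent = parent
        self._config = config

    def _getConfigData(self, for_obj, key):
        # In the real system the caller would first adapt self to
        # IConfigSource and the key to IConfigKey.
        return self._config.get(key, NOT_FOUND)

def find_config(component, config_key):
    """Walk up the parent chain, asking each component for the key."""
    current = component
    while current is not None:
        value = current._getConfigData(component, config_key)
        if value is not NOT_FOUND:
            return value
        current = current.parent
    raise LookupError("no value found for %r" % (config_key,))
```

For example, a child component with no value of its own would pick up `db_url` from its root: `find_config(Component(Component(None, db_url="sqlite://")), "db_url")`.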
Now, let's say somebody wants to use a Twisted placeful component with PEAK or vice versa... they just declare adapters to whichever interfaces aren't implemented, and life is good. There's absolutely no reason you'd need to change PyProtocols for this, nor would you need to make Twisted's interfaces for component lookup anything like PEAK's. Heck, if somebody wanted to, they could declare an IConfigKey adapter for Twisted's interface class, and then PEAK would be able to use all its existing lookup and component-binding APIs using Twisted interfaces as keys. And that's *without* Twisted using PyProtocols. :)
Then we have the issue of the PyProtocols dependency; dependency management can be quite hairy on Windows.
Indeed. I've begun correspondence with Bob off-list about the possibility of me helping to port PIMP/PackMan to other platforms, though.
Parts of PyProtocols do strike me as dangerous, evil, and overcomplex, though :) In particular,
http://peak.telecommunity.com/protocol_ref/proto-implication.html
The idea of passing numeric priorities for different implementations has always seemed deeply wrong to me.
I understand. However, I have yet to encounter a situation where I've actually used or needed to use it. And, I consider passing explicit depth arguments to PyProtocols a hack: a side-effect of the implementation rather than an intentional design feature.
I have worked with one or two systems like this in the past (some MUD code in C++) where, inevitably, someone will want to make the 'real' default adapter for interface X; then someone else will want to make the 'really real' default adapter. Different developers will eventually keep trying to write comparison methods that leapfrog each other backwards to get to the correct result for last-most-from-greater-than-everything, which turns into a bug-ridden mess (and it's never really clear who should be "winning" this race to be the final overrider anyway).
Right, the proper solution is to:

1) have one protocol per use-case
2) not reuse a protocol for other use cases that aren't an exact match
3) use transitive adaptation, so that similar use cases can reuse adapters, while still allowing special cases to declare a direct adapter that overrides the transitive one

So far, this strategy has worked out very well for me, without need for explicit depth declaration. By the way, though, I don't know what you mean by "default adapter". Do you mean the adapter for type 'object', perhaps? I can't imagine why somebody would care about that, though.
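Point 3 can be sketched with a toy registry (hypothetical names throughout; PyProtocols does this with real protocol objects and a more sophisticated graph, not string keys): registered adapters A→B and B→C compose into an A→C path, but a directly declared A→C adapter always wins.

```python
from collections import deque

adapters = {}  # (source protocol, destination protocol) -> adapter factory

def declare_adapter(src, dst, factory):
    adapters[(src, dst)] = factory

def adapt_transitive(obj, src, dst):
    """Adapt obj from src to dst, composing adapters if necessary."""
    if (src, dst) in adapters:
        # A direct declaration overrides any transitive chain.
        return adapters[(src, dst)](obj)
    # Breadth-first search for the shortest chain of adapters.
    queue = deque([(src, obj)])
    seen = {src}
    while queue:
        proto, value = queue.popleft()
        for (a, b), factory in list(adapters.items()):
            if a == proto and b not in seen:
                adapted = factory(value)
                if b == dst:
                    return adapted
                seen.add(b)
                queue.append((b, adapted))
    raise TypeError("no adapter path from %s to %s" % (src, dst))
```

With adapters declared for "A"→"B" and "B"→"C", adapting from "A" to "C" goes through the chain; declaring "A"→"C" afterwards short-circuits it.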
More importantly I don't really understand if that's in fact what the 'depth' value is used for, because my eyes glaze over halfway through the above web page :) PyProtocols feels to me like it's gone out of even the upper levels of abstraction that the Twisted team is used to inhabiting, straight into the Zopeosphere... 4000 lines of code related to components, whereas t.p.components has 300? It worries me.
Sigh. I frequently regret having undertaken to document PyProtocols so thoroughly. :( Ironically, I did so thinking it would encourage developers of other frameworks to come on board. ;)

Seriously, though, the only major differences I know of between PyProtocols and Twisted's interfaces are:

1) Transitive adaptation is automatic
2) Instances may implement interfaces, and can participate in their adaptation
3) Interface declarations are inherited from all base classes, not just the first

So, the "upper levels of abstraction" have solely to do with levels that you don't need to know about in order to simply *use* the system. (See also http://peak.telecommunity.com/protocol_ref/module-protocols.twistedsupport.h... for some of the minor details of current PyProtocols/Twisted compatibility.)

But anyway, where the heck are you getting 4000 lines from?

    pje@pje ~/PyProtocols $ wc -l src/protocols/*.py
        9 src/protocols/__init__.py
      205 src/protocols/adapters.py         # adapter bases, adapter arithmetic, default adapters
      369 src/protocols/advice.py           # stuff to support interface declarations
      287 src/protocols/api.py              # API methods
      246 src/protocols/classic.py          # support for declaring interfaces on Python built-ins
      328 src/protocols/generate.py         # auto-generated interfaces, like protocolForURI
      410 src/protocols/interfaces.py       # the actual Protocol/Interface implementations, and interfaces for them
      205 src/protocols/twisted_support.py  # This would go away, or at least get shorter... :)
      121 src/protocols/zope_support.py     # But we're probably stuck with this. :)
     2180 total

And those files have lots of whitespace and documentation lines in 'em. Maybe you're including the tests?
Maybe I'm alone in these concerns, though. Does anyone else feel that depending on PyProtocols would increase the learning curve for Twisted even more? Or is this common knowledge in the Python community that could be leveraged to actually make the curve shallower? I can certainly learn to wrap my head around the whole thing if nobody else has trouble with it :)
Stop trying to understand it and just use it. ;) Seriously, though, I think that Twisted's Interface/Adapter How-To is the kind of documentation I *should* have written for PyProtocols. The PyProtocols docs were biased towards proving that its framework is consistent and useful for all sorts of tricky edge cases and advanced interface usages, instead of just saying, "here, this is what you can do". In particular, I wanted to show Jim Fulton that adaptation is more fundamental than interface implementation, because you can represent the latter as a special case of the former. (i.e., the NO_ADAPTER_NEEDED adapter.) So, as you can see right there, writing docs with Jim Fulton in mind as the intended audience is where I made my big mistake. :)
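That NO_ADAPTER_NEEDED point can be made concrete with a toy registry (hypothetical names; this is an illustration of the idea, not PyProtocols' code): declaring "X implements I" just registers the identity function as the X→I adapter, so implementation becomes a special case of adaptation and lookup needs only one mechanism.

```python
def NO_ADAPTER_NEEDED(obj):
    return obj  # the object already provides the interface as-is

registry = {}  # (class, interface) -> adapter factory

def declare_implements(cls, iface):
    # "cls implements iface" == "the identity function adapts cls to iface"
    registry[(cls, iface)] = NO_ADAPTER_NEEDED

def declare_adapter(cls, iface, factory):
    registry[(cls, iface)] = factory

def adapt(obj, iface):
    factory = registry.get((type(obj), iface))
    if factory is None:
        raise TypeError("can't adapt %r to %s" % (obj, iface))
    return factory(obj)
```

Adapting an object whose class was declared to implement the interface returns the object itself, unwrapped; adapting anything else goes through its registered adapter factory.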
There might be some other issues that could come up, but I'm definitely willing to try to "adapt" to Twisted's needs in these areas, especially if it means I could get rid of PyProtocols' wrapper code and wrapping tests for Twisted's existing interface class. :)
This is clearly something that we need to talk about more. However many silly disagreements about design I can come up with, a common component system would be beneficial to everyone involved. Are you coming to PyCon? :)
No, but I have an IRC client. :)