[Python-Dev] Sneaky 'super' instances

Samuele Pedroni pedronis@bluewin.ch
Thu, 12 Jun 2003 17:35:41 +0200


At 08:55 12.06.2003 -0400, Phillip J. Eby wrote:
>While it's true that you can't use 'adapt(ob,file)' meaningfully, there's 
>nothing stopping you from saying 'declareImplementation(file,[IFile])' and 
>'adapt(ob,IFile)'.  (And the user can say 
>'adviseObject(ob,provides=[IFile]) before passing you 'ob'.)  The point of 
>the open protocol mechanism is that you can make these declarations 
>relative to a specific package's interfaces, *without* requiring there to 
>be a common standard.  (Because you can define the relationships between 
>different packages' interfaces.)

The problem is that, especially in the context of the std library or simple
Python usage (vs. complex frameworks and apps, or software-engineering-heavy
situations in general), what I would call micro-protocols are quite often used:

if IFile means having all the methods of file, that is probably too much, 
even if there are adapters that can complete very partial implementations. 
And for really general adoption, even asking people to declare things for 
quick scripts is too much.

Consider this fragment:

   if hasattr(obj,'read'):
       ... use obj.read ...
   elif hasattr(obj,'value'):
       ...
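Spelled out as a runnable function (get_text, Holder, and the string payloads
are invented here purely for illustration), the hasattr pattern looks like:

```python
from io import StringIO

def get_text(obj):
    # Duck-typed dispatch: prefer a file-like 'read' method,
    # fall back to a plain 'value' attribute.
    if hasattr(obj, 'read'):
        return obj.read()
    elif hasattr(obj, 'value'):
        return obj.value
    else:
        raise TypeError("object supports neither 'read' nor 'value'")

class Holder:
    def __init__(self, value):
        self.value = value

print(get_text(StringIO("from a file-like")))  # -> from a file-like
print(get_text(Holder("from a value")))        # -> from a value
```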

if I want to rewrite it using adapt:

aobj = adapt(obj,?)

so? IFile is too much, and inventing names for all the micro-subsets of IFile 
also seems rather problematic. Especially considering the client's situation:

import ...protocol stuff... # boilerplate

class MyObj:

   def read(self):
     ...

declareImplementation(MyObj,[?]) # boilerplate

So we have the conflicting needs of libraries (the std lib could be such a 
lib) that want to use and support adaptation even for micro-protocols and 
provide the gained flexibility, and of users that want to use them for quick 
scripts and hate boilerplate. And we have the taxonomy issue.

I already sketched a possible solution to this problem in an earlier mail to 
the list. I have read a bit of the PyProtocols documentation, and I think my 
idea is probably implementable on top of PyProtocols.

The idea is to allow for anonymous subsetting of protocols:

subproto(IFile,'read'[,...])

There would be a cache to guarantee the uniqueness of these objects.
Given that in

subproto(Protocol, method-names)

len(method-names) would be small (<5), so even if there are exponentially 
many subsets of method-names :) it should be tractable to automatically 
register all the natural relationships:
subproto(IFile,'read','write') > subproto(IFile,'read')

Obviously these anonymous sub-protocols would be instantiated on demand, 
not when IFile is created.

The final point then is that there should be a [global?] flag to choose 
between strict and loose checking for these protocols:

ReadProto=subproto(IFile,'read')

aobj = adapt(obj,ReadProto)

in loose mode obj would directly adapt to ReadProto if it has a 'read' 
attribute.
In strict mode some adaptation path would have to have been registered 
explicitly, e.g.:

class MyObj:

   def read(self):
     ...

declareImplementation(MyObj,[subproto(IFile,'read')])
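To make the strict/loose distinction concrete, here is a self-contained toy
model; adapt, declareImplementation, and the STRICT flag below are
hypothetical stand-ins for whatever PyProtocols would actually provide:

```python
class Protocol:
    # minimal stand-in; not the real PyProtocols API
    def __init__(self, name, methods):
        self.name, self.methods = name, frozenset(methods)

def subproto(proto, *names):
    # hypothetical anonymous subset (no caching here, for brevity)
    return Protocol('%s[%s]' % (proto.name, ','.join(sorted(names))), names)

IFile = Protocol('IFile', ['read', 'write', 'seek'])
ReadProto = subproto(IFile, 'read')

STRICT = False            # the [global?] flag from the text
_declared = {}            # class -> list of protocols it declares

def declareImplementation(cls, protos):
    _declared.setdefault(cls, []).extend(protos)

def adapt(obj, proto, default=None):
    if proto in _declared.get(type(obj), []):
        return obj        # explicitly declared: fine in either mode
    if not STRICT and all(hasattr(obj, m) for m in proto.methods):
        return obj        # loose mode: a structural match suffices
    return default

class MyObj:
    def read(self):
        return 'data'

# loose mode: MyObj adapts to ReadProto just by having 'read'
assert adapt(MyObj(), ReadProto) is not None

STRICT = True
assert adapt(MyObj(), ReadProto) is None        # no declaration yet
declareImplementation(MyObj, [ReadProto])
assert adapt(MyObj(), ReadProto) is not None    # now explicitly declared
```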

regards.