[Python-Dev] "Monkey Typing" pre-PEP, partial draft

Phillip J. Eby pje at telecommunity.com
Sun Jan 16 05:51:59 CET 2005


This is only a partial first draft, but the Motivation section nonetheless 
attempts to briefly summarize huge portions of the various discussions 
regarding adaptation, and to coin a hopefully more useful terminology than 
some of our older working adjectives like "sticky" and "stateless" and 
such.  And the specification gets as far as defining a simple 
decorator-based syntax for creating operational (prev. "stateless") and 
extension (prev. "per-object stateful") adapters.

I stopped when I got to the API for declaring volatile (prev. "per-adapter
stateful") adapters, and for enabling them to be used with type
declarations, because Clark's post on his revisions-in-progress seems to 
indicate that this can probably be handled within the scope of PEP 246 
itself.  As such, this PEP should then be viewed more as an attempt to 
formulate how "intrinsic" adapters can be defined in Python code, without 
the need to manually create adapter classes for the majority of 
type-compatibility and "extension" use cases.  In other words, the 
implementation described herein could probably become part of the front-end 
for the PEP 246 adapter registry.

Feedback and corrections (e.g. if I've repeated myself somewhere, spelling, 
etc.) would be greatly appreciated.  This uses ReST markup heavily, so if 
you'd prefer to read an HTML version, please see:

http://peak.telecommunity.com/DevCenter/MonkeyTyping

But I'd prefer that corrections/discussion quote the relevant section so I 
know what parts you're talking about.  Also, if you find a place where a 
more concrete example would be helpful, please consider submitting one that 
I can add.  Thanks!


PEP: XXX
Title: "Monkey Typing" for Agile Type Declarations
Version: $Revision: X.XX $
Last-Modified: $Date: 2003/09/22 04:51:50 $
Author: Phillip J. Eby <pje at telecommunity.com>
Status: Draft
Type: Standards Track
Python-Version: 2.5
Content-Type: text/x-rst
Created: 15-Jan-2005
Post-History: 15-Jan-2005


Abstract
========

Python has always had "duck typing": a way of implicitly defining types by
the methods an object provides.  The name comes from the saying, "if it walks
like a duck and quacks like a duck, it must *be* a duck".  Duck typing has
enormous practical benefits for small and prototype systems.  For very large
frameworks, however, or applications that comprise multiple frameworks, some
limitations of duck typing can begin to show.

This PEP proposes an extension to "duck typing" called "monkey typing", that
preserves most of the benefits of duck typing, while adding new features to
enhance inter-library and inter-framework compatibility.  The name comes from
the saying, "Monkey see, monkey do", because monkey typing works by stating
how one object type may *mimic* specific behaviors of another object type.

Monkey typing can also potentially form the basis for more sophisticated type
analysis and improved program performance, as it is essentially a simplified
form of concepts that are also found in languages like Dylan and Haskell.  It
is also a straightforward extension of Java casting and COM's QueryInterface,
which should make it easier to represent those type systems' behaviors within
Python as well.


Motivation
==========

Many interface and static type declaration mechanisms have been proposed for
Python over the years, but few have met with great success.  As Guido has
said recently [1]_:

     One of my hesitations about adding adapt() and interfaces to the core
     language has always been that it would change the "flavor" of much of
     the Python programming we do and that we'd have to relearn how to
     write good code.

Even for widely-used Python interface systems (such as the one provided by
Zope), interfaces and adapters seem to require this change in "flavor", and
can require a fair amount of learning in order to use them well and avoid
various potential pitfalls inherent in their use.

Thus, spurred by a discussion on PEP 246 and its possible use for optional
type declarations in Python [2]_, this PEP is an attempt to propose a semantic
basis for optional type declarations that retains the "flavor" of Python, and
prevents users from having to "relearn how to write good code" in order to use
the new features successfully.

Of course, given the number of previous failed attempts to create a type
declaration system for Python, this PEP is an act of extreme optimism, and it
will not be altogether surprising if it, too, ultimately fails.  However, if
only because the record of its failure will be useful to the community, it is
worth at least making an attempt.  (It would also not be altogether surprising
if this PEP results in the ironic twist of convincing Guido not to include
type declarations in Python at all!)

Although this PEP will attempt to make adaptation easy, safe, and flexible,
the discussion of *how* it will do that must necessarily delve into many
detailed aspects of different use cases for adaptation, and the possible
pitfalls thereof.

It's important to understand, however, that developers do *not* need to
understand more than a tiny fraction of what is in this PEP, in order to
effectively use the features it proposes.  Otherwise, you may gain the
impression that this proposal is overly complex for the benefits it
provides, even though virtually none of that complexity is visible to the
developer making use of the proposed facilities.  That is, the value of this
PEP's implementation lies in how much of this PEP will *not* need to be thought
about by a developer using it!

Therefore, if you would prefer an uncorrupted "developer first impression"
of the proposal, please skip the remainder of this Motivation and proceed
directly to the `Specification`_ section, which presents the usage and
implementation.  However, if you've been involved in the Python-Dev discussion
regarding PEP 246, you probably already know too much about the subject to have
an uncorrupted first impression, so you should instead read the rest of this
Motivation and check that I have not misrepresented your point of view before
proceeding to the Specification.  :)


Why Adaptation for Type Declarations?
-------------------------------------

As Guido acknowledged in his optional static typing proposals, having type
declarations check argument types based purely on concrete type or conformance
to interfaces would stifle much of Python's agility and flexibility.  However,
if type declarations are used instead to *adapt* objects to an interface
expected by the receiver, Python's flexibility could in fact be *improved* by
type declarations.

PEP 246 presents a basic implementation model for automatically finding an
appropriate adapter so that one type can conform to the interface of another.
However, in recent discussions on the Python developers' mailing list, it
came out that there were many open issues about what sort of adapters would
be useful (or dangerous) in the context of type declarations.

PEP 246 was originally proposed for an explicit adaptation model where an
``adapt()`` function is called to retrieve an "adapter".  So, in this model the
adapting code potentially has access to both the "original" object and the
adapted version of the object.  Also, PEP 246 permitted either the caller of a
function or the called function to perform the adaptation, meaning that the
scope and lifetime of the resulting adapter could be explicitly controlled in a
straightforward way.

By contrast, type declarations would perform adaptation at the boundary between
caller and callee, making it impossible for the caller to control the adapter's
lifetime, or for the callee to obtain the "original" object.

Many options for reducing or controlling these effects were discussed.  By and
large, it is possible for an adapter author to address these issues with due
care and attention.  However, it also became clear from the discussion that
most persons new to the use of adaptation are often eager to use it for things
that lead rather directly to potentially problematic adapter behaviors.

Also, by the very nature of ubiquitous adaptation via type declarations, these
potentially problematic behaviors can spread throughout a program, and just
because one developer did not create a problematic adaptation, it does not mean
he or she will be immune to the effects of those created by others.

So, rather than attempt to make all possible Python developers "relearn how to
write good code", this PEP seeks to make the safer forms of adaptation easier
to learn and use than the less-safe forms.  (Which is the reverse of the
current situation, where less-safe adapters are often easier to write
than some safer ones!)



Kinds of Adaptation
-------------------

Specifically, the three forms of type adaptation we will discuss here are:

Operational Conformance
     Providing operations required by a target interface, using the
     operations and state available on the adapted type.  This is the
     simplest category of adaptation, because it introduces no new
     state information.  It is simply a specification of how an instance
     of one type can be adapted to act as if it were an instance of
     another type.

Extension/Extender
     The same as operational conformance, but with additional required state.
     This extra state, however, "belongs" to the original object, in the sense
     that it should exist as long as the original object exists.  An extension,
     in other words, is intended to extend the capabilities of the original
     object when needed, not to be an independently created object with its
     own lifetime.  Each time an adapter is requested for the target interface,
     an extension instance with the "same" state should be returned.

Volatile/View/Accessory
     Volatile adapters are used to provide functionality that may require
     multiple independent adapters for the same adapted object.  For example,
     a "view" in a model-view-controller (MVC) framework can be seen as a
     volatile adapter on a model, because more than one view may exist for the
     same model, with each view having its own independent state (such as
     window position, etc.).

Volatile adaptation is not an ideal match for type declaration, because it
is often important to explicitly control when each new volatile adapter is
created, and to whom it is being passed.  For example, in an MVC framework
one would not normally wish to pass a model to methods expecting views,
and wind up having new views created (e.g. windows opened) automatically!

Naturally, there *are* cases where opening a new window for some model object
*is* what you want.  However, using an implicit adaptation (via type
declaration) also means that passing a model to *any* method expecting a view
would result in this happening.  So, it is generally better to have the methods
that desire this behavior explicitly request it, e.g. by calling the PEP 246
``adapt()`` function, rather than having it happen implicitly by way of a type
declaration.

So, this PEP seeks to:

1. Make it easy to define operational and extension adapters

2. Make it possible to define volatile adapters, but only by explicitly
   declaring them as such in the adapter's definition.

3. Make it possible to have a type declaration result in creation of a
   volatile adapter, but only by explicitly declaring in the adapter's
   definition that type declarations are allowed to implicitly create
   instances.

By doing this, the language can gently steer developers away from
unintentionally creating adapters whose implicit behavior is difficult to
understand, or is not as they intended, by making it easier to do safer
forms of adaptation, and suggesting (via declaration requirements) that other
forms may need a bit more thought to use correctly.


Adapter Composition
-------------------

One other issue that was discussed heavily on Python-Dev regarding PEP 246
was adapter composition.  That is, adapting an already-adapted object.
Many people spoke out against implicit adapter composition (which was referred
to as transitive adaptation), because it introduces potentially unpredictable
emergent behavior.  That is, a local change to a program could have unintended
effects at a more global scale.

Using adaptation for type declarations can produce unintended adapter
composition.  Take this code, for example::

     def foo(bar: Baz):
         whack(bar)

     def whack(ping: Whee):
         ping.pong()

If a ``Baz`` instance is passed to ``foo()``, it is not wrapped in an adapter,
but is then passed to ``whack()``, which must then adapt it to the ``Whee``
type.  However, if an instance of a different type is passed to ``foo()``, then
``foo()`` will receive an adapter to make that object act like a ``Baz``
instance.  This adapter is then passed to ``whack()``, which further adapts
it to a ``Whee`` instance, thereby composing a second adapter onto the first,
or perhaps failing with a type error because there is no adapter available
to adapt the already-adapted object.  (There can be other side effects as well,
such as when attempting to compare implicitly adapted objects or use them as
dictionary keys.)

Therefore, this proposal seeks to have adaptation performed via type
declarations avoid implicit adapter composition, by never adapting an
operational or extension adapter.  Instead, the original object will be
retrieved from the adapter, and then adapted to the new target interface.
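
To make this concrete, here is a minimal runnable sketch (all names are
hypothetical illustrations, not part of this proposal's API) of how
type-declaration adaptation can unwrap an existing adapter before re-adapting,
so that adapters never stack:

```python
class Adapter:
    """Base class for generated operational/extension adapters."""
    def __init__(self, subject):
        self.__subject__ = subject

class AsBaz(Adapter):
    pass

class AsWhee(Adapter):
    pass

def adapt(obj, target):
    # toy stand-in for the PEP 246 adapt(): wrap unless already suitable
    return obj if isinstance(obj, target) else target(obj)

def declared_adapt(obj, target):
    """Adaptation as performed for a type declaration: never wrap a wrapper."""
    if isinstance(obj, Adapter):
        obj = obj.__subject__   # retrieve the original object first
    return adapt(obj, target)
```

Passing an ``AsBaz`` wrapper to a parameter declared ``AsWhee`` thus yields a
single adapter around the original object, not an adapter of an adapter.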

Volatile adapters, however, are independent objects from the object they
adapt, so they must always be considered an "original object" in their own
right.  (So, volatile adapters are also more volatile than other adapters with
respect to transitive adaptation.)  However, since volatile adapters
must be declared as such, and require an additional declaration to allow them
to be implicitly created, the developer at least has some warning that their
behavior will be more difficult to predict in the presence of type
declarations.


Interfaces vs. Duck Typing
--------------------------

An "interface" is generally recognized as a collection of operations that an
object may perform, or that may be performed on it.  Type declarations are then
used in many languages to indicate what interface is required of an object
that is supplied to a routine, or what interface is provided by the routine's
return value(s).

The problem with this concept is that interface implementations are typically
expected to be complete.  In Java, for example, you say that your class
implements an interface unless you actually add all of the required methods,
even if some of them aren't needed in your program yet.

A second problem with this is that incompatible interfaces tend to proliferate
among libraries and frameworks, even when they deal with the same basic
concepts and operations.  Just the fact that people might choose different
names for otherwise-identical operations makes it considerably less likely that
two interfaces will be compatible with each other!

There are two missing things here:

1. Just because you want to have an object of a given type (interface) doesn't
   mean you will use all possible operations on it.

2. It'd be really nice to be able to map operations from one interface onto
   another, without having to write wrapper classes and possibly having to
   write dummy implementations for operations you don't need, and perhaps
   can't even implement at all!

On the other hand, the *idea* of an interface as a collection of operations
isn't a bad idea.  And if you're the one *using* the interface's operations,
it's a convenient way to do it.  This proposal seeks to retain this useful
property, while ditching much of the "baggage" that otherwise comes with it.

What we would like to do, then, is allow any object that can perform operations
"like" those of a target interface, to be used as if it were an object of the
type that the interface suggests.

As an example, consider the notion of a "file-like" object, which is often
referred to in the discussion of Python programs.  It basically means, "an
object that has methods whose semantics roughly correspond to the same-named
methods of the built-in ``file`` type."

It does *not* mean that the object must be an instance of a subclass of
``file``, or that it must be of a class that declares it "implements the
``file`` interface".  It simply means that the object's *namespace* mirrors the
*meaning* of a ``file`` instance's namespace.  In a phrase, it is
"duck typing": if it walks like a duck and quacks like a duck, it must *be*
a duck.

Traditional interface systems, however, rapidly break down when you attempt
to apply them to this concept.  One repeatedly used measuring stick for
proposed Python interface systems has been, "How do I say I want a file-like
object?"  To date, no proposed interface system for Python (that this author
knows about, anyway) has had a good answer for this question, because they
have all been based on completely implementing the operations defined by an
interface object, distinct from the concrete ``file`` type.

Note, however, that this alienation between "file-like" interfaces and the
``file`` type leads to a proliferation of incompatible interfaces being
created by different packages, each declaring a different subset of the total
operations provided by the ``file`` type.  This then leads further to the need
to somehow reconcile the incompatibilities between these diverse interfaces.

Therefore, in this proposal we will turn both of those assumptions upside down,
by proposing to declare conformance to *individual operations* of a target
type, whether the type is concrete or abstract.  That is, one may define the
notion of "file-like" without reference to any interface at all, by simply
declaring that certain operations on an object are "like" the operations
provided by the ``file`` type.

This idea will (hopefully) better match the uncorrupted intuition of a Python
programmer who has not yet adopted traditional static interface concepts, or
of a Python programmer who rebels against the limitations of those concepts (as
many Python developers do).  And, the approach corresponds fairly closely to
concepts in other languages with more sophisticated type systems (like Haskell
typeclasses or Dylan protocols), while still being a straightforward extension
of more rigid type systems like those of Java or Microsoft's COM (Component
Object Model).

This PEP directly competes with PEP 245, which proposes a syntax for
Python interfaces.  If some form of this proposal is accepted, it
would be unnecessary for a special interface type or syntax to be added to
Python, since normal classes and partially or completely abstract classes will
be routinely usable as interfaces.  Some packages or frameworks, of course, may
have additional requirements for interface features, but they can use
metaclasses to implement such enhanced interfaces without impeding their
ability to be used as interfaces by this PEP's adaptation system.


Specification
=============

For "file-like" objects, the standard library already has a type which may
form the basis for compatible interfacing between packages; if each package
denotes the relationship between its types' operations and the operations
of the ``file`` type, then those packages can accept other packages' objects
as parameters declared as requiring a ``file`` instance.

However, the standard library cannot contain base versions of all possible
operations for which multiple implementations might exist, so different
packages are bound to create different renderings of the same basic operations.
For example, one package's ``Duck`` class might have ``walk()`` and ``quack()``
methods, where another package might have a ``Mallard`` class (a kind of duck)
with ``waddle()`` and ``honk()`` methods.  And perhaps another package might
have a class with ``moveLeftLeg()`` and ``moveRightLeg()`` methods that must
be combined in order to offer an operation equivalent to ``Duck.walk()``.

Assuming that the package containing ``Duck`` has a function like this (using
Guido's proposed optional typing syntax [2]_)::

     def walkTheDuck(duck: Duck):
         duck.walk()

This function expects a ``Duck`` instance, but what if we wish to use a
``Mallard`` from the other package?

The simple answer is to allow Python programs to explicitly state that an
operation (i.e. function or method) of one type has semantics that roughly
correspond to those of an operation possessed by a different type.  That is,
we want to be able to say that ``Mallard.waddle()`` is "like" the method
``Duck.walk()``.  (For our examples, we'll use decorators to declare this
"like"-ness, but of course Python's syntax could also be extended if desired.)

If we are the author of the ``Mallard`` class, we can declare our compatibility
like this::

     class Mallard(Waterfowl):

         @like(Duck.walk)
         def waddle(self):
             # walk like a duck!

This is an example of declaring the similarity *inside* the class to be
adapted.  In many cases, however, you can't do this because you don't control
the implementation of the class you want to use, or even if you do, you don't
wish to introduce a dependency on the foreign package.

In that case, you can create what we'll call an "external operation", which
is just a function that's declared outside the class it applies to.  It's
almost identical to the "internal operation" we declared inside the ``Mallard``
class, but it has to call the ``waddle()`` method, since it doesn't also
implement waddling::

     @like(Duck.walk, for_type=Mallard)
     def duckwalk_by_waddling(self):
         self.waddle()

Whichever way the operation correspondence is registered, we should now be
able to successfully call ``walkTheDuck(Mallard())``.  Python will then
automatically create a "proxy" or "adapter" object that wraps the ``Mallard``
instance with a ``Duck``-like interface.  That adapter will have a ``walk()``
method that is just a renamed version of the ``Mallard`` instance's
``waddle()`` method (or of the ``duckwalk_by_waddling`` external operation).

For any methods of ``Duck`` that have no corresponding ``Mallard`` operation,
the adapter will omit that attribute, thereby maintaining backward
compatibility with code that uses attribute introspection or traps
``AttributeError`` to control optional behaviors.  In other words, if we have
a ``MuteMallard`` class that has no ability to ``quack()``, but has an
operation corresponding to ``walk()``, we can still safely pass its instances
to ``walkTheDuck()``, but if we pass a ``MuteMallard`` to a routine that
tries to make it ``quack``, that routine will get an ``AttributeError``.
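
The behavior just described can be sketched in present-day Python.  The
``like`` decorator, ``adapt_to`` function, and registry below are illustrative
stand-ins for the proposed machinery (and omit the adapter-class caching
discussed in the next section):

```python
# hypothetical sketch of the proposal, not an existing API
_registry = {}

def like(target_op, for_type):
    """Declare that the decorated function can stand in for `target_op`
    when the adapted object is an instance of `for_type`."""
    def decorator(func):
        _registry[(target_op, for_type)] = func
        return func
    return decorator

def adapt_to(obj, target):
    """Build an adapter class exposing only the declared operations."""
    ops = {}
    for name, op in vars(target).items():
        if name.startswith('_'):
            continue
        impl = _registry.get((op, type(obj)))
        if impl is not None:
            # bind the *adapted* object, not the adapter, as `self`
            ops[name] = (lambda f:
                         lambda adapter, *a, **kw:
                             f(adapter.__subject__, *a, **kw))(impl)
    if not ops:
        raise TypeError("no operations of %s declared for %s"
                        % (target.__name__, type(obj).__name__))
    adapter_cls = type(type(obj).__name__ + 'As' + target.__name__, (), ops)
    adapter = adapter_cls()
    adapter.__subject__ = obj
    return adapter

class Duck:
    def walk(self): raise NotImplementedError
    def quack(self): raise NotImplementedError

class Mallard:
    def waddle(self):
        return "waddling"

@like(Duck.walk, for_type=Mallard)
def duckwalk_by_waddling(self):
    return self.waddle()
```

Here ``adapt_to(Mallard(), Duck)`` yields an adapter whose ``walk()`` calls
``waddle()``, which has no ``quack`` attribute at all, and which raises
``TypeError`` for a type with no declared operations.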


Adapter Creation
----------------

Note, however, that even though a different adapter class is needed for
different adapted types, it is not necessary to create an adapter class "from
scratch" every time a ``Mallard`` is used as a ``Duck``.  Instead, the
implementation need only create a ``MallardAsDuck`` adapter class once, and
then cache it for repeated uses.  Adapter instances can also be quite small in
size, because in the general case they only need to contain a reference to the
object instance that they are adapting.  (Except for "extension" adapters,
which need storage for their added "state" attributes.  More on this later,
in the section on `Adapters That Extend`_, below.)

In order to be able to create these adapter classes, we need to be able to
determine the correspondence between the target ``Duck`` operations, and
operations for a ``Mallard``.  This is done by traversing the ``Duck``
operation namespace, and retrieving methods and attribute descriptors.  These
descriptors are then looked up in a registry keyed by descriptor (method or
property) and source type (``Mallard``).  The found operation is then placed
in the adapter class' namespace under the name given to it by the ``Duck``
type.

So, as we go through the ``Duck`` methods, we find a ``walk()`` method
descriptor, and we look into a registry for the key ``(Duck.walk,Mallard)``.
(Note that this is keyed by the actual ``Duck.walk`` method, not by the *name*
``"Duck.walk"``.  This means that an operation inherited unchanged by a
subclass of ``Duck`` can reuse operations declared "like" that operation.)

If we find the entry, ``duckwalk_by_waddling`` (the function object, not its
name), then we simply place that object in the adapter class' dictionary under
the name ``"walk"``, wrapped in a descriptor that substitutes the original
object as the method's ``self`` parameter.  Thus, when the function is invoked
via an adapter instance's ``walk()`` method, it will receive the adapted
``Mallard`` as its ``self``, and thus be able to call the ``waddle()``
operation.

However, operations declared in a class work somewhat differently.  If
we directly declared that ``waddle()`` is "like" ``Duck.walk`` in the body
of the ``Mallard`` class, then the ``@like`` decorator will register the method
name ``"waddle"`` as the operation in the registry.  So, we would then look up
that name on the source type in order to implement the operation on the
adapter.  For the ``Mallard`` class, this doesn't make any difference, but if
we were adapting a subclass of ``Mallard`` this would allow us to pick up the
subclass' implementation of ``waddle()`` instead.
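
A rough sketch of this name-based registration and MRO lookup follows (with an
explicit ``register_class()`` call standing in for what a real implementation
would do by hooking class creation, e.g. via a metaclass; all names are
hypothetical):

```python
_likeness = {}

def like(target_op):
    def decorator(func):
        func.__like__ = target_op   # remembered until the class is registered
        return func
    return decorator

def register_class(cls):
    """Record internal operations by *name*, so subclasses can override."""
    for name, attr in vars(cls).items():
        target = getattr(attr, '__like__', None)
        if target is not None:
            _likeness[(target, cls)] = name

def find_impl(target_op, src_type):
    """Walk the MRO: a subclass inherits the 'likeness' of a base class
    method, but the lookup by name picks up the subclass' own override."""
    for base in src_type.__mro__:
        name = _likeness.get((target_op, base))
        if name is not None:
            return getattr(src_type, name)
    return None

class Duck:
    def walk(self): raise NotImplementedError

class Mallard:
    @like(Duck.walk)
    def waddle(self):
        return "mallard waddle"

register_class(Mallard)

class TameMallard(Mallard):
    def waddle(self):               # override is picked up automatically
        return "tame waddle"
```

Looking up ``Duck.walk`` for ``TameMallard`` finds the entry registered for
``Mallard``, then fetches ``waddle`` by name, getting the subclass version.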

So, we have our ``walk()`` method; now let's add a ``quack()`` method.
But wait, we haven't declared one for ``Mallard``, so there's no entry for
``(Duck.quack,Mallard)`` in our registry.  So, we proceed through
the ``__mro__`` (method resolution order) of ``Mallard`` in order to see if
there is an operation corresponding to ``quack`` that ``Mallard`` inherited
from one of its base classes.  If no method is found, we simply do not put
anything in the adapter class for a ``"quack"`` method, which will cause
an ``AttributeError`` if somebody tries to call it.

Finally, if our attempt at creating an adapter winds up having *no* operations
specific to the ``Duck`` type, then a ``TypeError`` is raised.  Thus if we had
passed an instance of ``Pig`` to the ``walkTheDuck`` function, and ``Pig``
had no methods corresponding to any ``Duck`` methods, this would result in
a ``TypeError`` -- even if the ``Pig`` type has a method named ``walk()``! --
because we haven't said anywhere that a pig walks like a duck.

Of course, if all we wanted was for ``walkTheDuck`` to accept any object
with a method *named* ``walk()``, we could've left off the type declaration
in the first place!  The purpose of the type declaration is to say that we
*only* want objects that claim to walk like ducks, assuming that they walk
at all.

This approach is not perfect, of course.  If we passed in a ``LeglessDuck``
to ``walkTheDuck()``, it is not going to work, even though it will pass the
``Duck`` type check (because it can still ``quack()`` like a ``Duck``).
However, as with normal Python "duck typing", it suffices to run the program
to find that error.  The key here is that type declarations should facilitate
using *different* objects, perhaps provided by other authors following
different naming conventions or using different operation granularities.


Inheritance
-----------

By default, this system assumes that subclasses are "substitutable" for their
base classes.  That is, we assume that a method of a given name in a subclass
is "like" (i.e. is substitutable for) the correspondingly-named method in a
base class.  However, sometimes this is *not* the case; a subclass may have
stricter requirements on routine parameters.  For example, suppose we have
a ``Mallard`` subclass like this one::

     class SpeedyMallard(Mallard):
         def waddle(self, speed):
             # waddle at given speed

This class is *not* substitutable for Mallard, because it requires an extra
parameter for the ``waddle()`` method.  In this case, the system should *not*
consider ``SpeedyMallard.waddle`` to be "like" ``Mallard.waddle``, and it
therefore should not be usable as a ``Duck.walk`` operation.  In other words,
when inheriting an operation definition from a base class, the subclass'
operation signature must be checked against that of the base class, and
rejected if it is not compatible.  (Where "compatible" means that the subclass
method will accept as many arguments as the base class method will, and that
any extra arguments taken by the subclass method are optional ones.)
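
Such a compatibility check can be approximated with the standard ``inspect``
module.  This sketch considers only positional parameters, which suffices for
the ``SpeedyMallard`` example:

```python
import inspect

def substitutable(base_method, sub_method):
    """Rough check: the subclass method must accept every call the base
    method accepts, so any extra parameters it adds must have defaults."""
    def required(func):
        return [p for p in inspect.signature(func).parameters.values()
                if p.default is p.empty
                and p.kind in (p.POSITIONAL_ONLY, p.POSITIONAL_OR_KEYWORD)]
    return len(required(sub_method)) <= len(required(base_method))

class Mallard:
    def waddle(self): pass

class SpeedyMallard(Mallard):
    def waddle(self, speed): pass          # extra *required* argument

class PoliteMallard(Mallard):
    def waddle(self, speed=1): pass        # extra argument is optional
```

Under this check, ``SpeedyMallard.waddle`` is rejected as a stand-in for
``Mallard.waddle``, while ``PoliteMallard.waddle`` remains acceptable.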

Note that Python cannot tell, however, if a subclass changes the *meaning*
of an operation, without changing its name or signature.  Doing so is arguably
bad style, of course, but it could easily be supported anyway by using an
additional decorator, perhaps something like ``@unlike(Mallard.waddle)`` to
claim that no operation correspondences should remain, or perhaps
``@unlike(Duck.walk)`` to indicate that only that operation no longer applies.

In any case, when a substitutability error like this occurs, it should ideally
give the developer an error message that explains what is happening, perhaps
something like "waddle() signature changed in class Mallard, but replacement
operation for Duck.walk has not been defined."  This error can then be
silenced with an explicit ``@unlike`` decorator (or by a standalone ``unlike``
call if the class cannot be changed).


External Operations and Method Dependencies
-------------------------------------------

So far, we've been dealing only with simple examples of method renaming, so
let's now look at more complex integration needs.  For example, the Python
``dict`` type allows you to set one item at a time (using ``__setitem__``) or
to set multiple items using ``update()``.  If you have an object that you'd
like to pass to a routine accepting "dictionary-like" objects, what if your
object only has a ``__setitem__`` operation but the routine wants to use
``update()``?

As you may recall, we follow the source type's ``__mro__`` to look for an
operation possibly "inherited" from a base class.  This means that
it's possible to register an "external operation" under
``(dict.update,object)`` that implements a dictionary-like ``update()`` method
by repeatedly calling ``__setitem__``.  We can do so like this::

     @like(dict.update, for_type=object, needs=[dict.__setitem__])
     def do_update(self:dict, other:dict):
         for key,value in other.items():
             self[key] = value

Thus, if a given type doesn't have a more specific implementation of
``dict.update``, then types that implement a ``dict.__setitem__`` method can
automatically have this ``update()`` method added to their ``dict`` adapter
class.  While building the adapter class, we simply keep track of the needed
operations, and remove any operations with unmet or circular dependencies.

By the way, even though technically the ``needs`` argument to ``@like`` could
be omitted since the information is present in the method body, it's actually
helpful for documentation purposes to present the external operation's
requirements up-front.

However, if the programmer fails to accurately state the method's needs, the
result will either be an ``AttributeError`` at a deeper point in the code, or
a stack overflow exception caused by looping between mutually recursive
operations.  (E.g. if an external ``dict.__setitem__`` is defined in terms of
``dict.update``, and a particular adapted type supports neither operation
directly.)  Neither of these ways of revealing the error is particularly
problematic, and is easily fixed when discovered, so ``needs`` is still
intended more for the reader of the code than for the adaptation system.
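
As a runnable approximation (with a hypothetical ``Recorder`` class standing
in for the adapted type), the fallback ``update()`` and the check of its
single declared need can be expressed as:

```python
def do_update(self, other):
    """Generic update() implemented purely in terms of item assignment."""
    for key, value in other.items():
        self[key] = value

class Recorder:
    """Supports __setitem__, but defines no update() of its own."""
    def __init__(self):
        self.data = {}
    def __setitem__(self, key, value):
        self.data[key] = value

# an adapter builder would verify the declared needs before grafting
# do_update onto the adapter class as its update() method:
needs = ['__setitem__']
assert all(hasattr(Recorder, need) for need in needs)
```

Calling ``do_update`` on a ``Recorder`` then behaves like ``dict.update``,
even though the type only ever implemented single-item assignment.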

Incidentally, if we look again at one of our earlier examples, where we
externally declared a method correspondence from ``Mallard.waddle`` to
``Duck.walk``::

     @like(Duck.walk, for_type=Mallard)
     def walk_like_a_duck(self):
         self.waddle()

we can see that this is actually an external operation being declared; it's
just that we didn't give the (optional) full declarations::

     @like(Duck.walk, for_type=Mallard, needs=[Mallard.waddle])
     def walk_like_a_duck(self:Mallard):
         self.waddle()

When you register an external operation, the actual function object given is
registered, because the operation doesn't correspond to a method on the
adapted type.  In contrast, "internal operations" declared within the adapted
type cause the method *name* to be registered, so that subclasses can inherit
the "likeness" of the base class' methods.


Adapters That Extend
--------------------

One big difference between external operations and ones created within a
class, is that a class' internal operations can easily add extra attributes
if needed.  An external operation, however, is not in a good position to do
that.  It *could* just stick additional attributes onto the original
object, but this would be considered bad style at best, even if it used
mangled attribute names to avoid collisions with other external
operations' attributes.

So let's look at an example of how to handle adaptation that needs more
state information than is available in the adapted object.  Suppose, for
example, we have a new ``DuckDodgers`` class, representing a duck who is
also a test pilot.  He can therefore be used as a rocket-powered vehicle by
strapping on a ``JetPack``, which we can have happen automatically::

     @like(Rocket.launch, for_type=DuckDodgers, using=JetPack)
     def launch(jetpack, self):
         jetpack.activate()
         print "Up, up, and away!"

The type given as the ``using`` parameter must be instantiable without
arguments.  That is, ``JetPack()`` must create a valid instance.  When
a ``DuckDodgers`` instance is being used as a ``Rocket`` instance, and this
``launch`` method is invoked, it will attempt to create a ``JetPack``
instance for the ``DuckDodgers`` instance (if one has not already been
created and cached).

The same ``JetPack`` will be used for all external operations that request to
use a ``JetPack`` for that specific ``DuckDodgers`` instance.  (Which only
makes sense, because Dodgers can wear only one jet pack at a time, and adding
more jet packs will not allow him to fly to several places at once!)

It's also necessary to keep reusing the *same* ``JetPack`` instance for a
given ``DuckDodgers`` instance, even if it is adapted many times to different
rocketry-related interfaces.  Otherwise, we might create a new ``JetPack``
during flight, which would then be confused about how much fuel it had or
whether it was currently in flight!

This pattern of adaptation is referred to in the `Motivation`_ section as
"extension" or "extender" adaptation, because it allows you to dynamically
extend the capabilities of an existing class at runtime, as opposed to just
recasting its existing operations in a form that's compatible with another
type.  In this case, the ``JetPack`` is the extension, and our ``launch``
method defines part of the adapter.

Note, by the way, that ``JetPack`` is a completely independent class here.  It
does not have to know anything about ``DuckDodgers`` or its use as an adapter,
nor does ``DuckDodgers`` need to know about ``JetPack``.  In fact, neither
object should be given a reference to the other, or this will create a
circularity that may be difficult to garbage collect.  Python's adaptation
machinery will use a weak-key dictionary mapping from adapted objects to their
"extensions", so that our ``JetPack`` instance will hang around until the
associated ``DuckDodgers`` instance goes away.

Then, when external operations using ``JetPack`` are invoked, they simply
request a ``JetPack`` instance from this dictionary, for the given
``DuckDodgers`` instance, and then the operation is invoked with references
to both objects.
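The caching behavior described above can be sketched with a
``weakref.WeakKeyDictionary``, as the PEP suggests.  This is a minimal
illustration, not the proposed implementation; the ``get_extension`` helper
and its layout are assumptions for the example::

```python
import weakref

# Adapted object -> {using_type: extension instance}.  Weak keys mean an
# extension lives exactly as long as its adapted object.
_extensions = weakref.WeakKeyDictionary()

def get_extension(obj, using_type):
    """Return the (lazily created) extension of ``using_type`` for obj."""
    per_obj = _extensions.setdefault(obj, {})
    if using_type not in per_obj:
        # The ``using`` type must be instantiable without arguments.
        per_obj[using_type] = using_type()
    return per_obj[using_type]
```

All external operations requesting the same ``using`` type for the same
adapted instance therefore share one extension object, and neither class
holds a reference to the other.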

Of course, this mechanism is not available for adapting types whose instances
cannot be weak-referenced, such as strings and integers.  If you need to extend
such a type, you must fall back to using a volatile adapter, even if you would
prefer state that remains consistent across adaptations.  (See the
`Volatile Adaptation`_ section below.)


Using Multiple Extenders
------------------------

Each external operation can use a different ``using`` type to store its state.
For example, a ``DuckDodgers`` instance might be able to be used as a
``Soldier``, provided that he has a ``RayGun``::

     @like(Soldier.fight, for_type=DuckDodgers, using=RayGun)
     def fight(raygun, self, enemy:Martian):
         while enemy.isAlive():
             raygun.fireAt(enemy)

In the event that two operations covering a given ``for_type`` have
``using`` types with a common base class (other than ``object``), the
most-derived type is used for both operations.  This rule ensures that
extenders do not end up with more than one copy of the same state, divided
between a base type and a derived type.

Notice that our examples of ``using=JetPack`` and ``using=RayGun`` do not
interact, as long as ``RayGun`` and ``JetPack`` do not share a common base
class other than ``object``.  However, if we had defined one operation
``using=JetPack`` and another as ``using=HypersonicJetPack``, then both
operations would receive a ``HypersonicJetPack`` if ``HypersonicJetPack`` is
a subclass of ``JetPack``.  This ensures that we don't end up with two jet
packs, but instead use the best jetpack possible for the operations we're
going to perform.

However, if we *also* have an operation using a ``BrokenJetPack``, and that's
also a subclass of ``JetPack``, then we have a conflict, because there's no
way to reconcile a ``HypersonicJetPack`` with a ``BrokenJetPack``, without
first creating a ``BrokenHypersonicJetPack`` that derives from both, and using
it in at least one of the operations.

If it is not possible to determine a single "most-derived" type among a set of
operations for a given adapted type, then an error is raised, similar to that
raised when deriving a class from classes with incompatible metaclasses.
As with that kind of error, this error can be resolved just by adding another
``using`` type that inherits from the conflicting types.
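The "most-derived ``using`` type" rule reduces to a small check, shown here
as a hypothetical sketch (the function name and error message are
assumptions): among the conflicting ``using`` types, find the one that is a
subclass of all the others, or raise if none dominates::

```python
def most_derived(using_types):
    """Pick the single using-type deriving from all the others."""
    for candidate in using_types:
        if all(issubclass(candidate, t) for t in using_types):
            return candidate
    # No single type dominates: analogous to a metaclass conflict.
    raise TypeError("conflicting 'using' types: %r" % (using_types,))
```

So ``HypersonicJetPack`` is chosen over its base ``JetPack``, while two
sibling subclasses of ``JetPack`` raise an error until a common subclass of
both is supplied.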


Volatile Adaptation
-------------------

Volatile adapters are not the same thing as operational adapters or extenders.
Indeed, some strongly question whether they should be called "adapters" at all,
because to do so weakens the term.  For example, in the model-view-controller
pattern, does it make sense to call a view an "adapter"?  What about iterators?
Are they "adapters", too?  At some point, one is reduced to calling any object
an adapter, as long as it mainly performs operations on one other object.  This
seems like a questionable practice, and it's a much broader term than is used
in the context of the GoF "Adapter Pattern" [3]_.

Indeed, it could be argued that these other "adapters" are actually extensions
of the GoF "Abstract Factory" pattern [4]_.  An Abstract Factory is a way
of creating an object whose interface is known, but whose concrete type is not.
PEP 246 adaptation can basically be viewed as an all-purpose Abstract Factory
that takes a source object and a destination interface.  This is a valuable
tool for many purposes, but it is not really the same thing as adaptation.

Shortly after I began writing this section, Clark Evans posted a request for
feedback on changes to PEP 246 which suggests that PEP 246 will provide adequate
solutions of its own for defining volatile adapters, including options for
declaring an adapter volatile, and whether it is safe for use with type
declarations.  So, for now, this PEP will assume that volatile adapters will
fall strictly under the jurisdiction of PEP 246, leaving this PEP to deal
only with the previously-covered styles of adaptation that are by definition
safe for use with type declarations.  (Because they only cast an object in
a different role, rather than creating an independent object.)


Miscellaneous
-------------

XXX property get/set/del as three "operations"

XXX binary operators

XXX level-confusing operators: comparison, repr/str, equality/hashing

XXX other special methods


Backward Compatibility
======================

XXX explain Java cast and COM QueryInterface as proper subsets of adaptation


Reference Implementation
========================

TODO


Acknowledgments
===============

Many thanks to Alex Martelli, Clark Evans, and the many others who participated
in the Great Adaptation Debate of 2005.  Special thanks also go to folks like
Ian Bicking, Paramjit Oberoi, Steven Bethard, Carlos Ribeiro, Glyph Lefkowitz
and others whose brief comments in a single message sometimes provided more
insight than could be found in a megabyte or two of debate between myself
and Alex; this PEP would not have been possible without all of your input.
Last, but not least, Ka-Ping Yee is to be thanked for pushing the idea of
"partially abstract" interfaces, for which idea I have here attempted to
specify a practical implementation.

Oh, and finally, an extra special thanks to Guido for not banning me from
the Python-Dev list when Alex and I were posting megabytes of
adapter-related discussion each day.  ;)


References
==========

.. [1] Guido's Python-Dev posting on "PEP 246: lossless and stateless"
    (http://mail.python.org/pipermail/python-dev/2005-January/051053.html)

.. [2] Optional Static Typing -- Stop the Flames!
    (http://www.artima.com/weblogs/viewpost.jsp?thread=87182)

.. [3] XXX Adapter Pattern

.. [4] XXX Abstract Factory Pattern


Copyright
=========

This document has been placed in the public domain.



..
    Local Variables:
    mode: indented-text
    indent-tabs-mode: nil
    sentence-end-double-space: t
    fill-column: 70
    End:


