[Python-3000] Fwd: Conventions for annotation consumers

Nick Coghlan ncoghlan at gmail.com
Fri Aug 18 18:18:39 CEST 2006

Phillip J. Eby wrote:
> I'm frankly baffled by the amount of "protect users from incompatibility" 
> ranting that this issue has generated.  If I wanted to use Java, I'd know 
> where to find it.  Guido has said time and again that Python's balance 
> favors the individual developer at the expense of the group where 
> "consenting adults" is concerned, and Py3K isn't intended to change that 
> balance.

I actually thought Collin's approach in the PEP was reasonable (deferring the 
details of combining annotations until we had some more experience with how 
they could be made useful in practice). Some of the wording was a little 
strong (suggesting that the conventions would *never* be developed), but the 
idea was sound.

To try and put this in perspective:

1. I believe argument annotations have the most potential to be beneficial 
when used in conjunction with a single decorator chosen or written by the 
developer to support things like Foreign Function Interface type mapping 
(PyObjC, ctypes, XML-RPC, etc), or function overloading (RuleDispatch, etc).

2. If a developer wishes to use multiple annotations together, they can define 
their own annotation processing decorator that invokes the necessary 
operations using non-annotation based APIs provided by the appropriate 
framework, many of which already exist, and will continue to exist in Py3k due 
to the need to be able to process functions which have not been annotated at 
all (such as functions written in C).

3. The question has been raised as to whether there is a practical way for a 
developer to use annotations that make sense to a *static* analysis tool that 
doesn't actually execute the Python code.

If someone figures out a way to handle the last point *without* compromising 
the ease of use for annotations designed to handle point 1, all well and good. 
Otherwise, I'd call YAGNI. OK, annotations wouldn't be useful for tools like 
pychecker in that case. So be it - to be really useful for a tool like 
pychecker they'd have to be ubiquitous, and that's really not Python any more.
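As a sketch of the idea in point 2, a developer-written decorator can fan the 
per-parameter information out to each framework through its ordinary 
non-annotation API. Every name below (document_params, check_types, 
my_annotations) is invented for the example; no real framework API is implied:

```python
def document_params(**docs):
    """Stand-in for a framework's non-annotation doc API (invented name)."""
    def apply(func):
        func._param_docs = docs
        return func
    return apply

def check_types(**types):
    """Stand-in for a framework's non-annotation type API (invented name)."""
    def apply(func):
        func._param_types = types
        return func
    return apply

def my_annotations(**info):
    """A single decorator, written by the developer, that routes each
    (doc, type) pair to the appropriate framework API."""
    def apply(func):
        func = document_params(**{k: v[0] for k, v in info.items()})(func)
        func = check_types(**{k: v[1] for k, v in info.items()})(func)
        return func
    return apply

@my_annotations(a=("frobination count", int))
def frob(a):
    return a + 1
```

Because the frameworks expose normal APIs anyway (they have to, for C 
functions), the combining logic lives in one place chosen by the developer.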

All that said, I'm still not entirely convinced that function annotations are 
a good idea in the first place - I'm inclined to believe that signature 
objects providing a "bind" method that returns a dictionary mapping a call's 
arguments to the function's named parameters will prove far more 
useful. With this approach, the 'annotations' would continue to be supplied as 
arguments to decorator factories instead of as expressions directly in the 
function header. IOW, I've yet to see any use case that is significantly 
easier to write with function annotations instead of decorator arguments, and 
several cases where function annotations are significantly worse.
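For what it's worth, the "bind" behaviour can be sketched with the stdlib's 
inspect module (the bind_args helper name is invented for the example):

```python
import inspect

def bind_args(func, *args, **kwargs):
    """Map a call's arguments onto the function's named parameters."""
    bound = inspect.signature(func).bind(*args, **kwargs)
    bound.apply_defaults()  # fill in any parameters left at their defaults
    return dict(bound.arguments)

def polar(x, y, units="radians"):
    return (x, y, units)

# bind_args(polar, 1.0, 2.0) maps the call onto the named parameters,
# including the unsupplied default for 'units'.
```

A decorator built on such a mapping can look arguments up by name, which is 
exactly what the keyword-argument decorator factories below rely on.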

For one thing, function annotations are useless for decorating a function that 
was defined elsewhere, whereas it doesn't matter where the function came from 
when using decorator arguments. The latter also has a major benefit in 
unambiguously associating each annotation with the decorator that is the 
intended consumer.
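To make the "defined elsewhere" point concrete, here is a toy typechecker 
factory (purely an assumption about how such a decorator might work; nothing 
here defines a real one) wrapped around the C builtin len(), which could 
never carry annotations of its own:

```python
def typechecker(*types, _return=None):
    """Toy decorator factory: check positional argument types at call time."""
    def apply(func):
        def wrapper(*args):
            for value, expected in zip(args, types):
                if not isinstance(value, expected):
                    raise TypeError(f"{value!r} is not a {expected.__name__}")
            result = func(*args)
            if _return is not None and not isinstance(result, _return):
                raise TypeError(f"bad return value {result!r}")
            return result
        return wrapper
    return apply

# len() is written in C and has no annotations, but decorator arguments
# don't care where the function came from:
checked_len = typechecker(str, _return=int)(len)
```

The same call works for an imported pure-Python function, a lambda, or a 
builtin; annotations in the function header can only ever cover functions 
whose definitions you control.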

Consider an extreme example Josiah used elsewhere in this discussion:

 > @docstring
 > @typechecker
 > @constrain_values
 > def foo(a: [doc("frobination count"),
 >             type(Number),
 >             constrain_values(range(3,9))],
 >         b: [type(Number),
 >             # This can be only 4, 8 or 12
 >             constrain_values([4,8,12])]) -> type(Number):

Here's how it looks with decorator factories instead:

# Using keyword arguments
@docstring(a="frobination count")
@typechecker(a=Number, b=Number, _return=Number)
@constrain_values(a=range(3,9), b=[4,8,12])
def foo(a, b):
     # the code

# Using positional arguments
@docstring("frobination count")
@typechecker(Number, Number, _return=Number)
@constrain_values(range(3,9), [4,8,12])
def foo(a, b):
     # the code

All the disambiguation cruft is gone, the association between the decorators 
and the values they are processing is clear, the expressions are split 
naturally across the different decorator lines, and the basic signature is 
found easily by scanning for the last line before the indented section. The 
_return=Number is a bit ugly, but that could be handled by syntactic sugar 
that processed a "->expr" in a function call as equivalent to "return=expr" 
(i.e. adding the result of the expression to the keywords dictionary under the 
key "return").

Another advantage of the decorator-with-arguments approach is that you can 
call the decorator factory once, store the result in a variable, and then 
reuse it throughout your module. That's harder with annotations written 
directly in the function header, where you can only share individual 
annotation expressions, not combinations of them. For example:

import math

floats2_to_float2tuple = typechecker(float, float, _return=(float, float))

@floats2_to_float2tuple
def cartesian_to_polar(x, y):
     return math.sqrt(x*x + y*y), math.atan2(y, x)

@floats2_to_float2tuple
def polar_to_cartesian(r, theta):
     return r*math.cos(theta), r*math.sin(theta)


Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia