[Python-3000] Draft pre-PEP: function annotations
jcarlson at uci.edu
Fri Aug 11 22:46:42 CEST 2006
"Phillip J. Eby" <pje at telecommunity.com> wrote:
> At 09:04 AM 8/11/2006 -0700, Josiah Carlson wrote:
> >I think you misunderstood Talin. While it was a pain for him to work
> >his way through implementing all of the loading/etc. protocols, I
> >believe his point was that if we allow any and all arbitrary metadata to
> >be placed on arguments to and from functions, then invariably there will
> >be multiple methods of doing as much. That isn't a problem unto itself,
> >but when there ends up being multiple metadata formats, with multiple
> >interpretations of them, and a user decides that they want to combine
> >the functionality of two metadata formats, they may be stuck due to
> >incompatibilities, etc.
> I was giving him the benefit of the doubt by assuming he was bringing up a
> *new* objection that I hadn't already answered. This "incompatibility"
> argument has already been addressed; it is trivially solved by overloaded
> functions (e.g. pickle.dump(), str(), iter(), etc.).
In effect, you seem to be saying "when user X wants to add their own
metadata with its own interpretation, they need to overload the
previously existing metadata interpreter". However, as has already been
stated, there is neither a standard metadata interpreter nor a standard
method for chaining metadata, so how is user X supposed to overload the
previously existing metadata interpreter?
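To be clear about what I understand the pickle.dump() analogy to mean:
presumably something like the following sketch, where a single shared
overloaded function dispatches on the annotation's type and each camp
registers its own handler. All the names here are invented, and I'm
using functools.singledispatch as a stand-in for PEAK-style overloading;
the point of my objection is precisely that the pre-PEP specifies no
such shared function for anyone to register against.

```python
from functools import singledispatch

# A hypothetical shared "annotation interpreter" -- NOT part of the
# pre-PEP.  Each metadata format registers a handler keyed on the
# annotation's type, the way pickle.dump dispatches on object type.
@singledispatch
def interpret(annotation, name, value):
    pass                       # unknown metadata: ignore by default

@interpret.register
def _(annotation: type, name, value):
    # The "annotations are types" camp registers its check here.
    if not isinstance(value, annotation):
        raise TypeError(f"{name} must be {annotation.__name__}")

@interpret.register
def _(annotation: str, name, value):
    # The "annotations are documentation" camp registers here;
    # a doc string needs no runtime check.
    pass
```

If such a function were stipulated somewhere, both camps could coexist;
without it, each interpreter invents its own entry point.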
Since you brought up pickle.dump(), str(), iter(), etc., I'll point out
that str(), iter(), etc., call special methods on the object being
operated on (__str__, __iter__, etc.), and while pickle allows picklers
to be registered, it also has a special-method interface. Because all of
the metadata defined is (according to the pre-PEP) attached to a single
__signature__ attribute of the function, interpreting the metadata isn't
as easy as calling str(obj), as you claim.
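To illustrate the contrast (a rough sketch; I'm using today's
`__annotations__` dict as a stand-in for the pre-PEP's __signature__
attribute): str() dispatches through a special method, so the object
itself controls its interpretation, whereas annotation metadata is
inert data hanging off the function with no comparable hook.

```python
class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

    # str() finds this special method automatically: the *object*
    # controls how it is interpreted.
    def __str__(self):
        return f"Point({self.x}, {self.y})"


def move(p: Point, dx: int) -> Point:
    return Point(p.x + dx, p.y)


print(str(Point(1, 2)))      # -> Point(1, 2), dispatched via __str__
# The annotations, by contrast, are just a mapping of parameter names
# to arbitrary objects; nothing tells a consumer how to interpret them.
print(move.__annotations__)
```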
Let us say that I have two metadata interpreters. One believes that the
metadata is types and wants to verify types on function call. The other
believes that the metadata is documentation. Both were written without
regard to the other. Please describe to me (in code, preferably) how I
would be able to use both of them without a defined
metadata-interpretation chaining semantic.
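To make the question concrete, here is a sketch of the two interpreters
I have in mind (all names invented; `__annotations__` again stands in
for the pre-PEP's __signature__). Each decorator assumes it owns the
entire annotation, so a user who wants both kinds of metadata on one
parameter is stuck.

```python
import functools

def typechecked(func):
    """Interpreter 1: treats every annotation as a type to enforce."""
    params = func.__code__.co_varnames[:func.__code__.co_argcount]
    @functools.wraps(func)
    def wrapper(*args):
        for name, value in zip(params, args):
            ann = func.__annotations__.get(name)
            if ann is not None and not isinstance(value, ann):
                raise TypeError(f"{name} must be {ann}")
        return func(*args)
    return wrapper

def documented(func):
    """Interpreter 2: treats every annotation as documentation text."""
    lines = [f"{name} -- {ann}" for name, ann in func.__annotations__.items()]
    func.__doc__ = (func.__doc__ or "") + "\n".join(lines)
    return func

# Each works alone:
@typechecked
def double(x: int) -> int:            # fine for interpreter 1
    return 2 * x

@documented
def triple(x: "the value to triple"):  # fine for interpreter 2
    return 3 * x

# But stacking @typechecked on triple would blow up at call time:
# isinstance(value, "the value to triple") raises TypeError, because
# interpreter 1 assumes every annotation is a type.  There is no
# defined way for one parameter to carry both pieces of metadata.
```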
> >This method may or may not be good. But, if we don't define a standard
> >method for metadata to be combined from multiple protocols, etc., then
> >we could end up with incompatibilities.
> Not if you use overloaded functions to define the operations you're going
> to perform. You and Talin are proposing a problem here that is not only
> hypothetical, it's non-existent.
> Remember, PEAK already does this kind of openly-extensible metadata for
> attributes, using a single-dispatch overloaded function (analogous to
> pickle.dump). If you want to show that it's really possible to create
> "incompatible" annotations, try creating some for attributes in PEAK.
Could you at least provide a link to where it is documented how to
create metadata attributes in PEAK? My attempts to delve into the PEAK
documentation have thus far failed horribly.