Re: [Python-Dev] Re: PEP 318: Decorators last before colon
I have to say I favor the "last before colon" approach, but if it has to be before the def, then I think it should have a keyword, and if you don't want to introduce a new keyword, then it looks like "is" is the only reasonable candidate. And if you do have a keyword, you don't need the square brackets. So you have:

    is: classmethod
    def f(x): pass

    is: author("Guido"), signature(int, result=None)
    def g(x): pass
Despite some positive responses, my proposal (which was originally due to Jim Hugunin, who probably has more C# experience than all of us arguing together) is getting some tough opposition. I'd like to have a bake-off, where we view a serious amount of code using decorators to the hilt with each of three proposed syntaxes:

1) Last-before-colon:

    def foo(cls, arg1, arg2) [funcattrs(foo=42), deprecated, overrides, classmethod]:
        pass

2) Prefix list:

    [funcattrs(foo=42), deprecated, overrides, classmethod]
    def foo(cls, arg1, arg2):
        pass

2a) Prefix list with leading *:

    *[funcattrs(foo=42), deprecated, overrides, classmethod]
    def foo(cls, arg1, arg2):
        pass

3) Prefix suite (could use a different keyword than 'decorate'):

    decorate:
        funcattrs(foo=42)
        deprecated
        overrides
        classmethod
    def foo(cls, arg1, arg2):
        pass

None of the other positions between def and arglist are acceptable any more; a keyword after the argument list isn't acceptable; I don't think any syntax that puts the decorators inside the body (between the colon and the docstring) will work. So this is the field.

What I'm asking (especially of Phillip) is to collect a set of realistic method declarations using decorators; we can then collectively format these using any of the possible syntaxes, and see how they look. We can also review different ways of spreading multiple decorators across several lines, e.g.:

    [funcattrs(foo=42), deprecated,
     overrides, classmethod]
    def foo(cls, arg1, arg2):
        pass

--Guido van Rossum (home page: http://www.python.org/~guido/)
At 09:44 PM 3/31/04 -0800, Guido van Rossum wrote:
What I'm asking (especially of Phillip) is to collect a set of realistic method declarations using decorators; we can then collectively format these using any of the possible syntaxes, and see how they look.
I'd be happy to scrounge up some samples from existing code using 'property' and 'classmethod' as well as some of PEAK's decorators, and I definitely think that Jack Diedrich and Bob Ippolito's samples should be included as well. Important question, though: do we include code bodies, or just use 'pass' for the bodies? If we include the body, how much of the body? Should we include entire classes, especially if the class itself needs a decorator, and multiple methods have decorators? Next, does anybody have any actual use cases for attribute decoration today? We're probably not going to get a lot of that from current code samples. I can make up some examples that throw in every possible option that PEAK provides in order to get some "lots of decoration" samples, but they wouldn't be "real" uses in that case. But I guess that maybe Bob's examples might be wordy enough.
Phillip J. Eby wrote:
Next, does anybody have any actual use cases for attribute decoration today?
If you mean function attributes, Docutils uses them a lot. They're described in <http://docutils.sf.net/docutils/parsers/rst/directives/__init__.py> and used in all the modules in that directory, such as <docutils/docutils/parsers/rst/directives/images.py>. -- David Goodger
At 01:54 PM 4/1/04 -0500, David Goodger wrote:
Phillip J. Eby wrote:
Next, does anybody have any actual use cases for attribute decoration today?
If you mean function attributes, Docutils uses them a lot. They're described in <http://docutils.sf.net/docutils/parsers/rst/directives/__init__.py> and used in all the modules in that directory, such as <docutils/docutils/parsers/rst/directives/images.py>.
Thanks! So for example, this:

    def admonition(*args):
        return make_admonition(nodes.admonition, *args)

    admonition.arguments = (1, 0, 1)
    admonition.options = {'class': directives.class_option}
    admonition.content = 1

    def attention(*args):
        return make_admonition(nodes.attention, *args)

    attention.content = 1

might be rephrased as (say):

    as [rst_directive(
        arguments=(1, 0, 1),
        options={'class': directives.class_option},
        content=1
    )]
    def admonition(*args):
        return make_admonition(nodes.admonition, *args)

    as [rst_directive(content=1)]
    def attention(*args):
        return make_admonition(nodes.attention, *args)
Phillip J. Eby wrote:
Thanks! So for example, this:
    def admonition(*args):
        return make_admonition(nodes.admonition, *args)
    admonition.arguments = (1, 0, 1)
    admonition.options = {'class': directives.class_option}
    admonition.content = 1
    ...
might be rephrased as (say):
    as [rst_directive(
        arguments=(1, 0, 1),
        options={'class': directives.class_option},
        content=1
    )]
    def admonition(*args):
        return make_admonition(nodes.admonition, *args)
I suppose so, but a generic function attribute decorator would do just as well. IOW, it doesn't have to be "rst_directive", just "attributes" would do fine. -- David Goodger
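A generic function-attribute decorator of the kind David describes fits in a few lines. This is only a sketch of the idea (the name 'attributes' and its usage here are illustrative, not any library's actual API), written with today's call-it-yourself spelling since none of the proposed syntaxes exist yet:

```python
def attributes(**kw):
    """Return a decorator that copies the given keywords onto a function."""
    def decorate(func):
        for name, value in kw.items():
            setattr(func, name, value)
        return func
    return decorate

# Pre-decorator-syntax usage, as in today's code:
def admonition(*args):
    pass

admonition = attributes(arguments=(1, 0, 1), content=1)(admonition)
```

Under any of the proposed syntaxes, the explicit reassignment line would simply disappear.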
    as [rst_directive(
        arguments=(1, 0, 1),
        options={'class': directives.class_option},
        content=1
    )]
    def admonition(*args):
        return make_admonition(nodes.admonition, *args)
I suppose so, but a generic function attribute decorator would do just as well. IOW, it doesn't have to be "rst_directive", just "attributes" would do fine.
I like rst_directive(), because it's more specific -- it would catch mistakes earlier. E.g. if you misspelled argumnets, attributes() would have no clue about it; but rst_directive() should have specific keywords. --Guido van Rossum (home page: http://www.python.org/~guido/)
At 03:47 PM 4/1/04 -0800, Guido van Rossum wrote:
    as [rst_directive(
        arguments=(1, 0, 1),
        options={'class': directives.class_option},
        content=1
    )]
    def admonition(*args):
        return make_admonition(nodes.admonition, *args)
I suppose so, but a generic function attribute decorator would do just as well. IOW, it doesn't have to be "rst_directive", just "attributes" would do fine.
I like rst_directive(), because it's more specific -- it would catch mistakes earlier. E.g. if you misspelled argumnets, attributes() would have no clue about it; but rst_directive() should have specific keywords.
Yeah, that's why I wrote the example that way. Once you have an 'attributes()' decorator, it's easy to create task-specific versions of it, e.g.:

    def my_decorator(foo, bar, baz):
        return attributes(**locals())

And of course you can have defaults, and do validation of the attributes before the return statement.

And there's an additional reason to use task-specific decorators: if you later decide that function attributes aren't as useful, or you need to change what the attributes are named, or you decide to stick all the options into an object and use only one attribute, you need only change the task-specific decorator, not all the things that call it.
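To make Guido's misspelling point concrete, here is a minimal sketch of a task-specific decorator that validates its keywords at definition time (the names are invented for illustration; this is not Docutils' actual API):

```python
def rst_directive(**kw):
    """Task-specific attribute decorator: rejects unknown keywords
    at definition time. (Invented for illustration; not Docutils API.)"""
    allowed = {'arguments', 'options', 'content'}
    unknown = set(kw) - allowed
    if unknown:
        raise TypeError("unknown directive attribute(s): %s"
                        % ', '.join(sorted(unknown)))
    def decorate(func):
        func.__dict__.update(kw)
        return func
    return decorate

def attention(*args):
    pass

attention = rst_directive(content=1)(attention)
# rst_directive(argumnets=(1, 0, 1)) raises TypeError immediately,
# whereas a generic attributes() decorator would silently accept it.
```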
What I'm asking (especially of Phillip) is to collect a set of realistic method declarations using decorators; we can then collectively format these using any of the possible syntaxes, and see how they look.
I'd be happy to scrounge up some samples from existing code using 'property' and 'classmethod' as well as some of PEAK's decorators, and I definitely think that Jack Diedrich and Bob Ippolito's samples should be included as well.
Important question, though: do we include code bodies, or just use 'pass' for the bodies? If we include the body, how much of the body? Should we include entire classes, especially if the class itself needs a decorator, and multiple methods have decorators?
Why not provide the bodies, for added realism? (I still think class decorators are a separate case, and much weaker -- you can do this by having a single 'decoratable' metaclass and setting __decorators__ = [...] in the class body.)
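The 'decoratable' metaclass Guido alludes to can be sketched in a dozen lines. Note this uses the modern metaclass=... spelling for runnability; the 2004-era equivalent would be __metaclass__ = Decoratable in the class body, and the names 'Decoratable' and 'register' are invented for illustration:

```python
class Decoratable(type):
    """Metaclass that applies a class's __decorators__ list after creation."""
    def __new__(mcls, name, bases, ns):
        decorators = ns.pop('__decorators__', [])
        cls = super().__new__(mcls, name, bases, ns)
        for deco in reversed(decorators):   # first listed ends up outermost
            cls = deco(cls)
        return cls

registry = []

def register(cls):
    """Example class decorator: record the class in a registry."""
    registry.append(cls)
    return cls

class MyService(metaclass=Decoratable):
    __decorators__ = [register]
```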
Next, does anybody have any actual use cases for attribute decoration today? We're probably not going to get a lot of that from current code samples. I can make up some examples that throw in every possible option that PEAK provides in order to get some "lots of decoration" samples, but they wouldn't be "real" uses in that case. But I guess that maybe Bob's examples might be wordy enough.
I think that SPARK syntax and everything else that people have traditionally added to docstring markup that isn't strictly speaking documentation (even some extreme cases of doctest usage) ought to be considered as candidates for attribute-ification. --Guido van Rossum (home page: http://www.python.org/~guido/)
On 1 Apr 2004 at 11:08, Guido van Rossum wrote:
I think that SPARK syntax and everything else that people have traditionally added to docstring markup that isn't strictly speaking documentation (even some extreme cases of doctest usage) ought to be considered as candidates for attribute-ification.
Where do method attribute type signatures and DBC fit in? As a decorator, or in the docstring? I'm concerned that the funcattrs(a="xyz" .. ) sample tossed around here will be rather ugly for specifying DBC strings. DBC will also need class invariants, so a funcattrs work-alike at the class level will be needed.

Finally, I don't have a need to access DBC annotations at runtime once my module is distributed. I would not want to pay the memory cost overhead of loading DBC information or attribute type signatures at runtime. However another person at PyCon poo-poo'd my concern over bloating .pyc files and subsequent memory use.

As a compromise I suggested that "annotation" information could go into the .pyc, but be loaded "on demand" at runtime. For example, a traceback handler might be able to use DBC info to suggest the call level that may have caused the problem. To do "on demand loading", I suggested giving .pyc files a "resource section" that could hold this meta information. Normal imports would not load the resource section, but resource information could be loaded later using another mechanism.

I thought that putting meta data into a different file was more complicated, and that using -OO to "strip out" annotation was too heavy handed.

The standard library might benefit from the addition of method attribute type information. However I think it would be better to not load this information at runtime unless needed. The same could be said for docstrings. If docstrings were "packed" in a resource section of the .pyc file, they might be loaded "on-demand", thereby saving some memory overhead.

-- Brad Clements, bkc@murkworks.com (315)268-1000
http://www.murkworks.com (315)268-9812 Fax
http://www.wecanstopspam.org/ AOL-IM: BKClements
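The "load on demand" idea can be prototyped in pure Python without any .pyc format changes, by storing a loader instead of the data and paying the cost only on first access. A minimal sketch (all names invented; the loader stands in for reading a hypothetical "resource section"):

```python
class LazyMetadata:
    """Descriptor that defers loading annotation data until first
    access, then caches the loaded value on the class. (Illustrative
    only; merely prototypes the 'on demand' idea in pure Python.)"""
    def __init__(self, loader):
        self.loader = loader
    def __set_name__(self, owner, name):
        self.name = name
    def __get__(self, obj, objtype=None):
        value = self.loader()                 # e.g. read a .pyc "resource section"
        setattr(objtype, self.name, value)    # replace descriptor with real data
        return value

load_count = []

def load_contracts():
    load_count.append(1)          # track how often the expensive load runs
    return {'pre': 'x > 0', 'post': 'result >= 0'}

class Account:
    contracts = LazyMetadata(load_contracts)
```

Until something touches Account.contracts, nothing is loaded; afterwards the data sits in the class dict like any other attribute.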
On Apr 1, 2004, at 2:08 PM, Guido van Rossum wrote:
What I'm asking (especially of Phillip) is to collect a set of realistic method declarations using decorators; we can then collectively format these using any of the possible syntaxes, and see how they look.
I'd be happy to scrounge up some samples from existing code using 'property' and 'classmethod' as well as some of PEAK's decorators, and I definitely think that Jack Diedrich and Bob Ippolito's samples should be included as well.
Important question, though: do we include code bodies, or just use 'pass' for the bodies? If we include the body, how much of the body? Should we include entire classes, especially if the class itself needs a decorator, and multiple methods have decorators?
Why not provide the bodies, for added realism?
(I still think class decorators are a separate case, and much weaker -- you can do this by having a single 'decoratable' metaclass and setting __decorators__ = [...] in the class body.)
Here's something I wrote today.. it's a delegate for the exception handling mechanism so you can listen in on (any of) the exceptions that ObjC throws whether or not they are caught by something else.. the Python->PyObjC exceptions are logged as tracebacks and the ObjC exceptions are thrown over to atos so they turn into human-readable stack traces. As you can see, it's rather ugly with regular old Python syntax.

    class PyObjCDebuggingDelegate(NSObject):
        def exceptionHandler_shouldLogException_mask_(self, sender, exception, aMask):
            try:
                if isPythonException(exception):
                    if self.verbosity() & LOGSTACKTRACE:
                        nsLogObjCException(exception)
                    return nsLogPythonException(exception)
                elif self.verbosity() & LOGSTACKTRACE:
                    return nsLogObjCException(exception)
                else:
                    return False
            except:
                print >>sys.stderr, "*** Exception occurred during exception handler ***"
                traceback.print_exc(sys.stderr)
                return True
        exceptionHandler_shouldLogException_mask_ = objc.selector(
            exceptionHandler_shouldLogException_mask_, signature='c@:@@I')

        def exceptionHandler_shouldHandleException_mask_(self, sender, exception, aMask):
            return False
        exceptionHandler_shouldHandleException_mask_ = objc.selector(
            exceptionHandler_shouldHandleException_mask_, signature='c@:@@I')

The objc.selector signatures say that they return a char (a BOOL, actually), the following @: represents "self" and the selector (the "method name"), the next two @@ say that the sender and exception arguments are both ObjC objects, and the trailing I means that aMask is an unsigned int.

-bob
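The "name the method three times" repetition above is exactly what decorator syntax removes. Stripped of PyObjC specifics, the pattern can be sketched today with a plain tagging helper (this 'selector' is a stand-in for illustration only; PyObjC's real objc.selector returns a selector object rather than the function):

```python
def selector(signature):
    """Stand-in for objc.selector, for illustration only: tag a method
    with an Objective-C type signature instead of rewrapping it."""
    def decorate(func):
        func.signature = signature
        return func
    return decorate

class Delegate:
    def exceptionHandler_shouldHandleException_mask_(self, sender, exception, aMask):
        return False
    exceptionHandler_shouldHandleException_mask_ = selector('c@:@@I')(
        exceptionHandler_shouldHandleException_mask_)
```

With any of the proposed syntaxes, the trailing reassignment (and the third repetition of the long name) would vanish.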
At 11:08 AM 4/1/04 -0800, Guido van Rossum wrote:
What I'm asking (especially of Phillip) is to collect a set of realistic method declarations using decorators; we can then collectively format these using any of the possible syntaxes, and see how they look.
I'd be happy to scrounge up some samples from existing code using 'property' and 'classmethod' as well as some of PEAK's decorators, and I definitely think that Jack Diedrich and Bob Ippolito's samples should be included as well.
Important question, though: do we include code bodies, or just use 'pass' for the bodies? If we include the body, how much of the body? Should we include entire classes, especially if the class itself needs a decorator, and multiple methods have decorators?
Why not provide the bodies, for added realism?
Okay, here's one using two decorators, in today's syntax. It's excerpted from an I/O scheduling component in 'peak.events'. The component manages a set of read/write/error file handles, and allows pseudothreads waiting on those handles to resume when I/O is possible. The 'select()' operation is itself performed by a pseudothread called 'monitor'. Each instance of the component should have exactly one such pseudothread, which should begin running as soon as the component is "assembled" (attached to an application).

Two decorators control this. The first is 'events.taskFactory', which accepts a generator function and returns a function that returns a new pseudothread each time it's invoked. That is, it's roughly equivalent to:

    def taskFactory(func):
        return lambda *args, **kw: Task(func(*args, **kw))

except that there is some extra magic so that introspecting the returned function still shows the same argument signature. (Which is important for documentation tools like pydoc and epydoc.)

The second decorator is 'binding.Make', which takes a 1-argument callable and returns a descriptor that will invoke the callable only once: when the attribute is first accessed for a given instance. The result of the callable is cached in the object's instance dictionary, where it will be retrieved on any subsequent access.

So, applying the two decorators (i.e. [events.taskFactory, binding.Make]) to a 1-argument function results in an attribute that will be automatically initialized when first used. By applying an extra keyword argument to 'binding.Make' in the current implementation, we can tell the descriptor to automatically initialize itself when the component is assembled. (Note: this is not the same as __init__ time; a PEAK component is considered "assembled" when it is made the child of a component that knows its "root" component, and thus can be certain of its entire configuration environment.)
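The once-only, cached-in-the-instance-dict behaviour Phillip describes for 'binding.Make' can be sketched as a small non-data descriptor (a minimal illustration, not PEAK's real implementation, and omitting the uponAssembly machinery):

```python
class Make:
    """Minimal sketch of a once-only attribute descriptor: call the
    wrapped function on first access, cache the result in the instance
    dict, and never run it again for that instance."""
    def __init__(self, func):
        self.func = func
    def __set_name__(self, owner, name):
        self.name = name
    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        value = self.func(obj)
        obj.__dict__[self.name] = value   # instance dict now shadows descriptor
        return value

class Component:
    def __init__(self):
        self.builds = 0
    def connection(self):
        self.builds += 1                  # count how many times we build
        return {'socket': 'open'}
    connection = Make(connection)
```

Because Make defines no __set__, the cached value in the instance dict takes priority on every later lookup, so the descriptor is consulted exactly once per instance.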
So, I would probably render this example with these decorators:

    [events.taskFactory, binding.Make(uponAssembly=True)]

in order to specify that the function is a generator that should be run as a task (pseudothread), it should be run at most once, and it should exist as soon as the component knows its configuration environment. (Note: the 'uponAssembly' bit works today only with old-style 'foo=binding.Make(foo,...)' syntax, not decorator syntax.) Anyway, here's the example:

    def monitor(self) [events.taskFactory, binding.Make(uponAssembly=True)]:
        r, w, e = self.rwe
        count = self.count
        sleep = self.scheduler.sleep()
        time_available = self.scheduler.time_available
        select = self.select
        error = self._error

        while True:
            yield count; resume()   # wait until there are selectables
            yield sleep; resume()   # ensure we are in top-level loop

            delay = time_available()
            if delay is None:
                delay = self.checkInterval

            try:
                rwe = self.select(r.keys(), w.keys(), e.keys(), delay)
            except error, v:
                if v.args[0] == EINTR:
                    continue    # signal received during select, try again
                else:
                    raise

            for fired, events in zip(rwe, self.rwe):
                for stream in fired:
                    events[stream]().send(True)
(I still think class decorators are a separate case, and much weaker -- you can do this by having a single 'decoratable' metaclass and setting __decorators__ = [...] in the class body.)
Or you can use the "class advisors" mechanism I implemented for PyProtocols and Zope 3, which is clean and convenient in today's syntax. Its only pitfall is that you absolutely *must* specify __metaclass__ first if you are specifying one, or your class advisors won't work right. Actually, the other pitfall for anybody taking this approach is that a correct implementation of class advisors is *hard*, because it depends on a correct re-implementation of much of Python's metaclass validation logic. But, at least there is one working implementation available to steal from. :)
I think that SPARK syntax and everything else that people have traditionally added to docstring markup that isn't strictly speaking documentation (even some extreme cases of doctest usage) ought to be considered as candidates for attribute-ification.
David Goodger mentioned docutils, so I mocked up a couple of 'rst_directive' examples in a separate message.
Guido van Rossum wrote:
2a) Prefix list with leading *:
    *[funcattrs(foo=42), deprecated, overrides, classmethod]
    def foo(cls, arg1, arg2):
        pass
How about adding:

2b) Prefix list with repeated keyword:

    def [funcattrs(foo=42), deprecated, overrides, classmethod]
    def foo(cls, arg1, arg2):
        pass

    class [singleton]
    class foo:
        pass

3a) Prefix suite with repeated keyword:

    def:
        funcattrs(foo=42)
        deprecated, overrides, classmethod
    def foo(cls, arg1, arg2):
        pass

Cheers, Evan @ 4-am
How about adding:
2b) Prefix list with repeated keyword:
    def [funcattrs(foo=42), deprecated, overrides, classmethod]
    def foo(cls, arg1, arg2):
        pass
    class [singleton]
    class foo:
        pass
3a) Prefix suite with repeated keyword:
    def:
        funcattrs(foo=42)
        deprecated, overrides, classmethod
    def foo(cls, arg1, arg2):
        pass
You know, that one occurred to me in the shower, because it should parse easily. But I immediately rejected it as too weird. The stutter doesn't have any semantic connotation to it and is bound to confuse source-scanning tools. (The nice thing about plain prefix list syntax is that tools which look for 'def' but don't process other statements are just as oblivious to it as they are to current decorator syntax.) --Guido van Rossum (home page: http://www.python.org/~guido/)
Guido van Rossum wrote:
1) Last-before-colon:
    def foo(cls, arg1, arg2) [funcattrs(foo=42), deprecated, overrides, classmethod]:
        pass
2) Prefix list:
    [funcattrs(foo=42), deprecated, overrides, classmethod]
    def foo(cls, arg1, arg2):
        pass
I bet most of us who prefer #2 would be just as happy with #1. Am I right? Shane
"Peter Norvig" <pnorvig@google.com> wrote in message news:42B02151.420DCBB8@mail.google.com...
I have to say I favor the "last before colon" approach,
The way I see it today, we are trying to select syntax to define something like a 'function metatype' (FMT) in order to get non-standard function-like objects. The idea and result strike me as similar (but *not* identical) to running the pieces of a would-be class thru a metaclass. If FMTs were required to be defined and named before use, then [FMTidentifier] would be analogous to __metaclass__ = MCidentifier after a class statement. While a metaclass can be defined in-place, over multiple lines, this does not work nearly so well for an anonymous FMT.

(Has any consideration been given to an actual metafunc mechanism more directly analogous to metaclasses, that would be given the *pieces* of a would-be function (name, param names, default args, code body, etc), so that there would not necessarily ever be a standard function object?)
but if it has to be before the def, then I think it should have a keyword,
Yes. My support for 'as' was based on misremembering its quasi status.
and if you don't want to introduce a new keyword, then it looks like "is" is the only reasonable candidate. And if you do have a keyword, you don't need the square brackets.
Since the sequence is not going to be mutated, I would see them as a positive distraction.
So you have
    is: classmethod
    def f(x): pass
    is: author("Guido"), signature(int, result=None)
    def g(x): pass
For this usage, 'is' is semantically better than 'as' anyway. Terry J. Reedy
(Has any consideration been given to an actual metafunc mechanism more directly analogous to metaclasses, that would be given the *pieces* of a would-be function (name, param names, default args, code body, etc), so that there would not necessarily ever be a standard function object?)
Deconstructing a function like that is too invasive -- I don't want to touch the calling sequence, for example, because it's so performance critical. None of the people arguing for decorators has shown a use case for that either. However, if you really want to do that, you *can* take the function apart and construct a new one using the 'new' module. --Guido van Rossum (home page: http://www.python.org/~guido/)
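For the record, taking a function apart and reassembling it looks like this. In 2004 the constructor Guido mentions was spelled new.function(); it is the same object as types.FunctionType, which is used here so the sketch runs as-is:

```python
import types

def original(x, y=10):
    return x + y

# Reassemble a function from the old one's parts, renaming it on the way.
rebuilt = types.FunctionType(
    original.__code__,        # code object (func_code in 2.x terms)
    original.__globals__,     # globals the code executes against
    'renamed',                # new function name
    original.__defaults__,    # default argument values
    original.__closure__,     # closure cells, if any
)
```

Note this rebuilds a standard function object; it does not touch the calling sequence, which is Guido's objection to a full metafunc mechanism.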
>> (Has any consideration been given to an actual metafunc mechanism
>> more directly analogous to metaclasses, that would be given the
>> *pieces* of a would-be function (name, param names, default args,
>> code body, etc), so that there would not necessarily ever be a
>> standard function object?)

Guido> Deconstructing a function like that is too invasive -- I don't
Guido> want to touch the calling sequence, for example, because it's so
Guido> performance critical.  None of the people arguing for decorators
Guido> has shown a use case for that either.  However, if you really
Guido> want to do that, you *can* take the function apart and construct
Guido> a new one using the 'new' module.

One thing that occurred to me is that a function's func_name attribute might be made read-write so that decorator functions can easily make that aspect of a function wrapper behave the same as the original function. Calling new.function() shouldn't be required in such simple cases.

Skip
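The wrapper-impersonation Skip wants is a sketch away once the name is writable; copying the name, docstring, and attributes by hand is essentially what functools.wraps later standardized. A minimal illustration (the 'logged' decorator and its calls counter are invented for the example):

```python
def logged(func):
    """Wrap func, counting calls, and make the wrapper impersonate it."""
    def wrapper(*args, **kw):
        wrapper.calls += 1
        return func(*args, **kw)
    # Copy identity from the wrapped function onto the wrapper.
    wrapper.__name__ = func.__name__      # func_name, in 2.x terms
    wrapper.__doc__ = func.__doc__
    wrapper.__dict__.update(func.__dict__)
    wrapper.calls = 0
    return wrapper

def greet(name):
    """Return a greeting."""
    return 'hello ' + name

greet = logged(greet)
```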
participants (10)
- Bob Ippolito
- Brad Clements
- David Goodger
- Evan Simpson
- Guido van Rossum
- Peter Norvig
- Phillip J. Eby
- Shane Hathaway
- Skip Montanaro
- Terry Reedy