PEP 318: Decorators last before colon
Three different positions for decorators have been suggested:

    (a) def [decorator] func(arg, arg):
    (b) def func [decorator] (arg, arg):
    (c) def func(arg, arg) [decorator]:

There are several strong arguments to choose (c).

1. The decorator part is optional. Optional things usually come at the end and shouldn't interrupt non-optional things.

2. The decorator part can contain arbitrary expressions. This makes it impossible for most syntax colorizers to find the function name in (a). Finding the defining occurrences of names is the most important thing a code navigator can do.

3. In (b) the function name is separated from the argument list, making the definition look very different from a function call.

4. When you're reading the body of the function, you will refer to the arguments frequently and the decorators not at all. So the arguments should come first, in the usual position.

5. The decorators act on the function after the entire function definition has been evaluated. It makes sense to arrange the function as a visual unit so you can see what is being created and manipulated.

To see argument 5, compare these illustrations:

    (a)             ---------- then do this
                   |
        .-----.    v        .-----------------.
        | def | [decorator] | func(arg, arg): |
        |      '-----------'                  | <-- first do this
        |     print arg + arg                 |
        '-------------------------------------'

    (b)                  ----- then do this
                        |
        .----------.    v       .-------------.
        | def func | [decorator] | (arg, arg): |
        |           '-----------'              | <-- first do this
        |     print arg + arg                  |
        '--------------------------------------'

    (c)   first do this        then do this
               |                    |
               v                    |
        .---------------------.     v
        | def func(arg, arg)  | [decorator]:
        |   print arg + arg   |
        '---------------------'

I claim (c) is much easier to visually understand.

-- ?!ng
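Argument 5 rests on the fact that a decorator is applied to the finished function object only after the whole def block has been evaluated. A minimal sketch of that semantics, written as the manual rebinding the syntax is sugar for (the `tracing` decorator here is a hypothetical example, not one from the thread):

```python
# A decorator is just a callable applied to the finished function
# object; before any special syntax existed, it was spelled as an
# explicit rebinding after the def.
def tracing(func):
    # Hypothetical decorator: counts calls to the wrapped function.
    def wrapper(*args, **kwargs):
        wrapper.calls += 1
        return func(*args, **kwargs)
    wrapper.calls = 0
    return wrapper

def pair(a, b):
    return a + b

pair = tracing(pair)   # applied after the entire def has executed

print(pair(1, 2))      # → 3
print(pair.calls)      # → 1
```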
On Tue, 2004-03-30 at 06:17, Ka-Ping Yee wrote:
Three different positions for decorators have been suggested:
(a) def [decorator] func(arg, arg):
(b) def func [decorator] (arg, arg):
(c) def func(arg, arg) [decorator]:
Another possibility that has been suggested is

    [decorator]
    def func(arg, arg):

This has some of the same readability benefits as (c).

Jeremy
Another possibility that has been suggested is
    [decorator]
    def func(arg, arg):
This has some of the same readability benefits as (c).
Except that what you show is currently valid syntax. Making a currently valid syntax mean something new is, I believe, generally frowned upon. - Josiah
Except that what you show is currently valid syntax. Making a currently valid syntax mean something new is, I believe, generally frowned upon.
Not in this case. I've thought this through and don't think I see any practical issues with this syntax. --Guido van Rossum (home page: http://www.python.org/~guido/)
Another possibility that has been suggested is
    [decorator]
    def func(arg, arg):
And one that I currently favor. I'm out of bandwidth to participate on a msg-by-msg basis, but perhaps folks can see if they can come to terms with this solution? --Guido van Rossum (home page: http://www.python.org/~guido/)
At 01:21 PM 3/30/04 -0800, Guido van Rossum wrote:
Another possibility that has been suggested is
    [decorator]
    def func(arg, arg):
And one that I currently favor. I'm out of bandwidth to participate on a msg-by-msg basis, but perhaps folks can see if they can come to terms with this solution?
--Guido van Rossum (home page: http://www.python.org/~guido/)
I showed this to a Python programmer at my office. He immediately asked if this:

    if foo:
        [decorator1]
    else:
        [decorator2]
    def func(arg,arg):
        ...

was valid. He then further commented that it was strange to have a Python construct split across two lines, and inquired whether these variants:

    [decorator] def func(arg,arg):
        ...

    [decorator1,
     decorator2
    ]
    def func(arg,arg):

    [decorator] \
    def func(arg,arg):
        ...

would be considered legal as well.

I also showed him the implemented syntax of 'def func(args) [decorators]:', and he thought it seemed natural and Pythonic, and he suggested 'as' as another possible alternative. We also discussed the issue of evaluation order, and wondered if perhaps the decorator order in your latest syntax should be the reverse of the order used for decorators-at-the-end.

His last comment was that he thought it seemed very Perlish to begin a Python statement with special symbols that then modify the behavior of code on a subsequent line.

Some of his questions also got me to wondering how the grammar would even handle the initial proposal, since it seems it would have to be phrased as "lists can have an optional 'NEWLINE function-definition' clause after them".

Anyway, I think I could manage to live with this syntax, although I really don't like it either, for many of the same reasons. I don't think that putting decorators at the end of the definition impedes visibility unless there are a really large number of arguments or a large amount of decorator data. In both cases, you're going to need to be reading more carefully anyway. For the common cases, it's all going to be on one line anyway.

Oh... one more thing... the new proposal brings back up (to a limited extent) the "what to look up" question, in a different form.
More precisely, someone who has learned basic Python syntax may wonder why somebody is creating a list with 'classmethod' in it, but unless they already know about the syntax rule, they have no way to even *guess* that it has something to do with the function definition that follows. By contrast, the decorators-last syntax is visibly part of the 'def' statement and gives much more clue as to its purpose.
On Tue, Mar 30, 2004 at 01:21:18PM -0800, Guido van Rossum wrote:
Another possibility that has been suggested is
    [decorator]
    def func(arg, arg):
And one that I currently favor. I'm out of bandwidth to participate on a msg-by-msg basis, but perhaps folks can see if they can come to terms with this solution?
The most obvious issues I see with this are that:

 - grepping for "def func" won't show any sign that it's decorated, and

 - it looks like normal syntax, so it looks like expressions should work, e.g.

       [decorator1] + [decorator2]
       def func(arg, arg):

   or even:

       get_std_decorators()   # Returns a list
       def func(arg, arg):

I'm pretty sure this isn't the intention, though. I can imagine some posts to comp.lang.python asking how to do the above...

Neither of these are fatal flaws, but I think I still prefer:

    def func(args) [decorator]:

-Andrew.
The most obvious issues I see with this are that:

 - grepping for "def func" won't show any sign that it's decorated, and
Seems rather minor -- there's only so much you can cram on a single line.
- it looks like normal syntax, so it looks like expressions should work, e.g.
    [decorator1] + [decorator2]
    def func(arg, arg):
or even:
    get_std_decorators()   # Returns a list
    def func(arg, arg):
I'm pretty sure this isn't the intention, though. I can imagine some posts to comp.lang.python asking how to do the above...
The same syntactic restrictions apply to docstrings (you can't have an expression of type string in the docstring position, it *has* to be a literal). Nobody's confused by that one.
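Guido's docstring analogy can be checked directly: only a string literal in the first-statement position becomes __doc__, while a string-valued expression there is just an ordinary expression statement. A small sketch (function and variable names are illustrative):

```python
def documented():
    """A literal in the first position, so it becomes the docstring."""

prefix = "computed "

def not_documented():
    prefix + "docstring"   # a string-valued expression, not a literal
    return None

print(documented.__doc__ is not None)   # → True
print(not_documented.__doc__)           # → None
```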
Neither of these are fatal flaws, but I think I still prefer:
def func(args) [decorator]:
Which hides 'classmethod' behind the more voluminous stuff, and that's my main gripe. My main reasoning is as follows:

1) If we do decorators at all, decorators should be allowed to be arbitrary expressions.

2) Since we allow arbitrary expressions, decorators provide a much better way to set function attributes than the current way.

3) This will be attractive (better than putting special mark-up in docstrings), so there will be lots of voluminous decorators.

4) Then the "important" decorators like classmethod will be hidden at the end of the list of decorators.

The [...] prefix proposal addresses this by putting the end of the decorator list closest to the def keyword. This doesn't look so bad:

    [funcattr(spark="<spark syntax here>",
              deprecated=True,
              overrides=True),
     classmethod]
    def foo(cls, arg1, arg2):
        pass

--Guido van Rossum (home page: http://www.python.org/~guido/)
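The `funcattr` decorator in Guido's example is not a real library function; a minimal sketch of how such a decorator factory might set function attributes, applied by explicit rebinding as elsewhere in the thread:

```python
# Hypothetical decorator factory: copies its keyword arguments onto
# the decorated function as attributes and returns the function
# unchanged.
def funcattr(**attrs):
    def decorate(func):
        for name, value in attrs.items():
            setattr(func, name, value)
        return func
    return decorate

def foo(cls, arg1, arg2):
    pass

# Manual application, equivalent to listing funcattr(...) as a decorator:
foo = funcattr(deprecated=True, overrides=True)(foo)

print(foo.deprecated, foo.overrides)   # → True True
```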
At 02:58 PM 3/30/04 -0800, Guido van Rossum wrote:
3) This will be attractive (better than putting special mark-up in docstrings), so there will be lots of voluminous decorators.
4) Then the "important" decorators like classmethod will be hidden at the end of the list of decorators.
Hm. So if we reversed the order so that the outermost decorators (such as classmethod) come first in the list, would that sway you to relent in favor of decorators-after-arguments? I don't like the reversed order, but I think I'd be a lot more comfortable with explaining that relatively minor semantic oddity to other developers than I would be with trying to explain the major syntactic oddity (relative to the rest of the Python language) of decorators-before-def.
Hm. So if we reversed the order so that the outermost decorators (such as classmethod) come first in the list, would that sway you to relent in favor of decorators-after-arguments?
Not really, because they're still hidden behind the argument list.
I don't like the reversed order, but I think I'd be a lot more comfortable with explaining that relatively minor semantic oddity to other developers than I would be with trying to explain the major syntactic oddity (relative to the rest of the Python language) of decorators-before-def.
OTOH to C# programmers you won't have to explain a thing, because that's what C# already does. --Guido van Rossum (home page: http://www.python.org/~guido/)
At 03:43 PM 3/30/04 -0800, Guido van Rossum wrote:
Hm. So if we reversed the order so that the outermost decorators (such as classmethod) come first in the list, would that sway you to relent in favor of decorators-after-arguments?
Not really, because they're still hidden behind the argument list.
Because:

1) it won't be on the same line if there are lots of arguments
2) nobody will read past the argument list
3) other
4) all of the above
5) none of the above

(Not trying to change your opinion; I just think the answer to this should go in the PEP.)
I don't like the reversed order, but I think I'd be a lot more comfortable with explaining that relatively minor semantic oddity to other developers than I would be with trying to explain the major syntactic oddity (relative to the rest of the Python language) of decorators-before-def.
OTOH to C# programmers you won't have to explain a thing, because that's what C# already does.
Correct me if I'm wrong, but I don't believe C# attributes have anything like the same semantics as Python decorators; in fact I believe they may be more akin to Python function attributes! So, even to a C# programmer, I'll have to explain Python's semantics.

Indeed, the C# syntax has things like this:

    [ReturnValue: whatever(something)]

to specify what the attributes apply to, and they can be applied to parameters, the return value, the module as a whole, etc. But I don't want to get too far off-topic.

By the way, you didn't mention whether it's okay to put the decorators on the same logical line, e.g.:

    [classmethod] def foo(bar,baz):
        # body goes here

If the rationale here is that we're copying C#, I'd think that it should be permissible, even though it looks a bit ugly and tempts me to indent the body to align with the function name.
Hm. So if we reversed the order so that the outermost decorators (such as classmethod) come first in the list, would that sway you to relent in favor of decorators-after-arguments?
Not really, because they're still hidden behind the argument list.
Because:
1) it won't be on the same line if there are lots of arguments
2) nobody will read past the argument list
3) other
4) all of the above
5) none of the above
(Not trying to change your opinion; I just think the answer to this should go in the PEP.)
1&2, mostly.
I don't like the reversed order, but I think I'd be a lot more comfortable with explaining that relatively minor semantic oddity to other developers than I would be with trying to explain the major syntactic oddity (relative to the rest of the Python language) of decorators-before-def.
OTOH to C# programmers you won't have to explain a thing, because that's what C# already does.
Correct me if I'm wrong, but I don't believe C# attributes have anything like the same semantics as Python decorators; in fact I believe they may be more akin to Python function attributes! So, even to a C# programmer, I'll have to explain Python's semantics.
Yes of course, but the basic idea that this specifies attributes should be clear to them. You always have to explain Python's semantics -- even assignment is deeply different!
Indeed, the C# syntax has things like this:
[ReturnValue: whatever(something)]
to specify what the attributes apply to, and they can be applied to parameters, the return value, the module as a whole, etc. But I don't want to get too far off-topic.
By the way, you didn't mention whether it's okay to put the decorators on the same logical line, e.g.:
    [classmethod] def foo(bar,baz):
        # body goes here
If the rationale here is that we're copying C#, I'd think that it should be permissible, even though it looks a bit ugly and tempts me to indent the body to align with the function name.
This is much harder to do with the current parser. (My plan would be to tie the list expression and the function definition together in the code generating phase, just like doc strings.) --Guido van Rossum (home page: http://www.python.org/~guido/)
At 04:38 PM 3/30/04 -0800, Guido van Rossum wrote:
Hm. So if we reversed the order so that the outermost decorators (such as classmethod) come first in the list, would that sway you to relent in favor of decorators-after-arguments?
Not really, because they're still hidden behind the argument list.
Because:
1) it won't be on the same line if there are lots of arguments
2) nobody will read past the argument list
3) other
4) all of the above
5) none of the above
(Not trying to change your opinion; I just think the answer to this should go in the PEP.)
1&2, mostly.
There appears to be a strong correlation between people who have specific use cases for decorators, and the people who want the last-before-colon syntax. Whereas, people who have few use cases (or don't like decorators at all) appear to favor syntaxes that move decorators earlier. Whether that means the "earlier" syntaxes are better or worse, I don't know. <0.5 wink>

I'll assume your intent is to prevent decorators from biting the unwary -- specifically people who *don't* use decorators a lot and therefore are not looking for them. I will therefore focus now on the issues with the "previous line" syntax that may bite people, with an eye to how they might be fixed.
By the way, you didn't mention whether it's okay to put the decorators on the same logical line, e.g.:
    [classmethod] def foo(bar,baz):
        # body goes here
If the rationale here is that we're copying C#, I'd think that it should be permissible, even though it looks a bit ugly and tempts me to indent the body to align with the function name.
This is much harder to do with the current parser. (My plan would be to tie the list expression and the function definition together in the code generating phase, just like doc strings.)
Yeah, this is also going to now have to be a special case for documentation processing tools. Whereas making it part of the definition syntax, it's more directly available in the parse tree. It also seems to be working against the AST branch a bit, in that I would expect the decorator expressions to be part of the function definition node, rather than in an unrelated statement. And, it's also going to be interesting to document in the language reference, since the grammar there is going to continue to diverge from the "real" grammar used by the implementation.

Another issue... is this valid?

    [classmethod]

    def foo(bar,baz): pass

How about this?

    [classmethod]
    # Okay, now we're going to define something...
    def foo(bar,baz): pass

If they *are* valid, then you can have nasty effects at a distance. If they *aren't* valid, accidentally adding or removing whitespace or comments can silently change the meaning of the program, and *not* in a DWIMish way.

I personally would rather have the decorators required to be on the same logical line, and then use:

    [classmethod] \
    def foo(bar,baz): pass

for visual separation. The backslash visually alerts that this is *not* a mere bare list.

I'm not a parser guru by any stretch of the imagination, but wouldn't it be possible to simply create a statement-level construct that was something like:

    liststmt: '[' [listmaker] ']' ( funcdef | restofliststmt )

and put it where it matches sooner than the expression-based versions of the statement? It seems like the main complexity would be the possibility of having to duplicate a number of levels of containing rules for 'restofliststmt'. But maybe I'm completely off base here and there's no sensible way to define a correct 'restofliststmt'.
On Wed, Mar 31, 2004 at 12:32:35PM -0500, Phillip J. Eby wrote:
I personally would rather have the decorators required to be on the same logical line, and then use:
    [classmethod] \
    def foo(bar,baz): pass
for visual separation. The backslash visually alerts that this is *not* a mere bare list.
I agree with Phillip about the backslash. But I don't like this variant because it appears to operate by side-effect. If the list had a keyword after it, that wouldn't be as bad, but the only current keywords I can think of (in, for) don't fit well and would overload their meaning.

I don't really like any variants, but the original seems the least bad to me:

    def foo(bar, baz) [classmethod]:

Some of the concerns deal with the decorators getting lost after the arguments or being too far away from the function name. It seems to me that if formatted properly, this isn't as big of a deal:

    def foo(cls, lots, of, arguments, that, will,
            not, fit, on, a, single, line) \
            [classmethod, decorate(author='Someone',
                                   version='1.2.3',
                                   other='param')]:
        """The docstring goes here."""

I hope that's a pretty unrealistic case. I think all of the proposed variants are ugly with the definition above. But, this may be more reasonable:

    def foo(cls, lots, of, arguments, all, on, a line) \
            [classmethod, decorate(author='Someone',
                                   version='1.2.3', other='param')]:
        """The docstring goes here."""

Writing decorators this way is the least surprising to me. Although, I wish there was a better alternative.

Neal
Neal Norwitz wrote:
[...]

    def foo(cls, lots, of, arguments, that, will,
            not, fit, on, a, single, line) \
            [classmethod, decorate(author='Someone',
                                   version='1.2.3',
                                   other='param')]:
        """The docstring goes here."""
I hope that's a pretty unrealistic case. I think all of the proposed variants are ugly with the definition above. But, this may be more reasonable:
    def foo(cls, lots, of, arguments, all, on, a line) \
            [classmethod, decorate(author='Someone',
                                   version='1.2.3', other='param')]:
        """The docstring goes here."""
Writing decorators this way is the least surprising to me. Although, I wish there was a better alternative.
Why not make the def look like a function call, i.e.:

    def(classmethod,
        decorate(author='Someone', version='1.2.3', other='param')) \
    foo(cls, lots, of, arguments, all, on, a line):
        """The docstring goes here."""

Bye,
Walter Dörwald
There appears to be a strong correlation between people who have specific use cases for decorators, and the people who want the last-before-colon syntax. Whereas, people who have few use cases (or don't like decorators at all) appear to favor syntaxes that move decorators earlier. Whether that means the "earlier" syntaxes are better or worse, I don't know. <0.5 wink>
Maybe the practitioners are so eager to have something usable that they aren't swayed as much by esthetics.
I'll assume your intent is to prevent decorators from biting the unwary -- specifically people who *don't* use decorators a lot and therefore are not looking for them. I will therefore focus now on the issues with the "previous line" syntax that may bite people, with an eye to how they might be fixed.
They should still be very unlikely to accidentally create one.

With proper vertical whitespace, the fact that a decorator list (written properly) means something special should be obvious to even the most casual observer: there should be a blank line before the decorator list and none between it and the 'def' line. The human brain is a lot more flexible in picking up patterns than the Python parser; as shown many times in this discussion, most people have no clue about the actual syntax accepted by the Python parser, and simply copy (and generalize!) patterns they see in examples.

My goal is to help people grasp the gist of a particular construct without having to think too much about the exact syntax; within all sorts of other constraints of course, like being parsable with the simple and stupid LL(1) parser, and being easy to grep (in some cases).
By the way, you didn't mention whether it's okay to put the decorators on the same logical line, e.g.:
[classmethod] def foo(bar,baz): # body goes here
If the rationale here is that we're copying C#, I'd think that it should be permissible, even though it looks a bit ugly and tempts me to indent the body to align with the function name.
This is much harder to do with the current parser. (My plan would be to tie the list expression and the function definition together in the code generating phase, just like doc strings.)
Yeah, this is also going to now have to be a special case for documentation processing tools.
I hadn't thought of those, but the new situation can't be worse than before (decorations following the 'def'). Usually these tools either use some ad-hoc regex-based parsing, which shouldn't have any problems picking out at least the typical forms, or they (hopefully) use the AST produced by the compiler package -- it should be possible for that package to modify the parse tree so that decorators appear as part of the Function node.
Whereas making it part of the definition syntax, it's more directly available in the parse tree.
Definitely, but I don't see it as a showstopper. I've now implemented this for the traditional compile.c; I think the effort is fairly moderate. (My patch is smaller than Michael Hudson's.)
It also seems to be working against the AST branch a bit, in that I would expect the decorator expressions to be part of the function definition node, rather than in an unrelated statement. And, it's also going to be interesting to document in the language reference, since the grammar there is going to continue to diverge from the "real" grammar used by the implementation.
Hardly worse than the difference between theory and practice for assignment statements or keyword parameters.
Another issue... is this valid?
[classmethod]
def foo(bar,baz): pass
Yes.
How about this?
    [classmethod]
    # Okay, now we're going to define something...
    def foo(bar,baz): pass
Yes.
If they *are* valid, then you can have nasty effects at a distance.
Given that there really isn't much of a use case for putting a list display on a line by itself (without assigning it to something), I don't see this as a likely accident. By giving only sane examples we'll help people write readable code (I see no other way; you can't force people to write unobfuscated code :-).
If they *aren't* valid, accidentally adding or removing whitespace or comments can silently change the meaning of the program, and *not* in a DWIMish way.
I personally would rather have the decorators required to be on the same logical line, and then use:
    [classmethod] \
    def foo(bar,baz): pass
for visual separation. The backslash visually alerts that this is *not* a mere bare list.
Ugly, and the stupid LL(1) parser can't parse that -- it needs a unique initial symbol for each alternative at a particular point.
I'm not a parser guru by any stretch of the imagination, but wouldn't it be possible to simply create a statement-level construct that was something like:
liststmt: '[' [listmaker] ']' ( funcdef | restofliststmt )
and put it where it matches sooner than the expression-based versions of the statement?
Maybe in SPARK, but not in Python's LL(1) parser.
It seems like the main complexity would be the possibility of having to duplicate a number of levels of containing rules for 'restofliststmt'. But maybe I'm completely off base here and there's no sensible way to define a correct 'restofliststmt'.
You can assume that's the case. :) --Guido van Rossum (home page: http://www.python.org/~guido/)
At 12:49 PM 3/31/04 -0800, Guido van Rossum wrote:
There appears to be a strong correlation between people who have specific use cases for decorators, and the people who want the last-before-colon syntax. Whereas, people who have few use cases (or don't like decorators at all) appear to favor syntaxes that move decorators earlier. Whether that means the "earlier" syntaxes are better or worse, I don't know. <0.5 wink>
Maybe the practitioners are so eager to have something usable that they aren't swayed as much by esthetics.
I appreciate the aesthetics of the new syntax, it's just the little implementation nits that bother me. And it just seems so un-Python to have a special syntax that doesn't start with an introducing keyword.
I'll assume your intent is to prevent decorators from biting the unwary -- specifically people who *don't* use decorators a lot and therefore are not looking for them. I will therefore focus now on the issues with the "previous line" syntax that may bite people, with an eye to how they might be fixed.
They should still be very unlikely to accidentally create one.
With proper vertical whitespace, the fact that a decorator list (written properly) means something special should be obvious to even the most casual observer: there should be a blank line before the decorator list and none between it and the 'def' line.
The human brain is a lot more flexible in picking up patterns than the Python parser; as shown many times in this discussion, most people have no clue about the actual syntax accepted by the Python parser, and simply copy (and generalize!) patterns they see in examples.
My goal is to help people grasp the gist of a particular construct without having to think too much about the exact syntax; within all sorts of other constraints of course, like being parsable with the simple and stupid LL(1) parser, and being easy to grep (in some cases).
I previously thought these were related. That is, I thought that keeping it to an LL(1) grammar was intended to make it easier for humans to parse, because no backtracking is required. But, the new syntax *does* require backtracking, even on the same line. Granted that humans don't literally read a character stream the way a parser does, isn't this still the first piece of backtracking-required syntax in Python?
I hadn't thought of those, but the new situation can't be worse than before (decorations following the 'def'). Usually these tools either use some ad-hoc regex-based parsing, which shouldn't have any problems picking out at least the typical forms, or they (hopefully) use the AST produced by the compiler package -- it should be possible for that package to modify the parse tree so that decorators appear as part of the Function node.
Good point. Will this be true for the AST branch's AST model as well?
It seems like the main complexity would be the possibility of having to duplicate a number of levels of containing rules for 'restofliststmt'. But maybe I'm completely off base here and there's no sensible way to define a correct 'restofliststmt'.
You can assume that's the case. :)
Yeah, I just spent a few minutes taking a look at the actual parser implementation. :( For some reason I thought that the simple LL(1) grammar meant there was also a simple recursive-descent parser. I'd completely forgotten about 'pgen' et al.
On Wed, 2004-03-31 at 16:12, Phillip J. Eby wrote:
At 12:49 PM 3/31/04 -0800, Guido van Rossum wrote:
I hadn't thought of those, but the new situation can't be worse than before (decorations following the 'def'). Usually these tools either use some ad-hoc regex-based parsing, which shouldn't have any problems picking out at least the typical forms, or they (hopefully) use the AST produced by the compiler package -- it should be possible for that package to modify the parse tree so that decorators appear as part of the Function node.
Good point. Will this be true for the AST branch's AST model as well?
If decorators get added, that's my plan. Jeremy
>> Yeah, this is also going to now have to be a special case for
>> documentation processing tools.

Guido> I hadn't thought of those, but the new situation can't be worse
Guido> than before (decorations following the 'def').

Except for the fact that when colorizing

    [d1, d2]
    def func(a,b,c): pass

"[d1, d2]" should probably not be colored the way other lists are because semantically it's not just a list that gets discarded.

Skip
I think most of the worry about long, complicated decorator lists with baroque parameters is FUD. What are the realistic use cases for such things? I don't remember seeing any proposed. The ones I have seen proposed (e.g. PyObjC) use only one or two decorators at a time, and if there are any parameters, they're not lengthy.

Furthermore, if in some circumstance someone finds themselves writing decorator lists so long that they become unreadable, well, there are ways of dealing with that. For example,

    foo_decorators = decorator_list(
        # Big long-winded
        # list of
        # decorators
    )

    def foo(args) [foo_decorators]:
        ...

which is not much different from Guido's preceding-list version, with the advantage that it's explicit about the fact that you need to look above to get the full picture.

Please let's not allow unfounded speculation to get in the way of coming up with a nice syntax that works well in the majority of cases we expect to actually see.

Greg Ewing, Computer Science Dept,  +--------------------------------------+
University of Canterbury,           | A citizen of NewZealandCorp, a       |
Christchurch, New Zealand           | wholly-owned subsidiary of USA Inc.  |
greg@cosc.canterbury.ac.nz          +--------------------------------------+
On Fri, Apr 02, 2004 at 12:05:39PM +1200, Greg Ewing wrote:
I think most of the worry about long, complicated decorator lists with baroque parameters is FUD. What are the realistic use cases for such things? I don't remember seeing any proposed.
The ones I have seen proposed (e.g. PyObjC) use only one or two decorators at a time, and if there are any parameters, they're not lengthy.
Furthermore, if in some circumstance someone finds themselves writing decorator lists so long that they become unreadable, well, there are ways of dealing with that. For example,
    foo_decorators = decorator_list(
        # Big long-winded
        # list of
        # decorators
    )
    def foo(args) [foo_decorators]:
        ...
I was actually about to post the same in the bake-off thread, but with the super-obvious

    see_above = decorator_list(
        decorator1,
        ..
        decoratorN,
    )

    def foo(args) [see_above]:
        ...

You can't beat that for obviousness; even newbies know _something_ is going on and it happens just above.

    no_joy = decorator_list(
        decorator1,
        ..
        decoratorN,
    )

    [no_joy]
    def foo(args):
        # modified by no_joy

So the best case is a comment that will vary by style. Do newbies read comments? dunno.

-jackdied
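The `decorator_list` helper in these examples is hypothetical; one way it could be written is as a factory that composes its arguments into a single decorator, applying them right to left so that the first listed decorator ends up outermost (an assumption about the intended order):

```python
def decorator_list(*decorators):
    # Compose several decorators into one; the last listed decorator
    # is applied first, so the first listed one ends up outermost.
    def apply_all(func):
        for dec in reversed(decorators):
            func = dec(func)
        return func
    return apply_all

# Two toy decorators that record the order in which they were applied:
def tag_a(func):
    func.tags = getattr(func, "tags", ()) + ("a",)
    return func

def tag_b(func):
    func.tags = getattr(func, "tags", ()) + ("b",)
    return func

see_above = decorator_list(tag_a, tag_b)

def foo(x):
    return x

foo = see_above(foo)   # manual equivalent of decorating foo with see_above

print(foo.tags)        # → ('b', 'a')
```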
Another thought about:

    [decorator]
    def foo(args):
        ...

What are newcomers going to make of this? It looks like a thrown-away list followed by a def, and there's no clue that there's any connection between the two. And even if the newcomer suspects that there is some kind of connection, what are they going to look up in the manual?

At least with

    def foo(args) [decorator]:
        ...

it's clear that (1) something different is going on, and (2) it's somehow a part of the 'def' statement.

Everywhere else in Python where you can say

    something
    someotherthing

it means "do something and then do someotherthing", and the combined result can be deduced from knowing what something and someotherthing each do on their own. This breaks that.
On Thu, 2004-04-01 at 22:34, Greg Ewing wrote:
Another thought about:
[decorator] def foo(args): ...
What are newcomers going to make of this?
It looks like a thrown-away list followed by a def, and there's no clue that there's any connection between the two.
I think it's obvious that there's a connection between the two. A bare list by itself would be nonsense and the actual list would probably say something like [classmethod] which suggests it's saying something about a method.
And even if the newcomer suspects that there is some kind of connection, what are they going to look up in the manual?
Maybe they'll look up functions in the reference manual? I assume they'd be just as puzzled by:

    def f(x, y, *args, **kwargs):

Who came up with that bizarre syntax? <wink>

Jeremy
On Thu, 2004-04-01 at 23:28, Jeremy Hylton wrote:
On Thu, 2004-04-01 at 22:34, Greg Ewing wrote:
Another thought about:
[decorator] def foo(args): ...
What are newcomers going to make of this?
It looks like a thrown-away list followed by a def, and there's no clue that there's any connection between the two.
I think it's obvious that there's a connection between the two. A bare list by itself would be nonsense and the actual list would probably say something like [classmethod] which suggests it's saying something about a method.
It may be nonsense, but it means something today. So it can't be obvious that they're connected because today, they aren't. If tomorrow this same code means something different, users looking at the code will have to know what version of Python they're using, and make sure it's the right one ("uh, how do I do that?"). If they were to use decorator-before-def code in an older version of Python, the program would be accepted but silently do the wrong thing. At least with decorator-before-colon trying to run the code in older interpreters will barf loudly. -Barry
Barry Warsaw
If tomorrow this same code means something different, users looking at the code will have to know what version of Python they're using, and make sure it's the right one ("uh, how do I do that?"). If they were to use decorator-before-def code in an older version of Python, the program would be accepted but silently do the wrong thing.
At least with decorator-before-colon trying to run the code in older interpreters will barf loudly.
I think this is a good point that hadn't occurred to me: silent failures suck. Cheers, mwh -- Like most people, I don't always agree with the BDFL (especially when he wants to change things I've just written about in very large books), ... -- Mark Lutz, http://python.oreilly.com/news/python_0501.html
Barry Warsaw
At least with decorator-before-colon trying to run the code in older interpreters will barf loudly.
On Friday 02 April 2004 11:34 am, Michael Hudson wrote:
I think this is a good point that hadn't occurred to me: silent failures suck.
There's also the little matter that using syntax only, without new keywords, allows us to avoid __future__ imports and all the implementation cruft that entails. Decorator-before-colon wins out big time for me. -Fred -- Fred L. Drake, Jr. <fdrake at acm.org> PythonLabs at Zope Corporation
"Fred L. Drake, Jr."
Barry Warsaw
writes: At least with decorator-before-colon trying to run the code in older interpreters will barf loudly.
On Friday 02 April 2004 11:34 am, Michael Hudson wrote:
I think this is a good point that hadn't occurred to me: silent failures suck.
There's also the little matter that using syntax only, without new keywords, allows us to avoid __future__ imports and all the implementation cruft that entails.
Decorator-before-colon wins out big time for me.
+1 -- Dave Abrahams Boost Consulting www.boost-consulting.com
Michael Hudson
Barry Warsaw
writes: If tomorrow this same code means something different, users looking at the code will have to know what version of Python they're using, and make sure it's the right one ("uh, how do I do that?"). If they were to use decorator-before-def code in an older version of Python, the program would be accepted but silently do the wrong thing.
At least with decorator-before-colon trying to run the code in older interpreters will barf loudly.
I think this is a good point that hadn't occurred to me: silent failures suck.
Wouldn't the decorator-before-def require a 'from __future__ import decorators'? (Although I'm still in favor of the decorator-before-colon version.)

Thomas
On Fri, 2004-04-02 at 08:24, Barry Warsaw wrote:
It may be nonsense, but it means something today. So it can't be obvious that they're connected because today, they aren't.
Your original complaint was: "What are newcomers going to make of this?" Newcomers aren't going to be worried about the small change in semantics that decorator-before would entail, because they won't know the old semantics. It's only a problem for people who already know the language, but one would hope they read the "What's New in Python 2.4" document.

I expect a newcomer to read the code in the most natural way. Only a language lawyer would worry about whether the expression "[classmethod]" was evaluated and thrown away rather than referring to fromFoo. Language lawyers should be clever enough to figure out the new rules :-).

    class Quux:
        def __init__(self):
            ...

        [classmethod]
        def fromFoo(cls, arg):
            """Create a Quux instance from a Foo."""
            ...
If tomorrow this same code means something different, users looking at the code will have to know what version of Python they're using, and make sure it's the right one ("uh, how do I do that?"). If they were to use decorator-before-def code in an older version of Python, the program would be accepted but silently do the wrong thing.
I agree there's a risk here, but we've faced that kind of risk before. We used future statements for nested scopes, but only one version. If you're looking at code with free variables, you need to know whether it was written to run with or without nested scopes. Code written for one version will fail at runtime on the other. (It may or may not fail silently.) Jeremy
On Fri, 2004-04-02 at 12:05, Jeremy Hylton wrote:
On Fri, 2004-04-02 at 08:24, Barry Warsaw wrote:
It may be nonsense, but it means something today. So it can't be obvious that they're connected because today, they aren't.
Your original complaint was: "What are newcomers going to make of this?" Newcomers aren't going to be worried about the small change in semantics that decorator-before would entail, because they won't know the old semantics.
They already know the old semantics. If they were to encounter that syntax today, they'd know that Python throws away the results. If /I/ saw that construct in today's Python code, I'd start looking for side effects.
If tomorrow this same code means something different, users looking at the code will have to know what version of Python they're using, and make sure it's the right one ("uh, how do I do that?"). If they were to use decorator-before-def code in an older version of Python, the program would be accepted but silently do the wrong thing.
I agree there's a risk here, but we've faced that kind of risk before. We used future statements for nested scopes, but only one version. If you're looking at code with free variables, you need to know whether it was written to run with or without nested scopes. Code written for one version will fail at runtime on the other. (It may or may not fail silently.)
Adding a future statement would make decorator-before-def slightly more acceptable. Adding a keyword to introduce decorator-before-def would make it slightly more acceptable. All-in-all, I still think decorator-before-colon is plenty readable and the more obvious of the two choices. -Barry
While I ponder the decorator syntax, let's propose some built-in decorators. We've already got classmethod and staticmethod. I propose this one to set function attributes:

    class func_attrs(objects):

        def __init__(self, **kwds):
            self.attrs = kwds

        def __call__(self, funcobj):
            funcobj.__dict__.update(self.attrs)

Why a class? So you can subclass it like this:

    class rst_attrs(func_attrs):

        def __init__(self, arguments, options, content):
            func_attrs.__init__(self, arguments=arguments, options=options,
                                content=content)

Why would you want to do that? So that mistakes in the attribute names would be caught early, and perhaps default values could be provided.

We could also add a standard implementation of synchronized. Or perhaps that should be imported from threading. (But is that really a good thing to copy from Java?)

Other possibilities (all these are pretty much thinking aloud and very much up for votes; and not original, I've seen these proposed before!):

deprecated(comment_string) -- mostly for documentation, but would set the deprecated attribute. The comment_string argument is a string explaining why it is deprecated. Maybe this should also have arguments specifying the first Python release in which it was considered deprecated and the last Python release (if known) where it will be available.

overrides -- indicates that this overrides a base class method. Maybe the default metaclass could check that if this is used there actually is a corresponding base class method, and we might have a "strict" metaclass that checks this is set for all overriding methods.

doctest_script(multi_line_string) -- specifies a doctest script, for use by the doctest module (if this is absent, it will continue to look in the doc string). I like to separate the doctest script from the actual documentation string because, when used for rigorous unit testing (as in Jim&Tim's talk at PyCON), the doctest script is too long to double as reasonable documentation.
Maybe this should be imported from the doctest module.

decorator_chain(*args) -- takes any number of decorator arguments and applies them sequentially. Implementation could be:

    def decorator_chain(*args):
        def decorate(func):
            for arg in args:
                func = arg(func)
            return func
        return decorate

(I don't see a reason to do this with a class.)

I'm not a fan of using function attributes for specifying author, version, copyright etc.; those things usually work on a larger granularity than methods anyway, and belong in the module doc string.

I'm still torn whether to promote defining properties this way:

    [propget]
    def x(self):
        "Doc string for x"
        return self.__x

    [propset]
    def x(self, newx):
        self.__x = newx

    [propdel]
    def x(self):
        del self.__x

but if people like this (whatever the decorator syntax :) we might as well make this the recommended way to define properties.

Should there be a separate module from which all those decorators are imported, or should we make them built-ins, following the trend set by classmethod etc.? Not counting the ones that I already mark as to be imported from a special place, like synchronized (from threading.py) and doctest_script (from doctest.py).

Proposals for other standard decorators are welcome!

--Guido van Rossum (home page: http://www.python.org/~guido/)
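None of the proposed decorators were spelled out beyond decorator_chain; for concreteness, here is one way the proposed deprecated(comment_string) might be sketched. The warning behavior and the attribute name `deprecated` are my assumptions, not part of the proposal:

```python
import warnings

def deprecated(comment):
    """Sketch of the proposed deprecated(comment_string) decorator:
    records the reason as a function attribute and warns on each call."""
    def decorate(func):
        def wrapper(*args, **kwds):
            warnings.warn("%s is deprecated: %s" % (func.__name__, comment),
                          DeprecationWarning, stacklevel=2)
            return func(*args, **kwds)
        wrapper.__name__ = func.__name__
        wrapper.deprecated = comment   # the documentation attribute
        return wrapper
    return decorate

# Applied as a plain call, since decorator syntax was still undecided:
def old_api(x):
    return x + 1
old_api = deprecated("use new_api instead")(old_api)
```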
Guido van Rossum wrote:
While I ponder the decorator syntax, let's propose some built-in decorators.
We've already got classmethod and staticmethod.
I propose this one to set function attributes:
class func_attrs(objects):
def __init__(self, **kwds): self.attrs = kwds
def __call__(self, funcobj): funcobj.__dict__.update(self.attrs)
Did you leave out the 'return funcobj' from the end of __call__? I thought that decorators were supposed to be inherently cooperative, and should return their modified funcobj, or a new func-like-obj. Or maybe I haven't been paying close enough attention... -Kevin
class func_attrs(objects):
def __init__(self, **kwds): self.attrs = kwds
def __call__(self, funcobj): funcobj.__dict__.update(self.attrs)
Did you leave out the 'return funcobj' from the end of __call__? I thought that decorators were supposed to be inherently cooperative, and should return their modified funcobj, or a new func-like-obj.
Sorry, you're right. (I've been thinking of interpreting a None result as "keep the input object" but the code generation would be too messy.)

--Guido van Rossum (home page: http://www.python.org/~guido/)
Guido van Rossum wrote:
class func_attrs(objects):
def __init__(self, **kwds): self.attrs = kwds
def __call__(self, funcobj): funcobj.__dict__.update(self.attrs)
Did you leave out the 'return funcobj' from the end of __call__? I thought that decorators were supposed to be inherently cooperative, and should return their modified funcobj, or a new func-like-obj.
Sorry, you're right. (I've been thinking of interpreting a None result as "keep the input object" but the code generation would be too messy.)
Cool. And now that I have my pedantic hat on, I may as well go all out. First, why call it func_attrs, when staticmethod and classmethod are underscoreless? Second, I know it is effectively the same, but shouldn't the .update line use vars(funcobj) instead of funcobj.__dict__? This is something that I am asked (often!) by my Python students. I use vars(obj) since it looks less magical. -Kevin
Cool. And now that I have my pedantic hat on, I may as well go all out. First, why call it func_attrs, when staticmethod and classmethod are underscoreless?
Just for variety. :-) We had a discussion about naming conventions recently, and I believe the outcome was that most folks like underscores in their names. OTOH if I was writing this just for *me*, I would indeed make them underscoreless.
Second, I know it is effectively the same, but shouldn't the .update line use vars(funcobj) instead of funcobj.__dict__? This is something that I am asked (often!) by my Python students. I use vars(obj) since it looks less magical.
No, vars() might return a copy. __dict__ is the real thing (unless it isn't -- try updating the __dict__ of a new-style class :-).

But perhaps the real implementation should use setattr() anyway -- some function attributes are not stored in the __dict__ (like __doc__) and these should still be settable this way (if they are settable at all). And if they are *not* settable, this should raise an error rather than silently creating an inaccessible entry in __dict__. So let's rephrase that class as:

    class funcattrs(object):

        def __init__(self, **kwds):
            self.attrs = kwds

        def __call__(self, func):
            for name, value in self.attrs.iteritems():
                setattr(func, name, value)
            return func

--Guido van Rossum (home page: http://www.python.org/~guido/)
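Guido's class as written uses the Python 2 iteritems(); in modern spelling, with a quick usage check (the function and attribute names here are my own examples, not from the thread):

```python
class funcattrs(object):
    """Guido's setattr-based decorator, modern spelling."""
    def __init__(self, **kwds):
        self.attrs = kwds

    def __call__(self, func):
        # setattr rather than __dict__.update, so unsettable
        # attributes raise instead of vanishing into __dict__.
        for name, value in self.attrs.items():
            setattr(func, name, value)
        return func

def parse(line):
    return line.split()

# Pre-decorator-syntax usage: call the decorator object explicitly.
parse = funcattrs(version="1.0", deprecated=False)(parse)
print(parse.version)   # "1.0"
```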
On Fri, 2004-04-02 at 15:32, Guido van Rossum wrote:
We had a discussion about naming conventions recently, and I believe the outcome was that most folks like underscores in their names. OTOH if I was writing this just for *me*, I would indeed make them underscoreless.
Actually, for multi-word method and function names, we preferred underscores to camelCasing. But I still wish has_key() were haskey() and since neither "func" nor "attr" are actual words, funcattrs() is just fine. :) I'm torn on whether new decorators should go in as built-ins or not. It's darn convenient to stick them there, but I worry that the documentation for the builtin module is getting pretty hefty now (and no, I really don't want to do something like an os module split -- man, I hate those pages ;). -Barry
At 03:15 PM 4/2/04 -0500, Kevin Jacobs wrote:
Guido van Rossum wrote:
class func_attrs(objects):
def __init__(self, **kwds): self.attrs = kwds
def __call__(self, funcobj): funcobj.__dict__.update(self.attrs)
Did you leave out the 'return funcobj' from the end of __call__? I thought that decorators were supposed to be inherently cooperative, and should return their modified funcobj, or a new func-like-obj.
Sorry, you're right. (I've been thinking of interpreting a None result as "keep the input object" but the code generation would be too messy.
Cool. And now that I have my pedantic hat on, I may as well go all out. First, why call it func_attrs, when staticmethod and classmethod are underscoreless? Second, I know it is effectively the same, but shouldn't the .update line use vars(funcobj) instead of funcobj.__dict__? This is something that I am asked (often!) by my Python students. I use vars(obj) since it looks less magical.
For that matter, updating the dictionary is nice as a quick trick, but what if the object doesn't *have* a dictionary? Using setattr would be safer, if this is intended to be subclassed. Suppose, for example, that 'synchronized' actually returns a callable wrapper written in C, rather than a function object, and that wrapper then delegates setattr to the thing it wraps? So, I think using setattr would be safer, and __dict__.update() is premature optimization, given that function object creation and decorator invocation isn't likely to be a performance bottleneck.
[Guido]
... doctest_script(multi_line_string) -- specifies a doctest script, for use by the doctest module (if this is absent, it will continue to look in the doc string). I like to separate the doctest script from the actual documentation string because, when used for rigorous unit testing (as in Jim&Tim's talk at PyCON), the doctest script is too long to double as reasonable documentation. Maybe this should be imported from the doctest module.
Let me try to squash this one before it gains traction from people who don't use doctest anyway <wink>. Burying a large triple-quoted string inside a function call inside an indented "list" isn't going to help the readability of anything. Zope use is moving toward putting large doctests into their own .py file, or, more recently, into their own .txt file. An example of the former: http://cvs.zope.org/ZODB/src/ZODB/tests/testmvcc.py The idea that "too long" makes for unreasonable documentation doesn't make sense to me: these long doctests read like tutorials liberally sprinkled with Python examples, and are the best docs some of these subsystems will ever have. Indeed, the Python Tutorial is a prime example of something that "should be" a doctest, and I think that's exceptionally good documentation. Of course this doesn't preclude putting a judiciously few examples in docstrings either (also for doctest chewing), but I'd rather encourage people who want more than that to go all the way, and write a tutorial-style large doctest in its own file.
While I ponder the decorator syntax, let's propose some built-in decorators.
Okay, I'll play (despite my thumbs down on the pep and especially the syntax using a list prefixing the function).

Several people have emphasized that the timing of decorator application is only an implementation detail. So, that leaves open the possibility of using function decorators as compilation pragmas.

One possible pragma tells the compiler that the function may assume that specified global variables are constant and already known. This allows the globals to be loaded in the constant table and accessed with LOAD_CONST instead of LOAD_GLOBAL.

Another pragma tells the compiler to assume that specified attribute lookup targets won't change outside the function. This allows the lookups to be precomputed as soon as the root is available or whenever it changes.

Example
-------

    X = 20
    [earlybind(int, X),
     cacheattr("mylist.append", "result.append",
               "random.random", "self.score")]
    def meth(self, data):
        result = []
        for sub in data:
            mylist = []
            for elem in sub:
                if int(elem) == X:
                    mylist.append(elem)
                else:
                    elem += random.random()
                    self.score(elem)
            result.append(mylist)
        return result

Transformation
--------------

    def meth(self, data, int=int, X=20):
        _rr = random.random
        _ss = self.score
        result = []
        _ra = result.append
        for sub in data:
            mylist = []
            _ma = mylist.append
            for elem in sub:
                if int(elem) == X:
                    _ma(elem)
                else:
                    elem += _rr()
                    _ss(elem)
            _ra(mylist)
        return result

Adding the decorator line is sufficient to localize everything in the function, eliminate all lookups from the inner loop, and still keep the code clean and readable.

with-this-much-weirdness-who-needs-to-go-psyco-ly yours,

Raymond Hettinger
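Raymond's `earlybind` pragma is hypothetical, but the half of it that freezes globals can already be spelled by hand with default arguments, which is exactly what his Transformation relies on. A runnable reduction of his example (no attribute caching, just the early binding):

```python
X = 20

def filter_matches(data, int=int, X=X):
    """Hand-written equivalent of the hypothetical earlybind(int, X):
    the global and builtin are frozen into default-argument slots at
    def time, so the inner loop does only fast local lookups."""
    result = []
    for sub in data:
        mylist = []
        for elem in sub:
            if int(elem) == X:
                mylist.append(elem)
        result.append(mylist)
    return result

print(filter_matches([[20, 3, "20"], [5]]))  # [[20, '20'], []]
```

The cost is that the defaults become part of the visible signature, which is precisely the wart the decorator pragma was meant to hide.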
Guido van Rossum
While I ponder the decorator syntax, let's propose some built-in decorators.
[...]
We could also add a standard implementation of synchronized. Or perhaps that should be imported from threading. (But is that really a good thing to copy from Java?)
I don't want to sound FUDdy, but I was under the impression that people didn't think this is something we actually want...
Other possibilities (all these are pretty much thinking aloud and very much up for votes; and not original, I've seen these proposed before!):
[...]
overrides -- indicates that this overrides a base class method. Maybe the default metaclass could check that if this is used there actually is a corresponding base class method, and we might have a "strict" metaclass that checks this is set for all overriding methods.
I don't get the point of this.
I'm still torn whether to promote defining properties this way:
    [propget]
    def x(self):
        "Doc string for x"
        return self.__x

    [propset]
    def x(self, newx):
        self.__x = newx

    [propdel]
    def x(self):
        del self.__x
but if people like this (whatever the decorator syntax :) we might as well make this the recommended way to define properties.
- "a bit" I'm not sure I like the idea of promoting something that uses sys._getframe.
Should there be a a separate module from which all those decorators are imported, or should we make them built-ins, following the trend set by classmethod etc.?
I'm not sure any of the above are important enough to be builtins. funcattrs, maybe, but there's sod all reason to implement that in C, and getting Python implementations into the builtins seems likely to be pain. Could do it in site.py, I guess. Cheers, mwh -- Also, remember to put the galaxy back when you've finished, or an angry mob of astronomers will come round and kneecap you with a small telescope for littering. -- Simon Tatham, ucam.chat, from Owen Dunn's review of the year
We could also add a standard implementation of synchronized. Or perhaps that should be imported from threading. (But is that really a good thing to copy from Java?)
I don't want to sound FUDdy, but I was under the impression that people didn't think this is something we actually want...
Hence my parenthetical remark. But given that it is quite simple to do using decorators, surely *someone* will implement it. How useful it will be remains to be seen. Maybe someone should take this idea, run with it, and report how it enhanced their project?
overrides -- indicates that this overrides a base class method. Maybe the default metaclass could check that if this is used there actually is a corresponding base class method, and we might have a "strict" metaclass that checks this is set for all overriding methods.
I don't get the point of this.
First, it's sometimes useful to know when subclassing which methods you are overriding and which you are adding -- this may help the reader understanding what's going on. And then of course it would be useful to have automatic verification of this information so that the reader can actually believe it.
- "a bit"
I'm not sure I like the idea of promoting something that uses sys._getframe.
Me neither.
Should there be a a separate module from which all those decorators are imported, or should we make them built-ins, following the trend set by classmethod etc.?
I'm not sure any of the above are important enough to be builtins. funcattrs, maybe, but there's sod all reason to implement that in C, and getting Python implementations into the builtins seems likely to be pain. Could do it in site.py, I guess.
It would be simple enough in C. --Guido van Rossum (home page: http://www.python.org/~guido/)
At 09:07 AM 4/3/04 -0800, Guido van Rossum wrote:
We could also add a standard implementation of synchronized. Or perhaps that should be imported from threading. (But is that really a good thing to copy from Java?)
I don't want to sound FUDdy, but I was under the impression that people didn't think this is something we actually want...
Hence my parenthetical remark.
But given that it is quite simple to do using decorators, surely *someone* will implement it. How useful it will be remains to be seen. Maybe someone should take this idea, run with it, and report how it enhanced their project?
For what it's worth, Jim Fulton's ExtensionClass package has had a "synchronized" metaclass implemented in C since about... 1997? I don't know of anybody actually using it, though. For my few dabblings into objects being shared across threads, though, I've typically needed rather precise control over when things get locked and unlocked and how, so I've never bothered implementing a synchronizing decorator.
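The synchronized decorator kept being mentioned but never shown; a minimal sketch of what it might look like (modern `with` spelling, one re-entrant lock per decorated function -- my assumptions, not ExtensionClass's design):

```python
import threading

def synchronized(func):
    """Sketch: serialize all calls to func behind one RLock."""
    lock = threading.RLock()
    def wrapper(*args, **kwds):
        with lock:
            return func(*args, **kwds)
    wrapper.__name__ = func.__name__
    return wrapper

counter = [0]

def bump():
    counter[0] += 1
bump = synchronized(bump)

threads = [threading.Thread(target=bump) for _ in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter[0])  # 10
```

This also illustrates Phillip's complaint: one lock per function is rarely the granularity you actually want.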
Guido van Rossum wrote:
I'm still torn whether to promote defining properties this way:
    [propget]
    def x(self):
        "Doc string for x"
        return self.__x

    [propset]
    def x(self, newx):
        self.__x = newx

    [propdel]
    def x(self):
        del self.__x
but if people like this (whatever the decorator syntax :) we might as well make this the recommended way to define properties.
Does that actually work? I.e. is there an implementation of propget, propset, propdel so that this code introduces a property x?

My understanding is that the above syntax would be short for:

    def x(self):
        "Doc string for x"
        return self.__x
    x = propget(x)

    def x(self, newx):
        self.__x = newx
    x = propset(x)

    def x(self):
        del self.__x
    x = propdel(x)

Later assignments to x would override earlier ones, so that only the propdel survives.

Regards,
Martin
At 06:15 PM 4/3/04 +0200, Martin v. Löwis wrote:
Guido van Rossum wrote:
I'm still torn whether to promote defining properties this way:

    [propget]
    def x(self):
        "Doc string for x"
        return self.__x

    [propset]
    def x(self, newx):
        self.__x = newx

    [propdel]
    def x(self):
        del self.__x

but if people like this (whatever the decorator syntax :) we might as well make this the recommended way to define properties.
Does that actually work? I.e. is there an implementation of propget, propset, propdel so that this code introduces a property x?
Yes, using sys._getframe().f_locals[function.func_name]. Someone posted a link to an implementation earlier this week.
My understanding is that the above syntax would be short for:

    def x(self):
        "Doc string for x"
        return self.__x
    x = propget(x)

    def x(self, newx):
        self.__x = newx
    x = propset(x)

    def x(self):
        del self.__x
    x = propdel(x)
Later assignments to x would override earlier ones, so that only the propdel survives.
Technically, what you show is not the actual expansion of the new syntax. The new syntax applies decorators before binding 'x' to the new function. So, the old value of 'x' is available to a decorator via sys._getframe().f_locals. This technique is also useful for implementing generic functions and/or multimethods, signature-based overloading, etc.
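Phillip's f_locals technique can be made concrete. This sketch uses the `@` syntax Python eventually adopted (which likewise applies the decorator before the name is bound), and the propget/propset bodies are my guess at the contributed implementation, not a quotation of it:

```python
import sys

def propget(func):
    # Peek at the caller's namespace (here, the class body being
    # executed) for an earlier binding of the same name.
    old = sys._getframe(1).f_locals.get(func.__name__)
    if isinstance(old, property):
        return property(func, old.fset, old.fdel, func.__doc__)
    return property(func, None, None, func.__doc__)

def propset(func):
    old = sys._getframe(1).f_locals.get(func.__name__)
    if isinstance(old, property):
        return property(old.fget, func, old.fdel, old.__doc__)
    return property(None, func)

class Quux(object):
    @propget
    def x(self):
        "Doc string for x"
        return self._x

    @propset
    def x(self, newx):
        self._x = newx

q = Quux()
q.x = 42
print(q.x)  # 42
```

Because the decorator runs before 'x' is rebound, propset sees the property that propget left behind and merges into it rather than clobbering it.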
Phillip> Technically, what you show is not the actual expansion of the
Phillip> new syntax.  The new syntax applies decorators before binding
Phillip> 'x' to the new function.  So, the old value of 'x' is available
Phillip> to a decorator via sys._getframe().f_locals.  This technique is
Phillip> also useful for implementing generic functions and/or
Phillip> multimethods, signature-based overloading, etc.

How would this be interpreted?

    x = 42

    def x(self) [propget]:
        "Doc string for x"
        return self.__x

    def x(self, newx) [propset]:
        self.__x = newx

    def x(self) [propdel]:
        del self.__x

That is, there is already an (otherwise invalid) 'x' in the calling scope when propget() is called. Do the property doodads just need to be bulletproofed or should an exception be raised?

Skip
How would this be interpreted?
x = 42
    def x(self) [propget]:
        "Doc string for x"
        return self.__x

    def x(self, newx) [propset]:
        self.__x = newx

    def x(self) [propdel]:
        del self.__x
That is, there is already an (otherwise invalid) 'x' in the calling scope when propget() is called. Do the property doodads just need to be bulletproofed or should an exception be raised?
It's broken. I expect this to raise an exception at some point. Beyond that, who cares? --Guido van Rossum (home page: http://www.python.org/~guido/)
At 03:24 PM 4/3/04 -0600, Skip Montanaro wrote:
That is, there is already an (otherwise invalid) 'x' in the calling scope when propget() is called. Do the property doodads just need to be bulletproofed or should an exception be raised?
If I recall correctly, the implementation someone contributed earlier this week (sorry I don't remember his name) raised an exception if the previous binding of the function name was not an instance of 'property'.
I'm still torn whether to promote defining properties this way:
    [propget]
    def x(self):
        "Doc string for x"
        return self.__x

    [propset]
    def x(self, newx):
        self.__x = newx

    [propdel]
    def x(self):
        del self.__x
but if people like this (whatever the decorator syntax :) we might as well make this the recommended way to define properties.
Does that actually work? I.e. is there an implementation of propget, propset, propdel so that this code introduces a property x?
Yes, but it involves sys._getframe().
My understanding is that the above syntax would be short for:

    def x(self):
        "Doc string for x"
        return self.__x
    x = propget(x)

    def x(self, newx):
        self.__x = newx
    x = propset(x)

    def x(self):
        del self.__x
    x = propdel(x)
Later assignments to x would override earlier ones, so that only the propdel survives.
No, the semantics of decorators are that the function name isn't actually assigned to for the first time until all decorators have been called. So the previous binding of that name is available to devious decorators like these. (This wasn't my idea -- someone else proposed it here first. :) --Guido van Rossum (home page: http://www.python.org/~guido/)
My use cases are at the bottom, but a couple comments first On Tue, Mar 30, 2004 at 02:58:41PM -0800, Guido van Rossum wrote:
The most obvious issues I see with this is are that: - grepping for "def func" won't show any sign that its decorated, and
Seems rather minor -- there's only so much you can cram on a single line. see below, in my app I would never hit the 80 char mark w/ inline decorators.
Neither of these are fatal flaws, but I think I still prefer:
def func(args) [decorator]:
Which hides 'classmethod' behind the more voluminous stuff, and that's my main gripe.
I don't think 'classmethod' is more important than the function def,

    def func(cls) [staticmethod]: pass

is obviously wrong by convention and very easy to spot by eye or with a future version of pychecker.

    def [classmethod] func(cls): pass

I read this as redundant, and grepping (eye and machine) just got much harder.

Here is my use case, this is a production application written in python. It isn't a framework so there aren't many instances of decorators.

    27k lines of python
    371 classes
    1381 functions (includes class and module functions)

    function decorators
        staticmethod      12
        classmethod        4
        memoize            1

    class decorators
        Factory.register  50
        run_nightly       45

Here are some typical use cases out of that set done in my preferred style of decorators last.

-- begin samples --

    # staticmethods are almost always factory methods
    class Contact(Entity):
        def new_by_email(email) [staticmethod]:
            pass

    class CollectFactory(object):
        def class_ob(db_id, form_id) [staticmethod]:
            pass

    # but sometimes helpers in a namespace
    class Db(object):
        def client_database_name(client_id) [staticmethod]:
            return 'enduser_%d' % (client_id)

    # classmethods are a mixed bag
    class DataSet(object):
        table_name = 'foo'
        def drop_table(cls) [classmethod]:
            pass
        def create_table(cls) [classmethod]:
            pass

    class Collect(object):
        def validate_field_trans(cls) [classmethod]:
            """this is run in the test suite, it just makes sure any
            derivative classes are sane
            """
            pass

    # and one memoize
    cache_here = {}
    def days_of_week(day_in_week) [memoize(cache_here)]:
        """for the passed in day, figure out what YEARWEEK it is
        and return a list of all the days in that week.
        """

    # the class decorators all register with a factory of
    # their appropriate type.
    # I currently do this with metaclasses
    class NewUsers(DataSet) [run_nightly]: pass   # 45 of these

    # note, not all DataSet derivatives run_nightly, the current way
    # I get around this while using metaclasses is
    class NewUsers(DataSet):
        run_nightly = 0   # run_nightly, 1/4 of DataSets set this to false

    CF = CollectFactory
    class Newsletter(Collect) [CF.register(db=4, form_id=3)]: pass
    # about 50 of these
    # see CollectFactory.class_ob() above

-- end samples --

There are still some spots where I will continue to use metaclasses instead of class decorators, namely where I want to keep track of all the leaf classes. But usually I want to decorate classes on a case-by-case basis.

These numbers are only representative of one application. SPARK or other framework type things are going to use decorators a lot more pervasively. The number of people writing applications has to be much higher than the number of people writing frameworks.

I'll use decorators whatever they look like, but they aren't more important to me than the function definition. And I don't want to have to relearn how to grep for them. So I prefer

    def foo(self, bar) [decorator1]: pass

In my 27 thousand lines of python this will ALWAYS fit on one line and it won't change how I eye-grep or machine-grep for functions.

-jackdied, BUFL (Benign User For Life)
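The `memoize(cache_here)` factory in Jack's samples isn't shown anywhere in the thread; a plausible reconstruction (argument-keyed cache, simplified signature -- my assumptions):

```python
def memoize(cache):
    """Sketch of a memoize(cache) decorator factory: results are keyed
    on the positional arguments and stored in the caller's dict."""
    def decorate(func):
        def wrapper(*args):
            if args not in cache:
                cache[args] = func(*args)
            return cache[args]
        return wrapper
    return decorate

calls = []
cache_here = {}

def days_of_week(day):
    calls.append(day)             # track how often the real body runs
    return [day + i for i in range(7)]
days_of_week = memoize(cache_here)(days_of_week)

days_of_week(3)
days_of_week(3)
print(len(calls))  # 1 -- the second call was served from cache_here
```

Passing the cache in explicitly, as Jack does, keeps it inspectable and clearable from outside the decorated function.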
"Guido van Rossum"
Another possibility that has been suggested is
[decorator] def func(arg, arg):
And one that I currently favor. I'm out of bandwidth to participate on a msg-by-msg basis, but perhaps folks can see if they can come to terms with this solution?
Taking the prefix position as given for the moment, why overload list literal syntax versus something currently illegal and meaningless? Such as

    [[decorator]]   # easy enough to type, or
    <decorator>     # almost as easy, or
    <<decorator>>

Is there near precedent in other languages that I don't know of?

Regardless of syntax, the way I would think of the process (ie, 'come to terms with it') is this: a string literal immediately after a function heading automagically sets the special __doc__ attribute. A list literal (if that is what is used) automagically sets a special __postprocess__ attribute which is read once after the body is compiled and deleted when no longer needed. As I understand the proposal, the actual storage would be elsewhere in the interpreter, so the above would be 'as if'. But would it be useful for later introspection to actually set and leave such an attribute that records the decorator processing of the function?

Terry J. Reedy
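Terry's "as if" reading can be sketched in plain Python. The helper name `apply_postprocess` and the example wrapper are invented for illustration; this is only a model of the proposed semantics, not the patch's implementation:

```python
# Sketch of the "as if" semantics: the list before a def behaves like a
# __postprocess__ list consumed once after the body is compiled.
# apply_postprocess and shout are invented names.
def apply_postprocess(func, decorators):
    for d in decorators:     # apply each decorator in turn
        func = d(func)
    return func

def shout(f):
    def wrapper(*args):
        return f(*args).upper()
    return wrapper

def greet(name):
    return 'hello ' + name

# "[shout]" followed by "def greet(name): ..." would then mean:
greet = apply_postprocess(greet, [shout])

assert greet('world') == 'HELLO WORLD'
```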
On 2004-03-31, at 04.52, Terry Reedy wrote:
Taking the prefix position as given for the moment, why overload list literal syntax versus something currently illegal and meaningless? Such as
<decorator> # almost as easy, or
Yes. This looks better and will make it more clear that it's a special case. Otherwise the decorators will look too decoupled from the function.
Taking the prefix position as given for the moment, why overload list literal syntax versus something currently illegal and meaningless? Such as
<decorator> # almost as easy, or
Yes. This looks better and will make it more clear that it's a special case. Otherwise the decorators will look too decoupled from the function.
Why does <...> look better than [...]? To me, <...> just reminds me of XML, which is totally the wrong association. There are several parsing problems with <...>: the lexer doesn't see < and > as matched brackets, so you won't be able to break lines without using a backslash, and the closing > is ambiguous -- it might be a comparison operator. --Guido van Rossum (home page: http://www.python.org/~guido/)
On 2004-03-31, at 17.42, Guido van Rossum wrote:
Taking the prefix position as given for the moment, why overload list literal syntax versus something currently illegal and meaningless? Such as
<decorator> # almost as easy, or
Yes. This looks better and will make it more clear that it's a special case. Otherwise the decorators will look too decoupled from the function.
Why does <...> look better than [...]? To me, <...> just reminds me of XML, which is totally the wrong association.
There are several parsing problems with <...>: the lexer doesn't see < and > as matched brackets, so you won't be able to break lines without using a backslash, and the closing > is ambiguous -- it might be a comparison operator.
I'm not sure <decorator> is the best solution but it sure is better than [decorator]. Someone learning Python will probably be profoundly confused when seeing that special case. It also really feels like the decorators are decoupled from the function. The coupling between the decorators and the function would perhaps not appear to be greater with another syntax; but it would stand out. Unlike docstrings which are used a lot and which documentation and books explain early on, decorators won't be used that much. It would only make it harder to learn Python if someone looks through code and sees a decorator. Also: When explaining lists in a book you'd have to mention that "there are special cases which we will discuss in an addendum to chapter 27". Also, it'll make it so much harder to search for decorators if they are lists instead of a special syntax.
Why does <...> look better than [...]? To me, <...> just reminds me of XML, which is totally the wrong association.
I vote for << >>. The brackets and parens have too many meanings in Python already. <<...>> looks more like French than XML. ;)
<<...>> has the same practical problems as <...>: no automatic line breaking, >> is ambiguous. --Guido van Rossum (home page: http://www.python.org/~guido/)
On 2004-04-01, at 01.32, Guido van Rossum wrote:
Why does <...> look better than [...]? To me, <...> just reminds me of XML, which is totally the wrong association.
I vote for << >>. The brackets and parens have too many meanings in Python already. <<...>> looks more like French than XML. ;)
<<...>> has the same practical problems as <...>: no automatic line breaking, >> is ambiguous.
So is automatic line breaking really necessary? Why not just wrap the actual decorators inside parentheses if you need to? Like:

    <<(long_decorator(attrs={'author':'somebody', 'frequency':'daily'}),
       another_decorator,
       one_more_decorator)>>

    <(long_decorator(attr={'author':'somebody', 'frequency':'daily'}),
      another_decorator,
      one_more_decorator)>

Well ... it's not pretty. As for the ambiguity: If '[...]' can be handled for that special case, can't '<...>' or '<<...>>' also be handled? They would be much less ambiguous for the human interpreter.
Guido van Rossum wrote:
Why does <...> look better than [...]? To me, <...> just reminds me of XML, which is totally the wrong association.
I vote for << >>. The brackets and parens have too many meanings in Python already. <<...>> looks more like French than XML. ;)
<<...>> has the same practical problems as <...>: no automatic line breaking, >> is ambiguous.
Sorry. I forgot that ">>" is also an operator. ;) Anyhow, I just meant that if you find a pair of characters that are illegal today then many of the objections about abusing existing syntax would go away. Paul Prescod
Anyhow, I just meant that if you find a pair of characters that are illegal today then many of the objections about abusing existing syntax would go away.
So how about *[...]* ? Or perhaps [*...*]? Or [|...|]? --Guido van Rossum (home page: http://www.python.org/~guido/)
On Thu, 2004-04-01 at 10:44, Guido van Rossum wrote:
Anyhow, I just meant that if you find a pair of characters that are illegal today then many of the objections about abusing existing syntax would go away.
So how about *[...]* ? Or perhaps [*...*]? Or [|...|]?
I'm -0 on whether it's necessary, but of these choices the latter seems okay. -Barry
On 2004-04-01, at 17.44, Guido van Rossum wrote:
Anyhow, I just meant that if you find a pair of characters that are illegal today then many of the objections about abusing existing syntax would go away.
So how about *[...]* ? Or perhaps [*...*]? Or [|...|]?
Also: Is it definite that it should be square brackets instead of parentheses? --- (sigh ... mail problems)
How about INDENT[...]:DEDENT? <0.5 serious>
Alas, won't work -- if the decoration is the first thing in a class body, you'd have something starting with double indent, but the lexer never gives you that.

    class C:
        [staticmethod]
        def foo(): pass

This would be tokenized as

    'class' 'C' ':' NEWLINE INDENT '[' 'staticmethod' ']' NEWLINE <ILLEGAL DEDENT>

--Guido van Rossum (home page: http://www.python.org/~guido/)
On Thu, Apr 01, 2004, Guido van Rossum wrote:
So how about *[...]* ? Or perhaps [*...*]? Or [|...|]?
-0, +1, +0 (That last is mainly 'cause it's four pinky keystrokes, or I'd be +1 for it, too.) -- Aahz (aahz@pythoncraft.com) <*> http://www.pythoncraft.com/ "usenet imitates usenet" --Darkhawk
At 21:52 30.03.2004 -0500, Terry Reedy wrote:
[[decorator]] # easy enough to type, or
this is already valid syntax today
<decorator> # almost as easy, or <<decorator>>
Is there near precedent in other languages that I don't know of?
VB.NET uses <...> for .NET attributes instead of [...]. OTOH <...> is the usual syntax for parametrized types, e.g. C++ templates and Java/C# generics, among others. It seems VB.NET will use a syntax with (...) for those.
"Terry Reedy"
<decorator> # almost as easy, or <<decorator>>
Is there near precedent in other languages that I don't know of?
I want to caution against these. Turning < and >, which already parse as un-paired operators, into angle brackets caused embarrassing parsing problems for C++. Now such things as

    A::foo<X> y;

have to be written as

    A::template foo<X> y;
       ^^^^^^^^----------- keyword disambiguates parse

in some cases.

-- Dave Abrahams
Boost Consulting
www.boost-consulting.com
[Ping]
Another possibility that has been suggested is
[decorator] def func(arg, arg):
[Guido]
And one that I currently favor. I'm out of bandwidth to participate on a msg-by-msg basis, but perhaps folks can see if they can come to terms with this solution?
+0. I'm not convinced that Python actually needs decorators, but if I were that would be a +1; I certainly like this spelling better than the other ones.
On 03/30/04 16:21, Guido van Rossum wrote:
Another possibility that has been suggested is
[decorator] def func(arg, arg):
And one that I currently favor. I'm out of bandwidth to participate on a msg-by-msg basis, but perhaps folks can see if they can come to terms with this solution?
+1. We had a short email exchange about this a while ago. I'm glad it's back on the table. It's elegant, and the decorations people are already using will become more apparent than they are today.

This is important to me because decorators need to be very visible. One class I frequently use (SimpleVocabulary in Zope 3) drove me crazy at first until I understood the pattern the author had used. The constructor takes two strange arguments, and for quite a while I couldn't figure out just what it wanted. Finally, I noticed that the class has several classmethods, and they all call the constructor. The author intended users to call the classmethods, not the constructor, but it was hard to notice any classmethods since the word "classmethod" was buried below the function body. Using "cls" as a first argument didn't help, since I've practically trained my eyes to ignore the first argument.

Zope has experimented with several ways of decorating methods with security declarations. Here are some of the variations attempted so far:

    class Foo:
        __ac_permissions__ = (('bar', 'View management screens'),)
        def bar(self):
            pass
    InitializeClass(Foo)  # Finds __ac_permissions__ and changes methods

    class Foo:
        bar__roles__ = PermissionRole('View management screens')
        def bar(self):
            pass

    class Foo:
        security = ClassSecurityInfo()
        security.declareProtected('View management screens', 'bar')
        def bar(self):
            pass
    InitializeClass(Foo)  # Finds a ClassSecurityInfo and calls it

These are all bad enough that Zope 3 has chosen to make no such declarations at all in Python code, putting them in XML instead. That may be the right choice for Zope 3, but surely other Python developers are running into similar needs on their own projects. They shouldn't have to go through this pain. They should be able to start with something clean like this:

    class Foo:
        [protected('View management screens')]
        def bar(self):
            pass

Shane
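The pattern Shane describes -- alternate constructors as classmethods whose marking only appears after the body -- can be sketched as follows. The class and method names are invented for illustration, not Zope's actual API:

```python
# Invented sketch of the SimpleVocabulary-style pattern: users are meant
# to call the classmethod factories, but the word "classmethod" only
# appears after the body, so the intent is easy to miss when reading.
class Vocabulary:
    def __init__(self, terms):
        self.terms = terms

    def fromValues(cls, values):
        # build (token, value) pairs and call the real constructor
        return cls([(v, v) for v in values])
    fromValues = classmethod(fromValues)   # buried below the body

v = Vocabulary.fromValues(['read', 'write'])
assert v.terms == [('read', 'read'), ('write', 'write')]
```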
I've produced a patch for my version: python.org/sf/926860

This patch (deco.diff) patches compile.c to recognize the following form of decorators:

    [list_of_expressions]
    def func(args):
        ...

The list of expressions should contain at least one element and should not be a list comprehension, otherwise no special treatment is taken. (An empty list has no effect either way.)

There's a simple test suite, Lib/test/test_decorators.py.

I don't expect to have time to discuss this until tonight.

--Guido van Rossum (home page: http://www.python.org/~guido/)
Guido:
This patch (deco.diff) patches compile.c to recognize the following form of decorators:
[list_of_expressions] def func(args): ...
I need a reality check here. Are you saying *this* is what you currently favour? I hate this. It's unspeakably horrible. Please tell me I'm having a nightmare... Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg@cosc.canterbury.ac.nz +--------------------------------------+
On Wed, 2004-03-31 at 23:30, Greg Ewing wrote:
Guido:
This patch (deco.diff) patches compile.c to recognize the following form of decorators:
[list_of_expressions] def func(args): ...
I need a reality check here. Are you saying *this* is what you currently favour?
I hate this. It's unspeakably horrible. Please tell me I'm having a nightmare...
There's got to be a better way for people to share their preference on matters like these. Maybe we should create a Wiki page where people can sign up for or against any particular syntax. It would be easier to keep track of than trying to read all the messages in this thread. Jeremy

PS Unspeakably horrible is a minor objection on the Internet. It's not really bad unless it's untype-ably bad.
Jeremy> PS Unspeakably horrible is a minor objection on the Internet. Jeremy> It's not really bad unless its untype-ably bad. I take it the test for "untype-ably bad" is you check to see if you have to wash your keyboard (or buy a new one) after typing such a construct. <wink> Skip
Guido van Rossum
Another possibility that has been suggested is
[decorator] def func(arg, arg):
And one that I currently favor. I'm out of bandwidth to participate on a msg-by-msg basis, but perhaps folks can see if they can come to terms with this solution?
Seems like it would be painful to implement. Obviously that's not a killer blow, but: If the implementation is hard to explain, it's a bad idea. Cheers, mwh -- If design space weren't so vast, and the good solutions so small a portion of it, programming would be a lot easier. -- maney, comp.lang.python
Seems like it would be painful to implement.
Not at all. Stay tuned. :-) --Guido van Rossum (home page: http://www.python.org/~guido/)
On Tue, 2004-03-30 at 16:21, Guido van Rossum wrote:
Another possibility that has been suggested is
[decorator] def func(arg, arg):
And one that I currently favor. I'm out of bandwidth to participate on a msg-by-msg basis, but perhaps folks can see if they can come to terms with this solution?
I don't like it. It already has a meaning (albeit fairly useless) and it doesn't seem obvious from just looking at it that the decorator is connected to the following method. It doesn't taste Pythonic to me. -Barry
Regarding [decorator] def func(arg, arg): # stuff... On Wednesday 31 March 2004 10:41 pm, Barry Warsaw wrote:
I don't like it. It already has a meaning (albeit fairly useless) and it doesn't seem obvious from just looking at it that the decorator is connected to the following method. It doesn't taste Pythonic to me.
Whether or not we're arbiters of what's Pythonic, this syntax is quite sad, though I'll grant that it's better than variations along the line of decorate: decorator def func(arg, arg): # stuff... I think Phillip Eby's observation that people who want to use decorators want something different is quite telling. I'm with Phillip in preferring def func(arg, arg) [decorator]: # stuff... -Fred -- Fred L. Drake, Jr. <fdrake at acm.org> PythonLabs at Zope Corporation
On Wed, Mar 31, 2004 at 10:41:00PM -0500, Barry Warsaw wrote:
On Tue, 2004-03-30 at 16:21, Guido van Rossum wrote:
Another possibility that has been suggested is
[decorator] def func(arg, arg):
And one that I currently favor. I'm out of bandwidth to participate on a msg-by-msg basis, but perhaps folks can see if they can come to terms with this solution?
I don't like it. It already has a meaning (albeit fairly useless) and it doesn't seem obvious from just looking at it that the decorator is connected to the following method. It doesn't taste Pythonic to me.
Me too, but no one contrasted my actual use cases with any in any other format, so I must be missing something fundamental. I support an all Guido ticket now and forever, but I don't get the pulling-the-rabbit-out-of-a-hat decorator syntax anymore than the at-sign assignment. Something in the water at PyCon? -jackdied
>> > [decorator]
>> > def func(arg, arg):

Barry> I don't like it.  It already has a meaning (albeit fairly
Barry> useless) and it doesn't seem obvious from just looking at it that
Barry> the decorator is connected to the following method.  It doesn't
Barry> taste Pythonic to me.

I'm with Barry. It seems magic to me. If I write

    def f():
        "doc"
        [47]
        return 3

today, although a bit weird, [47] doesn't affect the following statement in any way. Now the proposal (still, assuming this isn't an elaborate AFJ) on the table means to change the semantics of an unassigned list expression in one special case. I suggest:

    Special cases aren't special enough to break the rules.

though I know someone will follow with

    Although practicality beats purity.

In this case there are other practical proposals on the table, not the least of which is the status quo.

Skip
On Thu, 2004-04-01 at 08:43, Skip Montanaro wrote:
IN this case there are other practical proposals on the table, not the least of which is the status quo.
So I hacked up python-mode a bit to support the syntax coloring of Guido's previously MFS (most favored syntax). I wanted to see if the concerns about visual obscurity were real or not. Then I rewrote a few methods of mine that would benefit from decorators (with some elaboration). A screen shot of my XEmacs buffer is here: http://barry.warsaw.us/xemacs.png Now, I'm sure most of you will hate my color scheme, but I'm used to it so for me, the decorator stands out perfectly fine. I'd have no problem supporting decorator-before-colon and vastly prefer it to decorator-before-def. You can play with this yourself by checking out the latest version of python-mode.el (from sf.net/projects/python-mode). Note that you won't get the nice colorization on multiline decorators on the fly because XEmacs doesn't handle as-you-type font-locking across newlines well. Re-fontifying the buffer will pick up the right colors though. -Barry
Jeremy Hylton wrote:
Another possibility that has been suggested is [decorator] def func(arg, arg):
The decorator before def spelling is familiar from C#, and looks the nicest to me. However, I'm curious about the way decorators differ from docstrings to explain putting one kind of metadata shorthand before def and the other after.

    [decorator]
    def func(arg, arg):
        """Docstring"""
        pass

    def func(arg, arg):
        [decorator]
        """Docstring"""
        pass

If there will only ever be two of these kinds of things, and a consensus is to put them on lines before and after the def, that would seem to be reason enough for decorator before def. If someday there were other types of metadata distinct from decorators, perhaps immutable and pertaining to secure execution, the before-docstring spelling might allow for more consistent additions to the shorthand.

    def func(arg, arg):
        [decorator]
        (otherator)
        """Docstring"""
        pass
If you spell decorators this way:

    [decorator]
    def func():
        pass

then what will happen when you type [decorator] at an interactive interpreter prompt?

    >>> [decorator]

Will it type ... and wait for you to say more? Or will it evaluate the single-element list whose element is the value of the variable ``decorator'' and print the result? If it types ... and waits for you to say more, will it do so for any list? Or will it somehow figure out which lists represent decorators?
If you spell decorators this way:
[decorator] def func(): pass
then what will happen when you type [decorator] at an interactive interpreter prompt?
[decorator]
Will it type ... and wait for you to say more? Or will it evaluate the single-element list whose element is the value of the variable ``decorator'' and print the result?
The latter. You can't add a decorator to a top-level function in interactive mode unless you put it inside an if. Check out the behavior of the patch I posted (926860). --Guido van Rossum (home page: http://www.python.org/~guido/)
Guido van Rossum
[decorator]
Will it type ... and wait for you to say more? Or will it evaluate the single-element list whose element is the value of the variable ``decorator'' and print the result?
The latter. You can't add a decorator to a top-level function in interactive mode unless you put it inside an if.
That seems like a very odd special case to me. Is it worth it? -- Dave Abrahams Boost Consulting www.boost-consulting.com
On Wed, 31 Mar 2004, Guido van Rossum wrote:
[decorator]
Will it type ... and wait for you to say more? Or will it evaluate the single-element list whose element is the value of the variable ``decorator'' and print the result?
The latter. You can't add a decorator to a top-level function in interactive mode unless you put it inside an if.
This discussion is turning in a direction that alarms me. Guido, i beg you to reconsider.

Up to this point, we've all been talking about aesthetics -- whether it's prettier to have the list of decorators here or there, reads more naturally, visually suits the semantics, and so on. Putting the [decorator] on a separate line before the function changes the stakes entirely. It sets aside real functional issues in favour of aesthetics. Being enamoured with the way this syntax *looks* does not justify functionally *breaking* other things in the implementation to achieve it.

Consider the arguments in favour:

    1. Decorators appear first.
    2. Function signature and body remain an intact unit.
    3. Decorators can be placed close to the function name.

These are all aesthetic arguments: they boil down to "the appearance is more logical". Similar arguments have been made about the other proposals.

Consider the arguments against:

    1. Previously valid code has new semantics.
    2. Parse tree doesn't reflect semantics.
    3. Inconsistent interactive and non-interactive behaviour. [*]
    4. Decorators can be arbitrarily far away from the function name.

These are all functional arguments, with the exception of 4. People are pointing out that the way Python *works* will be compromised.

Putting the [decorator] on a separate preceding line:

    1. violates a fundamental rule of language evolution,
    2. makes it impossible to write an unambiguous grammar for Python, and
    3. stacks the odds against anyone trying to learn decorators.

I don't think there is any dispute about functionality. Everyone agrees that the interactive interpreter should reflect non-interactive behaviour as much as possible; everyone agrees that previously valid code should not change semantics; everyone agrees that the parse tree should reflect the semantics. Your responses to these points have been of the form "Well, it's not *so* bad. Look, we can hack around it..." Those aren't positive arguments. Those are excuses.
Aesthetic and functional arguments are at two entirely different scales. To persist in finding excuses for one option because it is subjectively good-looking, while setting aside objective functional deficiencies, is outside the bounds of reason.

    Special cases aren't special enough to break the rules.

Please, i implore you, consider a different option on this one.

Thanks for reading,

-- ?!ng

[*] Please don't say "Oh, the interactive and non-interactive modes are inconsistent already." That's not a reason to make it worse. Having the interactive interpreter close an "if:" block when you enter a blank line doesn't make "if:" totally unusable. But interpreting decorators inconsistently fails in a way that the decorator doesn't work at all, and provides no hint as to how you would make it work. This is especially bad since decorators are a weird and wonderful feature and people are going to have to experiment with them in the interactive interpreter as part of the learning process. Every such inconsistency makes Python harder to learn, and in particular this will make decorators especially painful to learn.
On Wed, Mar 31, 2004, Ka-Ping Yee wrote:
Special cases aren't special enough to break the rules.
Please, i implore you, consider a different option on this one.
+1 -- Aahz (aahz@pythoncraft.com) <*> http://www.pythoncraft.com/ "usenet imitates usenet" --Darkhawk
Ka-Ping Yee
Putting the [decorator] on a separate line before the function changes the stakes entirely. It sets aside real functional issues in favour of aesthetics.
Not only that, but I think the aesthetics of this version are *worse* than anything that's been considered before. It strikes me as deeply wrong and un-Pythonic -- so much so that I can't understand why Guido is even considering it. Was he abducted by aliens the other night and given a personality transplant or something? It's the only plausible explanation I can think of. Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg@cosc.canterbury.ac.nz +--------------------------------------+
At 01:23 PM 4/5/04 +1200, Greg Ewing wrote:
Ka-Ping Yee
: Putting the [decorator] on a separate line before the function changes the stakes entirely. It sets aside real functional issues in favour of aesthetics.
Not only that, but I think the aesthetics of this version are *worse* than anything that's been considered before. It strikes me as deeply wrong and un-Pythonic -- so much so that I can't understand why Guido is even considering it.
There are different kinds of aesthetics. Guido's proposal has grown on me from a *visual* aesthetics point of view. After I worked with it a little bit, I realized it really is much prettier than decorators-before-colon. However, from more intellectual aesthetics (consistency, predictability, etc.) I still dislike it, and don't really see how to sanely reconcile it with the Python syntax of today. I wish that I did, because it really does *look* better for simple decorators. On the other hand, *no* syntax proposed so far has been really that nice to look at when used for multiple decorators.
There are different kinds of aesthetics. Guido's proposal has grown on me from a *visual* aesthetics point of view. After I worked with it a little bit, I realized it really is much prettier than decorators-before-colon.
Right. It really is.
However, from more intellectual aesthetics (consistency, predictability, etc.) I still dislike it, and don't really see how to sanely reconcile it with the Python syntax of today. I wish that I did, because it really does *look* better for simple decorators.
I, too, wish that there were a way to make it work with current expectations. The * prefix looks so arbitrary: why not /, why not @, etc...
On the other hand, *no* syntax proposed so far has been really that nice to look at when used for multiple decorators.
Yet, decorator-before-def does better there, too, because it gives you more horizontal space to work with. This is useful for decorators that take argument lists, like the ObjC decorator and I believe some of PEAK's decorators. (I don't want to use funcattrs() as an argument here, because I believe that it is simply an inevitable by-product of introducing any kind of decorator syntax -- while far from perfect, in terms of readability setting function attributes in a decorator is so much better than setting them after the function has been defined, that I believe we have no choice but to provide it. (Similar for synchronized, except I feel less pressure to add it; I'd much rather introduce some kind of general block wrapper feature like we have discussed many times here.)) I also note that accepting decorator-before-colon now would make it harder to come up with a decent syntax for declaring the return type, which I still want to do in some future version of Python with optional (!) static typing. But not impossible -- there's enough punctuation available besides '[' and ':'. I also note that the proposed variants that prefix the 'def clause' with another (optional) indented clause, started e.g. by 'with:' or 'decorate:', look much worse. They violate another strong expectation in Python: that a suite is a sequence of statements. (And it really can't be a sequence of statements -- e.g. assignments just really don't make sense as decorators, nor do many other forms of statement.) One worry about every syntax that supports multiple decorators: as soon as there's a transformation among the decorators, they stop being commutative. And that could cause a load of problems with beginners trying their hand at using complicated decorators (e.g. trying to define a class property by combining classmethod and property). --Guido van Rossum (home page: http://www.python.org/~guido/)
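Guido's worry that stacked decorators stop being commutative is easy to demonstrate with two toy wrappers (names invented), applied by hand in the pre-syntax style:

```python
# Two invented toy decorators showing that application order matters:
# swapping them produces a different result.
def tag_a(f):
    def wrapper(*args):
        return 'a(' + f(*args) + ')'
    return wrapper

def tag_b(f):
    def wrapper(*args):
        return 'b(' + f(*args) + ')'
    return wrapper

def base():
    return 'x'

assert tag_a(tag_b(base))() == 'a(b(x))'
assert tag_b(tag_a(base))() == 'b(a(x))'   # other order, other result
```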
On Mon, 2004-04-05 at 05:33, Guido van Rossum wrote:
I, too, wish that there were a way to make it work with current expectations. The * prefix looks so arbitrary: why not /, why not @, etc...
What about:

    [as classmethod]
    def foo(bar, baz):
        pass

To me this is more obvious (and less like p*rl) than *[classmethod], and it is also currently a syntax error so won't break existing code.

Mark Russell
Mark Russell
On Mon, 2004-04-05 at 05:33, Guido van Rossum wrote:
I, too, wish that there were a way to make it work with current expectations. The * prefix looks so arbitrary: why not /, why not @, etc...
What about:
[as classmethod] def foo(bar, baz): pass
To me this is more obvious (and less like p*rl) than *[classmethod], and it is also currently a syntax error so won't break existing code.
Not bad. Also tolerable:

    as [classmethod]
    def foo(bar, baz):
        pass

-- Dave Abrahams
Boost Consulting
www.boost-consulting.com
On Mon, 2004-04-05 at 11:38, David Abrahams wrote:
Not bad. Also tolerable:
as [classmethod] def foo(bar, baz): pass
The trouble with that is that it's less clearly an error at the moment (e.g.:

    as = [ 'pathological' ]
    classmethod = 0
    as [classmethod]
    def foo(bar, baz):
        pass

is actually legal right now). The nice thing about "[as xxx]" is that there's no way you can confuse it with a subscript expression.

Mark
[I've mentioned this idea before, but don't recall seeing any responses to it; so I'm mentioning it again in case it got lost the first time]

I am concerned about putting decorators before the function definition because of the possibility of such code quietly sneaking through existing compilers:

    [classmethod]
    def foo(bar, baz):
        pass

Guido is concerned about putting decorators after the function definition because there may be a lot of them and it may be hard to read:

    def foo(bar, baz) \
        [staticmethod, classmethod, otherattributes(debug=True)]:
        pass

Here's what I don't understand. I imagine that most of the time, when someone decorates a function with lots of attributes, there is at least the possibility of decorating more than one function with the same set of attributes. In such a case, doesn't it make sense to bind a variable to the attributes and use it instead?

    Attributes = [staticmethod, classmethod, otherattributes(debug=True)]

    def foo(bar, baz) Attributes:
        pass

And doesn't this idea answer the objection that too many attributes after the parameter list make the definition hard to read? For that matter, why do we need the brackets if there is only one attribute:

    def foo(bar, baz) staticmethod:
        pass

I am suggesting that what comes between the ) and the : should be an expression, which must evaluate to either a callable or a sequence of callables. For that matter, why not allow a tuple expression without parentheses:

    def foo(bar, baz) staticmethod, classmethod:
        pass

Whatever sequence you put there, I think the semantics are clear: before binding a name to the function, pass it to the callable or in turn to each element of the sequence.
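Andrew's proposed semantics -- the expression between ) and : must evaluate to a callable or a sequence of callables -- can be modelled as a small helper. The name `apply_decoration` and the example wrappers are invented for illustration:

```python
# Invented helper modelling the proposed rule: accept a single callable
# or a sequence of callables and apply them in turn before binding.
def apply_decoration(func, spec):
    if callable(spec):
        return spec(func)
    for d in spec:
        func = d(func)
    return func

def exclaim(f):
    return lambda *a: f(*a) + '!'

def hello():
    return 'hello'

single  = apply_decoration(hello, exclaim)             # bare callable
doubled = apply_decoration(hello, [exclaim, exclaim])  # sequence

assert single() == 'hello!'
assert doubled() == 'hello!!'
```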
On Monday 2004-04-05 at 15:42, Andrew Koenig wrote:
Here's what I don't understand.
I imagine that most of the time, when someone decorates a function with lots of attributes, there is at least the possibility of decorating more than one function with the same set of attributes.
In such a case, doesn't it make sense to bind a variable to the attributes and use it instead?
Attributes = [staticmethod, classmethod, otherattributes(debug=True)]
def foo(bar, baz) Attributes: pass
And doesn't this idea answer the objection that too many attributes after the parameter list make the definition hard to read?
It certainly answers that objection, but it has (so it seems to me) other problems of its own; see below.
For that matter, why do we need the brackets if there is only one attribute:
def foo(bar, baz) staticmethod: pass
I am suggesting that what comes between the ) and the : should be an expression, which must evaluate to either a callable or a sequence of callables. For that matter, why not allow a tuple expression without parentheses:
def foo(bar, baz) staticmethod, classmethod: pass
Whatever sequence you put there, I think the semantics are clear: Before binding a name to the function, pass it to the callable or in turn to each element of the sequence.
The brackets serve multiple purposes.

- They make the decoration *look*, even to novice eyes, like an annotation: "by the way, please make this a static method".
- They may possibly help the parser a little. (I'm showing my ignorance here; I haven't really looked at the Python parser at all.)
- They give a hint to the user that what goes in between them may be a sequence, not just a single item.
- They ensure that all decorated definitions have a somewhat consistent appearance.

Omitting them breaks all these things, especially if an arbitrary expression is allowed there. And do we really want to allow

    def foo(bar, baz) quux(wibble, spong):
        pass

That's not a hideous pathological case; it's what a simple decoration using a parameterized decorator will look like without the brackets. With good decorator names I suppose it becomes somewhat comprehensible:

    def foo(bar, baz) memoized(25):
        pass

but surely it's still much clearer when written as

    def foo(bar, baz) [memoized(25)]:
        pass

or even (though I'm a little tempted to agree with whoever it was that was wondering whether Guido has been abducted by aliens)

    [memoized(25)]
    def foo(bar, baz):
        pass

We can keep the syntactic distinctiveness while still allowing multiple decorators to be combined, by having a function (a builtin, perhaps, but it's not hard to write) that composes decorators:

    def compose(*decorators):
        def _(f):
            for d in decorators:
                f = d(f)
            return f
        return _

    our_attributes = compose(staticmethod, classmethod,
                             otherattributes(debug=True))

    def foo(bar, baz) [our_attributes]:
        pass

-- Gareth McCaughan
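[To make the application order of the compose helper above concrete — compose(a, b) applies a first, then b, so the last decorator ends up outermost; `tag` below is an illustrative parameterized decorator, not from the thread:]

```python
def compose(*decorators):
    """Combine several decorators into one (same helper as above)."""
    def _(f):
        for d in decorators:
            f = d(f)
        return f
    return _

def tag(label):
    """Illustrative parameterized decorator: prefixes the result."""
    def deco(f):
        def wrapper():
            return label + ":" + f()
        return wrapper
    return deco

def hello():
    return "hello"

# tag("a") is applied first, then tag("b") wraps that result,
# so hello() returns "b:a:hello".
hello = compose(tag("a"), tag("b"))(hello)
```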
Andrew Koenig
For that matter, why do we need the brackets if there is only one attribute:
def foo(bar, baz) staticmethod: pass
Because the decorator doesn't stand out enough that way. It's a matter of opinion, but I think it looks better with the brackets, even if there is only one decorator. The brackets mean "here is something additional", not "here is a list".

Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,          | A citizen of NewZealandCorp, a       |
Christchurch, New Zealand          | wholly-owned subsidiary of USA Inc.  |
greg@cosc.canterbury.ac.nz         +--------------------------------------+
On Mon, 2004-04-05 at 11:38, David Abrahams wrote:
Not bad. Also tolerable:
as [classmethod]
def foo(bar, baz): pass
I thought we decided long ago that we can't use a keyword because no single keyword could sound right in the context of all possible use cases.

Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,          | A citizen of NewZealandCorp, a       |
Christchurch, New Zealand          | wholly-owned subsidiary of USA Inc.  |
greg@cosc.canterbury.ac.nz         +--------------------------------------+
Mark Russell wrote:
What about:
[as classmethod]
def foo(bar, baz): pass
To me this is more obvious (and less like p*rl) than *[classmethod], and it is also currently a syntax error so won't break existing code.
I agree, it's nice. You could even use this syntax at the interactive prompt. The "is" keyword is another choice, too. Shane
On Mon, 2004-04-05 at 11:50, Shane Hathaway wrote:
Mark Russell wrote:
What about:
[as classmethod]
def foo(bar, baz): pass
To me this is more obvious (and less like p*rl) than *[classmethod], and it is also currently a syntax error so won't break existing code.
I agree, it's nice. You could even use this syntax at the interactive prompt. The "is" keyword is another choice, too.
Since that is illegal syntax today, it would alleviate my primary concern with decorator-before-def syntax. And the 'as' keyword does help tie it to the following def. -Barry
What about:
[as classmethod]
def foo(bar, baz): pass
Since that is illegal syntax today, it would alleviate my primary concern with decorator-before-def syntax. And the 'as' keyword does help tie it to the following def.
Indeed. And we could even extend it to allow a single set of decorators to apply to multiple function definitions, as follows:

    [as classmethod]:
        def foo(bar, baz): pass
        def bar(foo, baz): pass

Of course, the colon would be required if and only if an indent follows it.

(I still prefer decorators after the parameters, but this version pretty much solves the same problems and answers my non-aesthetic objections)
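[The grouped form above has a rough equivalent in today's Python, rebinding each name after its def inside the class body; this sketch only illustrates the intended effect, not the proposed syntax:]

```python
class C:
    def foo(bar, baz):
        pass

    def bar(foo, baz):
        pass

    # What '[as classmethod]:' over both defs would amount to:
    foo = classmethod(foo)
    bar = classmethod(bar)
```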
What about:
[as classmethod]
def foo(bar, baz): pass
To me this is more obvious (and less like p*rl) than *[classmethod], and it is also currently a syntax error so won't break existing code.
I appreciate the suggestion, but the current parser can't handle that. --Guido van Rossum (home page: http://www.python.org/~guido/)
On Mon, 2004-04-05 at 17:44, Guido van Rossum wrote:
I appreciate the suggestion, but the current parser can't handle that.
Would it not be possible to cheat and make the lexer transform

    "[" WHITESPACE "as" WHITESPACE IDENTIFIER

into an LBRACE-AS token (which I assume would make the parser's job simple).

Mark
I appreciate the suggestion, but the current parser can't handle that.
Would it not be possible to cheat and make the lexer transform
"[" WHITESPACE "as" WHITESPACE IDENTIFIER
into an LBRACE-AS token (which I assume would make the parser's job simple).
I don't think without extreme hackery, but feel free to prove me wrong by producing a patch. (A problem is, there could be newlines and comments inside the WHITESPACE. Backtracking over that would require the lexer to have an arbitrary-length buffer.) --Guido van Rossum (home page: http://www.python.org/~guido/)
Mark Russell suggested:
Would it not be possible to cheat and make the lexer transform
"[" WHITESPACE "as" WHITESPACE IDENTIFIER
into an LBRACE-AS token (which I assume would make the parser's job simple).
Guido said:
I don't think without extreme hackery, but feel free to prove me wrong by producing a patch. (A problem is, there could be newlines and comments inside the WHITESPACE. Backtracking over that would require the lexer to have an arbitrary-length buffer.)
Another possibility might be to require the construct start without the whitespace:

    "[as" WHITESPACE ... "]"

That avoids the buffering issue and matches what most programmers are more likely to write anyway.

-Fred

--
Fred L. Drake, Jr.  <fdrake at acm.org>
PythonLabs at Zope Corporation
On Mon, 2004-04-05 at 19:59, Fred L. Drake, Jr. wrote:
Another possibility might be to require the construct start without the whitespace:
"[as" WHITESPACE ... "]"
That avoids the buffering issue and matches what most programmer are more likely to write anyway.
I think that's probably too arbitrary a restriction. Another thought: presumably the parser could handle this syntax if "as" were a keyword. Is that true, and if so, is making "as" a keyword out of the question? The change could presumably be enabled by a "from __future__" directive to avoid breaking existing code. Mark
presumably the parser could handle this syntax if "as" were a keyword.
Not really. The parser is really dumb; it can only handle one alternative starting with a particular token at any point. Since '[' can already start an expression and an expression is a valid statement, anything else also starting with '[', even if it isn't a valid expression, cannot be accepted. You may be thinking of Yacc, which has much more liberal rules.

--Guido van Rossum (home page: http://www.python.org/~guido/)
In article <200404052209.i35M9m606023@guido.python.org>,
Guido van Rossum
presumably the parser could handle this syntax if "as" were a keyword.
Not really. The parser is really dumb; it can only handle one alternative starting with a particular token at any point. Since '[' can already start an expression and an expression is a valid statement, anything else also starting with '[', even if it isn't a valid expression, cannot be accepted. You may be thinking of Yacc, which has much more liberal rules.
--Guido van Rossum (home page: http://www.python.org/~guido/)
Ok, then how about <decorator> def ... ? '<' can't start an expression or statement currently, can it? -- David Eppstein http://www.ics.uci.edu/~eppstein/ Univ. of California, Irvine, School of Information & Computer Science
On 2004-04-06, at 01.24, David Eppstein wrote:
Ok, then how about <decorator> def ... ?
'<' can't start an expression or statement currently, can it?
Yeah. I think that would look better. On the other hand ... On 2004-03-31, at 17.42, Guido van Rossum wrote:
Why does <...> look better than [...]? To me, <...> just reminds me of XML, which is totally the wrong association.
There are several parsing problems with <...>: the lexer doesn't see < and > as matched brackets, so you won't be able to break lines without using a backslash, and the closing > is ambiguous -- it might be a comparison operator.
In article <3AA96ED4-876E-11D8-A42F-0003934AD54A@chello.se>,
Simon Percivall
On 2004-04-06, at 01.24, David Eppstein wrote:
Ok, then how about <decorator> def ... ?
'<' can't start an expression or statement currently, can it?
Yeah. I think that would look better. On the other hand ...
On 2004-03-31, at 17.42, Guido van Rossum wrote:
Why does <...> look better than [...]? To me, <...> just reminds me of XML, which is totally the wrong association.
There are several parsing problems with <...>: the lexer doesn't see < and > as matched brackets, so you won't be able to break lines without using a backslash, and the closing > is ambiguous -- it might be a comparison operator.
Well, Mike Pall's long message "The Need for a Declarative Syntax" convinced me that some kind of bracketed prefix is the best location for these things, but I still don't like [...] because it already has a (useless) meaning, so declarators would pass silently in old Pythons. And the same objection applies to all other existing brackets. So, angle brackets seem like the cleanest remaining choice.

Re the lexer, I don't see this as a big problem as long as multiple declarators can be handled by multiple <...> pairs. Re the ambiguity, I think comparisons shouldn't normally appear in declarators, so it's ok if they have to be parenthesized.

-- David Eppstein http://www.ics.uci.edu/~eppstein/ Univ. of California, Irvine, School of Information & Computer Science
There are several parsing problems with <...>: the lexer doesn't see < and > as matched brackets, so you won't be able to break lines without using a backslash, and the closing > is ambiguous -- it might be a comparison operator.
Re the lexer, I don't see this as a big problem as long as multiple declarators can be handled by multiple <...> pairs.
...besides, how hard can it be to convince the lexer to treat '<' at the start of a non-continuation line differently from other placements of the same char? -- David Eppstein http://www.ics.uci.edu/~eppstein/ Univ. of California, Irvine, School of Information & Computer Science
On Mon, 2004-04-05 at 19:03, Guido van Rossum wrote:
Would it not be possible to cheat and make the lexer transform
"[" WHITESPACE "as" WHITESPACE IDENTIFIER
into an LBRACE-AS token (which I assume would make the parser's job simple).
I don't think without extreme hackery, but feel free to prove me wrong by producing a patch. (A problem is, there could be newlines and comments inside the WHITESPACE. Backtracking over that would require the lexer to have an arbitrary-length buffer.)
Unless I'm missing something (very possible) there's no need to backtrack over the whitespace - all you need is a two-entry stack for the "[" and the token that follows. There's no question of INDENT or DEDENT because we're inside a [] pair.

I've put a patch at python.org/sf/932100 which does this, plus changes to Grammar/Grammar, compile.c etc - it implements the syntax:

    [as classmethod]
    def foo(cls): pass

The newline is optional:

    [as foo, bar, baz] def foo(cls): pass

is also legal. The test suite passes as before (there are still two failures but they are unrelated to this patch). The patch also includes your test_decorators.py, modified for the new syntax.

Mark
Mark> I've put a patch at python.org/sf/932100 which does this, plus
Mark> changes to Grammar/Grammar, compile.c etc - it implements the
Mark> syntax:

Mark>     [as classmethod]
Mark>     def foo(cls): pass

Mark> The newline is optional:

Mark>     [as foo, bar, baz] def foo(cls): pass

Mark> is also legal.

Does it treat '[' and 'as' as separate tokens so whitespace can occur between them?

Note that the second alternative is likely to break all sorts of auxiliary tools which scan Python source. etags/ctags and python-mode come to mind. Clearly, python-mode is well within our domain, but other syntax coloring tools aren't.

Skip
On Fri, 2004-04-09 at 01:36, Skip Montanaro wrote:
Does it treat '[' and 'as' as separate tokens so whitespace can occur between them?
Yes.
Note that the second alternative is likely to break all sorts of auxiliary tools which scan Python source. etags/ctags and python-mode come to mind. Clearly, python-mode is well within our domain, but other syntax coloring tools aren't.
It's a trivial change to disallow the second form - I'm pretty much 50/50 on whether it's a good idea to allow it or not. Probably "There's only one way to do it" should apply. (The change is trivial - in Grammar/Grammar, just replace [NEWLINE] with NEWLINE in the decorators entry). Mark
On Fri, 2004-04-09 at 05:44, Mark Russell wrote:
It's a trivial change to disallow the second form - I'm pretty much 50/50 on whether it's a good idea to allow it or not. Probably "There's only one way to do it" should apply. (The change is trivial - in Grammar/Grammar, just replace [NEWLINE] with NEWLINE in the decorators entry).
Personally, I think it's fine to allow them to be on the same line. I wouldn't want to arbitrarily disallow it because we're worried about how the tools will react. If people gravitate toward this syntax because it's more readable, then the tools will eventually adjust. just-because-you-can-doesn't-mean-you-should-ly y'rs, -Barry
"Phillip J. Eby"
There are different kinds of aesthetics. Guido's proposal has grown on me from a *visual* aesthetics point of view. After I worked with it a little bit, I realized it really is much prettier than decorators-before-colon.
Your tastes must be different from mine, then, because it doesn't strike me as any prettier visually, either.

I don't think I can fully separate these different kinds of aesthetics in my mind, anyway. To me, a piece of syntax isn't just something to look at -- it has a grammar, and it has a meaning, and if the grammar and the meaning and the way it looks on the page don't all agree with each other, it strikes a discordant note.

When I read the syntax

    def foo(args) [classmethod]:
        ...

it says to me "Define a function foo, with these args, and which happens to be a classmethod. Here's the body..."

On the other hand, when I see

    [classmethod]
    def foo(args):
        ...

the little voice in my head doesn't really say anything coherent at all.

Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,          | A citizen of NewZealandCorp, a       |
Christchurch, New Zealand          | wholly-owned subsidiary of USA Inc.  |
greg@cosc.canterbury.ac.nz         +--------------------------------------+
On Mon, Apr 05, 2004, Ka-Ping Yee wrote:
On Mon, 5 Apr 2004, Greg Ewing wrote:
the little voice in my head doesn't really say anything coherent at all.
Funny. I experience that most days.
"I do what the voices in my head tell me to." -- Aahz (aahz@pythoncraft.com) <*> http://www.pythoncraft.com/ Why is this newsgroup different from all other newsgroups?
Greg Ewing
There are different kinds of aesthetics. Guido's proposal has grown on me from a *visual* aesthetics point of view. After I worked with it a little bit, I realized it really is much prettier than decorators-before-colon.
Your tastes must be different from mine, then, because it doesn't strike me as any prettier visually, either.
I don't think I can fully separate these different kinds of aesthetics in my mind, anyway. To me, a piece of syntax isn't just something to look at -- it has a grammar, and it has a meaning, and if the grammar and the meaning and the way it looks on the page don't all agree with each other, it strikes a discordant note.
When I read the syntax
def foo(args) [classmethod]: ...
it says to me "Define a function foo, with these args, and which happens to be a classmethod. Here's the body..."
On the other hand, when I see
[classmethod] def foo(args): ...
the little voice in my head doesn't really say anything coherent at all.
Lest we get the idea that there's any absolute measure of aesthetics: +1 restoring-nature's-equilibrium-ly, Dave -- Dave Abrahams Boost Consulting www.boost-consulting.com
Greg Ewing wrote:
"Phillip J. Eby"
: There are different kinds of aesthetics. Guido's proposal has grown on me from a *visual* aesthetics point of view. After I worked with it a little bit, I realized it really is much prettier than decorators-before-colon.
Your tastes must be different from mine, then, because it doesn't strike me as any prettier visually, either.
What people find readable depends mostly on what they are used to. The reason Python is "so readable" on first read is because it rips off so much syntax directly from C, the language that is the ancestor for all of the other languages people use today.

C doesn't have function decorators so people don't have expectations for them. But they will when they start coming over from C# and Java 1.5 in two or three years. One virtue of Guido's proposal is that it is basically what C# does. Java uses a pretty different syntax but it is also a prefix syntax. If Python uses a postfix syntax it will probably be alone in making that choice.

I'm not saying that Python has to do what the other languages do because they do it, but all else equal, being familiar is better than being idiosyncratic (i.e. different for no good reason). So I would rank "like C# and Java" higher than "fits my personal aesthetics in early 2004" because aesthetics are likely to drift towards C# and Java over time.

Paul Prescod
What people find readable depends mostly on what they are used to. The reason Python is "so readable" on first read is because it rips off so much syntax directly from C, the language that is the ancestor for all of the other languages people use today.
Bizarre. I think most of the readability actually comes from places where I *didn't* rip off C, in particular Python's syntax for block structure, lists and dicts, and the lack of anything resembling C's abomination passing for type declarations (a mistake admitted by its author). Just goes to show that taste differs. :)
C doesn't have function decorators so people don't have expectations for them. But they will when they start coming over from C# and Java 1.5 in two or three years. One virtue of Guido's proposal is that it is basically what C# does. Java uses a pretty different syntax but it is also a prefix syntax. If Python uses a postfix syntax it will probably be alone in making that choice.
That's exactly Jim Hugunin's argument for this syntax.
I'm not saying that Python has to do what the other languages do because they do it, but all else equal, being familiar is better than being idiosyncratic (i.e. different for no good reason). So I would rank "like C# and Java" higher than "fits my personal aesthetics in early 2004" because aesthetics are likely to drift towards C# and Java over time.
Anyway, personal aesthetics are learned behavior, like it or not. --Guido van Rossum (home page: http://www.python.org/~guido/)
>> One virtue of Guido's proposal is that it is basically what C#
>> does. Java uses a pretty different syntax but it is also a prefix
>> syntax. If Python uses a postfix syntax it will probably be alone in
>> making that choice.

Guido> That's exactly Jim Hugunin's argument for this syntax.

Since both C# and Java differ significantly in their typing from Python, I suspect what works well for those languages (they already have a lot of declarative "baggage" because of their compile-time type checks, so what's a few more declarations?) may not work as well for Python.

Second, Python has a strong tradition of borrowing what "works well" from other languages. I'm skeptical that C# has been around long enough to suggest that its syntax "works well". It's pretty clear that Microsoft is going to ram that down most programmers' throats, so the C# user base is no doubt going to be very large in a year or two. If what you're looking for is to provide a familiar syntactic base for C# refugees, then I suppose that's fine, but from the examples I've seen, C# decorations (annotations? attributes?) can be sprinkled quite liberally through the code (and serve as much to obscure as to highlight what's going on). A C# refugee might be disappointed to see that Python's decorators are limited to class, function and method declarations.

With that in mind, it doesn't seem to me that partially mimicking C#'s decorator system is necessarily a good thing.

Skip
Skip:
Second, Python has a strong tradition of borrowing what"works well" from other languages. I'm skeptical that C# has been around long enough to suggest that its syntax "works well".
C# inherited this syntax from Microsoft IDL and its predecessor ODL. Unsure of the age of MS IDL but a 1995 edition of "Inside OLE" uses it. I can't recall any problems with the syntax although fewer people would have worked with it when it was only for defining interfaces rather than as part of a general purpose language. Neil
On Mon, Apr 05, 2004, Paul Prescod wrote:
What people find readable depends mostly on what they are used to. The reason Python is "so readable" on first read is because it rips off so much syntax directly from C, the language that is the ancestor for all of the other languages people use today.
Speaking as someone whose first languages were BASIC and Pascal, both before C became so hugely popular, my feelings about Python's readability were more along the lines of it being different from C. Yes, I knew enough C to recognize where it was being ripped off -- but that's a far cry from the elegance of the language as a whole. -- Aahz (aahz@pythoncraft.com) <*> http://www.pythoncraft.com/ Why is this newsgroup different from all other newsgroups?
Paul Prescod
I'm not saying that Python has to do what the other languages do because they do it, but all else equal,
But all else isn't equal, at least not for me, because I like one better than the other. If I didn't care one way or the other, I might look to another language for a tiebreaker. But I don't need to, because I already know which one I prefer.

Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,          | A citizen of NewZealandCorp, a       |
Christchurch, New Zealand          | wholly-owned subsidiary of USA Inc.  |
greg@cosc.canterbury.ac.nz         +--------------------------------------+
Phillip J. Eby wrote:
On the other hand, *no* syntax proposed so far has been really that nice to look at when used for multiple decorators.
I agree completely here. Add to it that understanding the semantics of applying multiple decorators might not always be easy in case of a problem (which is a very common situation for a programmer). Maybe this hints at thinking about syntaxes that don't necessarily work for multiple decorators but work well for one decorator? After all, the main point of introducing decorator syntax seems to be that decorators should visually come closer to the 'def' statement line, not that multiple decorators are a common problem.

cheers, holger
participants (35)

- "Martin v. Löwis"
- Aahz
- Andrew Bennetts
- Andrew Koenig
- Barry Warsaw
- David Abrahams
- David Ascher
- David Eppstein
- Fred L. Drake, Jr.
- Gareth McCaughan
- Greg Ewing
- Guido van Rossum
- Holger Krekel
- Jack Diederich
- Jeff Kowalczyk
- Jeremy Hylton
- Josiah Carlson
- Ka-Ping Yee
- Kevin Jacobs
- Mark Russell
- Michael Hudson
- Neal Norwitz
- Neil Hodgson
- Paul Prescod
- Phillip J. Eby
- Raymond Hettinger
- Samuele Pedroni
- Shane Hathaway
- Shane Holloway (IEEE)
- Simon Percivall
- Skip Montanaro
- Terry Reedy
- Thomas Heller
- Tim Peters
- Walter Dörwald