Extending @ syntax to allow expressions

Hey everyone,

I came across a case which *might* be a use case for a syntax extension, but I'm not sure. Wanted to get feedback from the group.

*The extension:* Extend the decorator syntax from

    decorator ::= "@" dotted_name ["(" [argument_list [","]] ")"] NEWLINE

to

    decorator ::= "@" expression NEWLINE

where the expression must return a callable that accepts a function of the indicated signature as its argument.

*The motivating case:* I'm using function decorators to define cron jobs in my system, with the rather nice syntax

    @cronJob(parameters)
    def myFunction(standard args)

The decorator registers the function; this causes all sorts of magic to occur, and out the other end of the pipe jobs run in production. It's working very nicely.

However, we'd like to improve the syntax to allow different cron configurations for the job in different production environments. Since the default-plus-overrides model is natural for the actual use cases, a nice way to do this would be to replace the cronJob function with a class whose __init__ and __call__ methods respectively load up the parameters and do the actual work of registration, but which also has an override() method that lets you specify per-environment overrides and then returns self. This would allow a syntax like:

    @CronJob('job-name', params...).override('dev', more-params...)
    def myFunction(...)

However, this is invalid syntax in Python 3.8 because you can't have a general expression in a decorator statement. Instead, the nearest option is

    myJob = CronJob('job-name', params...).override('dev', more-params...)

    @myJob
    def myFunction(...)

which is uglier for no obvious benefit.

*Possible pros and cons:* This extends the existing syntax in a way that's very intuitive w.r.t. the current one -- the docs say "A function definition may be wrapped by one or more decorator expressions. Decorator expressions are evaluated when the function is defined, in the scope that contains the function definition. The result must be a callable, which is invoked with the function object as the only argument. The returned value is bound to the function name instead of the function object. Multiple decorators are applied in nested fashion." It's not obvious from the syntax why the decorator expression *needs* to have this limited form.

It increases consistency by eliminating this one unusual use of dotted expressions as special relative to other expressions. (NB this is the only call to ast_for_dotted_expr in ast.c!) The meaning of the operation remains unambiguous, and is just as accessible to tools like linters, type checkers, and syntax highlighters, since the @ operator simply modifies the succeeding expression.

On the downside, it's more flexible, and so offers more chances for a user to shoot themselves in the foot. And it's somewhat of a corner case, so it's not obvious that the syntax extension is worth it.

What do people think?
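For concreteness, here is a minimal sketch of the class-based decorator being described; the registry, parameter names, and schedule strings are invented for illustration, not taken from the actual system:

    _REGISTRY = {}  # hypothetical registry; the real system does "all sorts of magic"

    class CronJob:
        """Registers a function as a cron job, with per-environment overrides."""

        def __init__(self, name, **params):
            self.name = name
            self.params = params
            self.env_overrides = {}

        def override(self, env, **params):
            # Record environment-specific overrides and return self, so the
            # call can be chained directly inside the decorator expression.
            self.env_overrides[env] = params
            return self

        def __call__(self, func):
            # Registration happens when the decorated function is defined.
            _REGISTRY[self.name] = (func, self.params, self.env_overrides)
            return func

    # Desired spelling, which the 3.8 grammar rejects:
    #
    #   @CronJob('job-name', schedule='0 * * * *').override('dev', schedule='*/5 * * * *')
    #   def myFunction(): ...

    # The nearest legal spelling today:
    myJob = CronJob('job-name', schedule='0 * * * *').override('dev', schedule='*/5 * * * *')

    @myJob
    def myFunction():
        ...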

I think some idea like this might be worth proposing. The first idea that comes to my mind is to allow the name of a decorator to be an f-string using `@'...'` or `@"..."` syntax. If, for example, you have `method_type = 'class'`, then you could decorate a method using `@'{method_type}method'`.

On Mon, Oct 21, 2019, at 22:00, Steve Jorgensen wrote:
I'm not sure if this is a very good example (there's no "normalmethod" to return no decorator, and staticmethod typically needs a different function signature with no self/cls)... and for any nontrivial case I can imagine, it's taken care of by the ability to call a function. For example, for your case you can simply write

    def m(method_type):
        if method_type == 'static':
            return staticmethod
        elif method_type == 'class':
            return classmethod
        elif method_type == 'normal':
            return lambda f: f
        else:
            ...  # do what here? are you extending with additional "foomethod" decorators?

and then do @m(method_type).
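For what it's worth, a complete variant of that dispatcher plus a usage example might look like this; the fallback in the else branch is an assumption, not something settled in the thread:

    def m(method_type):
        if method_type == 'static':
            return staticmethod
        elif method_type == 'class':
            return classmethod
        else:
            return lambda f: f  # assumed fallback: leave the function untouched

    method_type = 'class'  # hypothetical configuration value

    class Example:
        @m(method_type)  # already valid: a dotted name followed by a call
        def make(cls):
            return cls()

    print(Example.make())  # constructs an Example via the classmethod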

On Mon, Oct 21, 2019, at 22:30, Random832 wrote:
Sorry, when posting this, I hadn't seen Yonatan Zunger's original post yet, only this reply. I do see the utility for that suggestion, but not really for this one allowing a decorator to be a string that will be evaluated. (And if you *really* want yours literally, you could simply do @eval(f'{method_type}method'), or something else in case method_type may contain characters that are not part of an identifier.)

I had not thought of that. That actually does work. :) I would say that means there is no need for a new feature, but would it make sense for this idiom to be documented in a PEP or some other easily discoverable place?
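To make the idiom concrete, here is a small sketch (the class and method names are invented) that already parses under the 3.8 grammar, since eval(...) is just a dotted name followed by an argument list:

    method_type = 'static'  # hypothetical configuration value

    class Example:
        @eval(f'{method_type}method')  # evaluates to the builtin staticmethod, which then decorates
        def helper():
            return 42

    print(Example.helper())  # -> 42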

On Mon, Oct 21, 2019 at 8:21 PM Yonatan Zunger <zunger@humu.com> wrote:
Hey everyone,
I came across a case which might be a use case for a syntax extension, but I'm not sure. Wanted to get feedback from the group.
[...]
    @CronJob('job-name', params...).override('dev', more-params...)
    def myFunction(...)
Could you modify the signature of CronJob's __init__ method to accept override parameters as a dict or list in a keyword argument? Then this example would look something like this:

    @CronJob('job-name', params..., dev=more-params...)
    def myFunction(...)

Internally, you could even keep the .override() method and have __init__ call that method to do the work. The signature of __init__ could then either include an explicit keyword argument (defaulting to None) for each environment you need to account for, or, if there are too many, a **kwargs catch-all from which you extract what you need.
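A rough sketch of that shape, with invented environment names (dev and staging stand in for whatever environments actually exist):

    class CronJob:
        """Sketch only: per-environment overrides folded into __init__ keyword arguments."""

        def __init__(self, name, dev=None, staging=None, **params):
            self.name = name
            self.params = params
            self.env_overrides = {}
            # __init__ delegates to the existing override() method.
            if dev is not None:
                self.override('dev', **dev)
            if staging is not None:
                self.override('staging', **staging)

        def override(self, env, **params):
            self.env_overrides[env] = params
            return self

        def __call__(self, func):
            ...  # register func with self.params and self.env_overrides, as before
            return func

    # Valid under the existing grammar:
    @CronJob('job-name', schedule='0 * * * *', dev={'schedule': '*/5 * * * *'})
    def myFunction():
        ...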

On 21Oct2019 17:18, Yonatan Zunger <zunger@humu.com> wrote:
Personally, I have 2 concerns. 1: this introduces a lot of scope for arbitrary complexity after the @, versus the extremely readable "name(...)" form. 2: I'm not sure what this would do to uses of "@" as an operator, as has been suggested various times for various laudable reasons; remember that an @decorator or other function definition is just another statement, and arbitrary expressions are already statements.
Maybe a better form, already supported, might be this:

    @CronJob('job-name', params...)
    @cron_override('dev', more-params...)
    def the_function...

i.e. decorate the decorated function. I would choose to write it like the above, even though the decorators run from inside to out, amounting to having @cron_override prepare an environment-specific decoration and @CronJob just be the "default for an otherwise unspecified environment" decoration. No new syntax needed. And it reads nicely, at least to my eye.

You will probably run into some resistance if there's no case for your syntax which can't be addressed with the nested decoration above (or something equivalent -- the point here isn't that what I've written above is a great thing (though I like it), but that it requires no new syntax).

Cheers,
Cameron Simpson <cs@cskk.id.au>
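One way that nested form could be wired up, assuming it's acceptable to stash the overrides on the function object before registration (the attribute name and registry here are invented for illustration):

    _REGISTRY = {}  # hypothetical registry, as in the earlier sketch

    def cron_override(env, **params):
        """Inner decorator: stash environment-specific overrides on the function."""
        def attach(func):
            func.__dict__.setdefault('_cron_overrides', {})[env] = params
            return func
        return attach

    class CronJob:
        def __init__(self, name, **params):
            self.name = name
            self.params = params

        def __call__(self, func):
            # Outermost decorator: pick up whatever the inner decorators attached.
            _REGISTRY[self.name] = (func, self.params, getattr(func, '_cron_overrides', {}))
            return func

    @CronJob('job-name', schedule='0 * * * *')
    @cron_override('dev', schedule='*/5 * * * *')
    def the_function():
        ...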

On Oct 21, 2019, at 19:53, Cameron Simpson <cs@cskk.id.au> wrote:
I think this was discussed when decorators were first proposed, and the consensus wasn’t “yuck” but rather “let’s see if we need it, and only worry about it when it comes up”. If so, that’s a small point in favor of your proposal, so you might want to search the original discussion.
Personally, I have 2 concerns. 1: this introduces a lot of scope for arbitrary complexity after the @ versus the extremely readable "name(...)" form.
This seems like a consenting-adults thing. Any flexible syntax can be abused to write unreadable code, but unless there’s a good reason to suspect it will be abused often (and/or won’t be used helpfully very often) that’s not much of an argument against it. I don’t think many people would write horrible arbitrary decorator expressions, except for the kind of people who already do horrible unpythonic things like writing a ten-line function as a lambda just to assign it to a variable anyway.
2: I'm not sure what this would do to uses of "@" as an operator, as has been suggested various times for various laudable reasons; remember that an @decorator or other function definition is just another statement, and arbitrary expressions are already statements.
I don’t understand this. @ already exists as an operator, and already takes arbitrary expressions for the left and right operands, with no parser ambiguity. What future worthwhile suggestions to that existing syntax are you imagining that might break that? Even if you’re imagining that people might want @ to be a unary prefix operator as well as a binary operator (like + and -), how does restricting decorator syntax help ambiguity there? Surely you’d want the @ operator to be able to take a dotted-name operand.

On 21Oct2019 20:41, Andrew Barnert <abarnert@yahoo.com> wrote:
Maybe so. I still have personal reluctance to open such a door without a good reason.
2: I'm not sure what this would do to uses of "@" as an operator, as has been suggested various times for various laudable reasons; remember that an @decorator or other function definition is just another statement, and arbitrary expressions are already statements.
I don’t understand this. @ already exists as an operator, and already takes arbitrary expressions for the left and right operands, with no parser ambiguity. What future worthwhile suggestions to that existing syntax are you imagining that might break that?
None. I've not thought it through other than that suddenly arbitrary expressions can occur here where before they could not.
I guess so. My concerns here are looking specious. Cheers, Cameron Simpson <cs@cskk.id.au>

22.10.19 06:41, Andrew Barnert via Python-ideas wrote:
2: I'm not sure what this would do to uses of "@" as an operator, as has been suggested various times for various laudable reasons; remember that an @decorator or other function definition is just another statement, and arbitrary expressions are already statements.
I don’t understand this. @ already exists as an operator, and already takes arbitrary expressions for the left and right operands, with no parser ambiguity. What future worthwhile suggestions to that existing syntax are you imagining that might break that?
There is a difference between

    @deco1@deco2
    def func():

and

    @deco1
    @deco2
    def func():

but in some sense they look similar, and with more complex multiline expressions they can look indistinguishable.
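To make that concern concrete, here is a contrived sketch (the Deco class and its __matmul__ composition are invented) where the two near-identical spellings would parse to different things:

    class Deco:
        """Toy decorator whose @ operator composes two decorators into one."""
        def __init__(self, label):
            self.label = label

        def __matmul__(self, other):
            return Deco(f'{self.label}@{other.label}')

        def __call__(self, func):
            print(f'applying {self.label}')
            return func

    deco1, deco2 = Deco('deco1'), Deco('deco2')

    # Valid today: two stacked decorators, applied inner-first.
    @deco1
    @deco2
    def f(): ...

    # Under the proposed grammar, the one-liner would instead be a single
    # decorator, the expression `deco1 @ deco2`:
    #
    #   @deco1@deco2
    #   def g(): ...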

participants (7): Andrew Barnert, Cameron Simpson, Jonathan Goble, Random832, Serhiy Storchaka, Steve Jorgensen, Yonatan Zunger