Syntax for defining parametric decorators
A common stumbling block for new users is writing decorators that take arguments. To create a decorator used like

    @timesn(n)
    def f(y): ...

we write code like

    def timesn(n):
        def decorator(f):
            def inner(y):
                return n * f(y)
            return inner
        return decorator

which confuses many users and can be a handful to type. I wonder if it would be clearer for people to write

    def timesn(n)(f):
        def inner(y):
            return n * f(y)
        return inner

which is more concise and looks a lot more like a non-parametric decorator someone might have written already. The syntax is mostly self-explaining and could potentially be useful in other contexts.

There exist tools like the decorator library to try to simplify this already, but in my experience they mostly serve to confuse people using decorators for the first time more.

One thing I didn't specify was whether `n` was nonlocal or not and the behavior of something that keeps and reuses timesn(some_specific_n) multiple times.

Does anyone think a feature like this may be useful?

Regards,
Mike
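A minimal end-to-end sketch of the status-quo pattern being discussed, with the decorated function chosen purely for illustration:

    def timesn(n):
        def decorator(f):
            def inner(y):
                return n * f(y)
            return inner
        return decorator

    @timesn(3)
    def f(y):
        return y + 1

    print(f(4))   # 3 * (4 + 1) == 15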
On Sun, Jul 8, 2012 at 4:22 PM, Mike Graham <mikegraham@gmail.com> wrote:
which is more concise and looks a lot more like a non-parametric decorator someone might have written already. The syntax is mostly self-explaining and could potentially be useful in other contexts.
Ooh, +1

Semantically the function and the decorator-params are both "arguments" to the decorator, and this confuses a lot of people into writing silly things like

    def decorator(arg1, arg2, f): ...

In actuality, these two are equivalent, and this new def syntax reflects that:

    @decorator(arg)
    def foo(...): ...

    def foo(...): ...
    foo = decorator(arg)(foo)

As opposed to the usual definition syntax, which is not really intuitive.
One thing I didn't specify was whether `n` was nonlocal or not and the behavior of something that keeps and reuses timesn(some_specific_n) multiple times.
For the purposes of decorators I don't think it matters. I would guess that keeping and reusing the first arg is more useful for non-decorator purposes, and making them a nonlocal is more consistent with the rest of Python, if we reuse the argument. Plus it'd align better semantically with the old way of defining parametric decorators. -- Devin
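A quick runnable check of the equivalence described above; decorator, foo and bar are illustrative stand-ins, not anything from the stdlib:

    def decorator(arg):
        def deco(f):
            def wrapper(*args, **kwargs):
                return arg + f(*args, **kwargs)
            return wrapper
        return deco

    @decorator(10)
    def foo(x):
        return x

    def bar(x):
        return x
    bar = decorator(10)(bar)   # the @ form above is just this, spelled out

    print(foo(1), bar(1))      # 11 11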
On Sun, 8 Jul 2012 16:22:43 -0400 Mike Graham <mikegraham@gmail.com> wrote:
which confuses many users and can be a handful to type. I wonder if it would be clearer for people to write
    def timesn(n)(f):
        def inner(y):
            return n * f(y)
        return inner
But then, why not:

    def timesn(n)(f)(y): return n * f(y)

?

Regards

Antoine.

--
Software development and contracting: http://pro.pitrou.net
Mike Graham wrote:
    def timesn(n)(f):
        def inner(y):
            return n * f(y)
        return inner
+1 from me on this. Scheme has an analogous feature, btw, which is very elegant. I'd love to see this in Python.

As has been pointed out, you can go further and collapse this into

    def timesn(n)(f)(y): return n * f(y)

This means it would also be useful for non-parametric decorators as well, e.g.

    def fivetimes(f)(x): return 5 * f(x)

--
Greg
Step one is to stop referring to decorator factories as decorators. Beyond that, no, this is too limited - it only helps when there's no extra code in the outer scope which, in my experience, is the exception rather than the rule. Closures are an advanced programming concept - there are limits to how simple they are ever going to be. Cheers, Nick. -- Sent from my phone, thus the relative brevity :)
On Sun, Jul 8, 2012 at 6:54 PM, Nick Coghlan <ncoghlan@gmail.com> wrote:
Beyond that, no, this is too limited - it only helps when there's no extra code in the outer scope which, in my experience, is the exception rather than the rule.
I'm not sure I think that's the case. functools.wraps, functools.lru_cache, and reprlib.recursive_repr are the only stdlib decorator factories I could think of, and all of them could use this syntax. So could the ones I could quickly find in my own code. Mike
On Mon, Jul 9, 2012 at 11:11 AM, Mike Graham <mikegraham@gmail.com> wrote:
On Sun, Jul 8, 2012 at 6:54 PM, Nick Coghlan <ncoghlan@gmail.com> wrote:
Beyond that, no, this is too limited - it only helps when there's no extra code in the outer scope which, in my experience, is the exception rather than the rule.
I'm not sure I think that's the case.
- functools.wraps

Not really, as it uses functools.partial, not a nested def (sure you could rewrite it to use a nested def instead, but why?)
functools.lru_cache
No, as it has pre- and post-processing steps around the nested function definition.

- reprlib.recursive_repr

Yes, this one would qualify.

However, the simplest possible complete definition of decorators is as follows:

- a decorator expression (the bit after "@") on a def statement must produce a function decorator
- a decorator expression on a class statement must produce a class decorator
- a function decorator accepts a function and produces another object. This is typically another function (or even the original function for annotation and registration decorators), but may also be a different kind of object such as a descriptor (e.g. property, classmethod, staticmethod) or context manager (e.g. contextlib.contextmanager)
- a class decorator is similar, but accepts a class rather than a function
- a decorator factory is any callable that returns a decorator
- function decorators are often written as functions that return a nested function (1 level of lexical nesting)
- function decorator factories are often written as functions that return a function that returns a function (2 levels of lexical nesting)

Creating a dedicated syntax for the special case of function decorator factories written to use lexical nesting makes the language as a whole *more* complicated rather than less. The only way to actually make them *simpler* would be to eliminate one or more of the bullet points from the above list, and that can't be done while retaining the current functionality.

Yes, it means that to write efficient custom decorator factories you need to understand both the decoration process and lexical closures. That's *OK* - it just means writing custom decorator factories (as opposed to using those written by others) is a moderately advanced metaprogramming technique (it's still a lot simpler than writing a custom metaclass, though).

Some concepts are just hard to get your head around, but that doesn't mean we should be creating dedicated syntax for a very specialised use case.

Cheers,
Nick.

--
Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
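To make the decorator vs. decorator-factory distinction concrete, a minimal sketch (register, repeat and the registry list are illustrative names, not stdlib APIs):

    registry = []

    def register(f):
        # decorator: takes a function, returns it (after a side effect)
        registry.append(f)
        return f

    def repeat(n):
        # decorator factory: takes arguments and returns a decorator
        def decorator(f):
            def inner(*args, **kwargs):
                return [f(*args, **kwargs) for _ in range(n)]
            return inner
        return decorator

    @register
    def greet():
        return "hello"

    @repeat(3)
    def one():
        return 1

    print(one())   # [1, 1, 1]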
On Sun, Jul 8, 2012 at 9:46 PM, Nick Coghlan <ncoghlan@gmail.com> wrote:
However, the simplest possible complete definition of decorators is as follows:
- a decorator expression (the bit after "@") on a def statement must produce a function decorator
- a decorator expression on a class statement must produce a class decorator
- a function decorator accepts a function and produces another object. This is typically another function (or even the original function for annotation and registration decorators), but may also be a different kind of object such as a descriptor (e.g. property, classmethod, staticmethod) or context manager (e.g. contextlib.contextmanager)
- a class decorator is similar, but accepts a class rather than a function
- a decorator factory is any callable that returns a decorator
- function decorators are often written as functions that return a nested function (1 level of lexical nesting)
- function decorator factories are often written as functions that return a function that returns a function (2 levels of lexical nesting)
Mmm, this distinction of "kinds of decorators" is silly. The only requirement for something to be a decorator is that it takes one argument. The type of the argument could be anything.

    def dumb_decorator(x):
        return x + 2

    @dumb_decorator
    @apply
    def foo():
        return 5
Creating a dedicated syntax for the special case of function decorator factories written to use lexical nesting makes the language as a whole *more* complicated rather than less. The only way to actually make them *simpler* would be to eliminate one or more of the bullet points from the above list, and that can't be done while retaining the current functionality.
Nothing about the OP's suggestion was specific to "function decorator factories". Observe a class decorator factory:

    def stupid_augmented_subclass_decorator(**kwargs)(cls):
        class MyClass(cls):
            pass
        MyClass.__dict__.update(**kwargs)
        return MyClass

--
Devin
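For comparison, the same idea written with today's nested-def syntax might look like this sketch (names are illustrative; setattr is used because a class __dict__ is a read-only mappingproxy and cannot be updated directly):

    def augmented_subclass_decorator(**kwargs):
        def decorator(cls):
            class Augmented(cls):
                pass
            # a class __dict__ can't be updated in place, so set attributes instead
            for name, value in kwargs.items():
                setattr(Augmented, name, value)
            return Augmented
        return decorator

    @augmented_subclass_decorator(answer=42)
    class Base:
        pass

    print(Base.answer)   # 42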
On Mon, Jul 9, 2012 at 11:55 AM, Devin Jeanpierre <jeanpierreda@gmail.com> wrote:
Nothing about the OP's suggestion was specific to "function decorator factories".
The sole motivating use case presented was "A common stumbling block for new users is writing decorators that take arguments." It's a terrible motivating use case for curried functions, because they only work for the most trivial of decorator factories. Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
Just another point of data: Scala has exactly what is being proposed, multiple parameter lists as syntactic sugar for nested functions returning functions:

http://stackoverflow.com/questions/4697404/scala-currying-by-nested-function...

It also works with all the non-trivial edge-casey stuff Python has (args, default/named args, varargs, kwargs, splats).

My experience with this is that it actually works pretty well; the number of "function returning function" types which require pre- or post-processing is relatively small, and often this pre/post processing can simply be moved into the body of the function for no real cost.

In the cases where that cannot be done, it is usually because apart from simply returning a new function, I'm also closing over some mutable state in the outer function. In these cases I've found that this implicitly captured mutable state is better handled with an object-based decorator, making the stateful nature explicit. The lru_cache decorator seems like a good candidate for using explicit objects to hold the cache, rather than cunningly enclosed mutable state.

I agree that as a "decorator factory helper", having extra syntax is unwarranted. However, I think that in the general case, being able to define curried functions in one line, a la

    def func(a)(b)(c): return a * b * c

rather than

    def func(a):
        def wrapper_two(b):
            def wrapper_one(c):
                return a * b * c
            return wrapper_one
        return wrapper_two

would in fact be very nice. From the Zen of Python, I would say these are big wins for:

- Flat is better than nested
- Readability counts
- Beautiful is better than ugly

Whether it's a win or loss for "Explicit is better than implicit" is debatable: in the implementation domain, you're being less explicit about what's actually happening (functions returning functions), but in the "how-to-use-me" domain you're being more explicit about what you actually want (call this function with multiple parameter lists).

def f(a)(b)(c) looks much more similar to how you would use it:

    f(1)(2)(3)

than that horrible triple-nested monstrosity above.

-Haoyi
Haoyi Li writes:
However, I think that in the general case, being able to define curried functions in one line, a la [...] def f(a)(b)(c)
looks much more similar to how you would use it:
f(1)(2)(3)
I don't understand. I wouldn't use it that way! I would think if that were common usage, and the curried form only occasional,

    def f(a, b, c): pass

    f(1, 2, 3)

would be more natural in Python (and use functools.partial for the occasional currying). Presumably if you were using this syntax, you'd be doing something more like

    def f(a)(b)(c): pass

    def g(h, x): h(x)

    g(f(1)(2), 3)

which doesn't look very nice to me, nor do I find the def to be a particularly intuitive way of reminding me that the proper usage is a call of a function in curried form (specifically, I would not be reminded of whether the expected curried form is a 1st-, 2nd-, or 3rd-order function).

I also suppose that you wouldn't be able to do

    def f(a)(b)(c): pass

    def g(x, y): pass

    h(f(1))
    h(g)

(ie, without LBYL or EAFP constructs in h, which would simply be throwing the complexity into the caller's backyard). Even though

    def bar(a)(b): pass

    def baz(x): pass

    quux(bar(1))
    quux(baz)

would work fine.

So to me, there may be something in this syntax, but my initial impression is that it's trying to turn Python into something it's not.
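The "occasional currying" mentioned above is already covered by functools.partial; a minimal sketch:

    from functools import partial

    def f(a, b, c):
        return a + b + c

    g = partial(f, 1, 2)   # bind the first two arguments
    print(g(3))            # 6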
On 07/08/2012 10:22 PM, Mike Graham wrote:
A common stumbling block for new users is writing decorators that take arguments. To create a decorator like
    @timesn(n)
    def f(y): ...
We write code like
    def timesn(n):
        def decorator(f):
            def inner(y):
                return n * f(y)
            return inner
        return decorator
which confuses many users and can be a handful to type. I wonder if it would be clearer for people to write
    def timesn(n)(f):
        def inner(y):
            return n * f(y)
        return inner
Why not write this?:

    def timesn(n)(f)(y): return n * f(y)

This would be a currified function. One could implement something like that like this:

    from functools import partial

    def curry(f):
        for i in range(f.func_code.co_argcount - 1):
            f = (lambda f: lambda *args: partial(f, *args))(f)
        return f

    @curry
    def timesn(n, f, y):
        return n * f(y)
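A rough Python 3 adaptation of that helper (using __code__ rather than func_code), shown only as a sketch of how it would behave when used as a decorator factory:

    from functools import partial

    def curry(f):
        # wrap f once per parameter except the last, so each call binds one argument
        for _ in range(f.__code__.co_argcount - 1):
            f = (lambda g: lambda *args: partial(g, *args))(f)
        return f

    @curry
    def timesn(n, f, y):
        return n * f(y)

    @timesn(3)            # binds n=3 and returns a callable expecting f
    def double(y):
        return 2 * y

    print(double(5))      # 3 * (2 * 5) == 30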
which is more concise and looks a lot more like a non-parametric decorator someone might have written already. The syntax is mostly self-explaining and could potentially be useful in other contexts.
There exist tools like the decorator library to try to simplify this already, but in my experience they mostly serve to confuse people using decorators for the first time more.
One thing I didn't specify was whether `n` was nonlocal or not and the behavior of something that keeps and reuses timesn(some_specific_n) multiple times.
Does anyone think a feature like this may be useful?
Regards, Mike
On Sun, Jul 8, 2012 at 7:13 PM, Mathias Panzenböck <grosser.meister.morti@gmx.net> wrote:
This would be a currified function. One could implement something like that like this:
    def curry(f):
        for i in range(f.func_code.co_argcount - 1):
            f = (lambda f: lambda *args: partial(f, *args))(f)
        return f

    @curry
    def timesn(n, f, y):
        return n * f(y)
But then how do you do:

    def linearcomposition(m, c=0)(f)(x): return m * f(x) + c

Python isn't Haskell, so we can't really currify every argument and expect things to work out, esp. because of variadic functions (defaults or even *args and **kwargs).

--
Devin
Mike Graham wrote:
A common stumbling block for new users is writing decorators that take arguments. To create a decorator like
I question that assertion. Have you spent much time on the tutor@python.org mailing list? I have, and I can say that decorators are not something that many new users there are concerned about *at all*, let alone decorators which take arguments.

I believe that decorator factories, and closures in general, are a moderately advanced technique, and not a stumbling block for most new users. I can't say I remember the last time I've seen anyone ask for help writing decorator factories on either the tutor or python-list mailing list. So I believe that new syntax is neither needed nor desirable. -1 on the idea.

Some more issues/problems with the idea:
We write code like
    def timesn(n):
        def decorator(f):
            def inner(y):
                return n * f(y)
            return inner
        return decorator
In my experience, most decorator factories (or factory-factories in general, not just for decorators but any time you need to wrap a factory function in a closure) require at least one pre-processing step, and occasionally one or two post-processing steps as well:

    def timesn(n):
        pre_process()
        def decorator(f):
            pre_process()
            @functools.wraps(f)
            def inner(y):
                return n * f(y)
            post_process()
            return inner
        post_process()
        return decorator

With your suggested syntax, you lose a scope, and it is unclear to me what this would do:

    def timesn(n)(f):
        pre_process()
        @functools.wraps(f)
        def inner(y):
            return n * f(y)
        post_process()
        return inner
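As a concrete illustration of such a pre-processing step, a factory might validate its argument once at decoration time; a hypothetical sketch:

    import functools

    def timesn(n):
        # pre-processing in the factory's scope: check the argument once,
        # when the decorator is created, rather than on every call
        if not isinstance(n, int) or n < 0:
            raise ValueError("n must be a non-negative integer")
        def decorator(f):
            @functools.wraps(f)
            def inner(y):
                return n * f(y)
            return inner
        return decorator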
which confuses many users and can be a handful to type. I wonder if it would be clearer for people to write
    def timesn(n)(f):
        def inner(y):
            return n * f(y)
        return inner
which is more concise and looks a lot more like a non-parametric decorator someone might have written already. The syntax is mostly self-explaining and could potentially be useful in other contexts.
Concise, yes, but I disagree that it is self-explaining. I'm having a lot of difficulty in thinking of how I would explain it to even a moderately experienced user except by expanding it out to the explicit nested function form. I'm not sure I can even explain it to myself.

What will happen when the user invariably writes something like this?

    def ordinary(a)(b):
        return a + b

My guess is that they will get a syntax error, but it will be a syntax error in the wrong place: instead of clearly flagging the error (a)(b) as invalid syntax, as you get now

    py> def ordinary(a)(b):
      File "<stdin>", line 1
        def ordinary(a)(b):
                       ^
    SyntaxError: invalid syntax

Instead, the user will get a less useful syntax error on the following line, where the compiler sees that the function def is not followed immediately by another function def.

    py> def ordinary(a)(b):
    ...     return a + b
      File "<stdin>", line 1
        return a + b
        ^
    SyntaxError: invalid syntax

So whatever clarity you (allegedly) gain when writing decorator factories, you lose when making an error.
There exist tools like the decorator library to try to simplify this already, but in my experience they mostly serve to confuse people using decorators for the first time more.
In my opinion, that's because the decorator and decorator factory concepts are already as simple as they can possibly be. Anything you do to them adds complexity rather than reducing it.

I mean, it's a simple concept: a decorator is a kind of function which returns a function. If you put that inside a third function, you have a function which returns a function which returns a function. Any syntax to disguise that simplicity is only going to make things more complex, not less.

I haven't given any thought to how this will affect class decorators, but I suspect it will run into the same problems.

--
Steven
On Sun, Jul 8, 2012 at 8:52 PM, Steven D'Aprano <steve@pearwood.info> wrote:
Mike Graham wrote:
A common stumbling block for new users is writing decorators that take arguments. To create a decorator like
I question that assertion. Have you spent much time on the tutor@python.org mailing list? I have, and I can say that decorators is not something that many new users there are concerned about *at all*, let alone decorators which take arguments.
I haven't, but I have spend considerable time helping people learn Python. I'm a gold-badge Python answerer on Stack Overflow, an op/regular on the #python IRC channel on freenode, and have taught people in the meatosphere. Decorators is certainly a topic that interests some people as they're still getting to know Python, and my most recent Stack Overflow question was from someone confused about some code using decorators and as recently as this morning I saw someone on #python say they "wanted to learn about decorators" in the abstract. In any event, I didn't mean _people completely brand new to Python_ when I talked about people who were new.
In my experience, most decorator factories (or factory-factories in general, not just for generators but any time you need to wrap a factory function in a closure) require at least one pre-processing step, and occasionally one or two post-processing steps as well:
In surveying actual decorator factories in the stdlib and my own code, I didn't notice it to be the case there was any preprocessing inside the outermost function. Maybe I just found the wrong ones.
Concise, yes, but I disagree that it is self-explaining.
I'm having a lot of difficulty in thinking of how I would explain it to even a moderately experienced user except by expanding it out to the explicit nested function form. I'm not sure I can even explain it to myself.
What will happen when the user invariably writes something like this?
    def ordinary(a)(b):
        return a + b
My guess is that they will get a syntax error, but it will be a syntax error in the wrong place: instead of clearly flagging the error (a)(b) as invalid syntax, as you get now
    py> def ordinary(a)(b):
      File "<stdin>", line 1
        def ordinary(a)(b):
                       ^
    SyntaxError: invalid syntax
Instead, the user will get a less useful syntax error on the following line, where the compiler sees that the function def is not followed immediately by another function def.
    py> def ordinary(a)(b):
    ...     return a + b
      File "<stdin>", line 1
        return a + b
        ^
    SyntaxError: invalid syntax
So whatever clarity you (allegedly) gain when writing decorator factories, you lose when making an error.
I'm glad to get a different perspective here. It was evidently much more self-explanatory to the folks I discussed this with before posting. I wouldn't have expected this to require the def to be immediately followed by another def or for that to be the only (or most-common) case. I would expect that to be proper syntax under this idea, and for ordinary(3)(4) to be 7. I don't know if I said anything to suggest that the syntax should be quite so specialcased, but I hope not. Mike
On Sun, Jul 8, 2012 at 8:52 PM, Steven D'Aprano <steve@pearwood.info> wrote:
Concise, yes, but I disagree that it is self-explaining.
I think the idea is that just as def foo(a): ... can be called as foo(x) to execute the body, def foo(a)(b): ... can be called as foo(x)(y) to execute the body.
I'm having a lot of difficulty in thinking of how I would explain it to even a moderately experienced user except by expanding it out to the explicit nested function form. I'm not sure I can even explain it to myself.
What's wrong with explaining it by expansion? Also, I doubt it's as hard as you claim. Something similar -- curried functions -- have been a staple of functional programming languages since before Python ever existed. This should also help explain it to lots of new users with wider backgrounds.
What will happen when the user invariably writes something like this?
    def ordinary(a)(b):
        return a + b
Surely it'd define a curried function s.t. ordinary(1)(2) == 3? -- Devin
On Mon, Jul 9, 2012 at 11:45 AM, Devin Jeanpierre <jeanpierreda@gmail.com> wrote:
I'm having a lot of difficulty in thinking of how I would explain it to even a moderately experienced user except by expanding it out to the explicit nested function form. I'm not sure I can even explain it to myself.
What's wrong with explaining it by expansion?
Also, I doubt it's as hard as you claim. Something similar -- curried functions -- have been a staple of functional programming languages since before Python ever existed. This should also help explain it to lots of new users with wider backgrounds.
OK, I think it would be *much* better to approach the problem from that angle.

    def incremental(x)(f)(y):
        return x + f(y)

As equivalent to:

    # implied names are not legal identifiers - the compiler gets to do that
    # because of its privileged role in naming things
    def incremental(x):
        def <incremental:1>(f):
            def <incremental:2>(y):
                return x + f(y)
            return <incremental:2>
        return <incremental:1>

I'm still not convinced of the general applicability (since it breaks down as soon as you want to decorate or otherwise pre- or post-process any of the steps, thus forcing people to learn the full "callable-returning-a-callable" idiom anyway), but it's probably worth writing up as a PEP in the 3.4 timeframe.

Cheers,
Nick.

--
Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
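The same expansion is runnable today with ordinary inner names; a sketch with illustrative names standing in for the compiler-generated ones:

    def incremental(x):
        def bind_f(f):
            def bind_y(y):
                return x + f(y)
            return bind_y
        return bind_f

    # each call peels off one parameter list
    g = incremental(1)(lambda y: 2 * y)
    print(g(5))   # 1 + 2*5 == 11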
Mike Graham, 08.07.2012 22:22:
A common stumbling block for new users is writing decorators that take arguments. To create a decorator like
    @timesn(n)
    def f(y): ...
We write code like
    def timesn(n):
        def decorator(f):
            def inner(y):
                return n * f(y)
            return inner
        return decorator
which confuses many users and can be a handful to type. I wonder if it would be clearer for people to write
    def timesn(n)(f):
        def inner(y):
            return n * f(y)
        return inner
which is more concise and looks a lot more like a non-parametric decorator someone might have written already. The syntax is mostly self-explaining and could potentially be useful in other contexts.
From an innocent look, I have no idea what the syntax is supposed to mean. Clearly doesn't hint at a factory for me.
Stefan
On Mon, 09 Jul 2012 11:47:48 +0200 Stefan Behnel <stefan_ml@behnel.de> wrote:
Mike Graham, 08.07.2012 22:22:
    def timesn(n)(f):
        def inner(y):
            return n * f(y)
        return inner
which is more concise and looks a lot more like a non-parametric decorator someone might have written already. The syntax is mostly self-explaining and could potentially be useful in other contexts.
From an innocent look, I have no idea what the syntax is supposed to mean. Clearly doesn't hint at a factory for me.
Does to me. timesn is clearly a function of one argument that is going to return a callable object that takes one argument. So:

    x5 = timesn(5)
    print x5(7)

should print 35.

I'd be interested in seeing an implementation of partial that used this mechanism. Should be straightforward, but I'm in the midst of a crunch.

<mike

--
Mike Meyer <mwm@mired.org>  http://www.mired.org/
Independent Software developer/SCM consultant, email for more information.
O< ascii ribbon campaign - stop html mail - www.asciiribbon.org
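With current syntax, a closure-based partial (ignoring keyword arguments) is only a few lines; this is just an illustrative sketch, not a claim about what the proposed syntax would generate:

    def my_partial(func, *bound):
        # closure capturing func and the already-supplied positional arguments
        def call(*rest):
            return func(*(bound + rest))
        return call

    x5 = my_partial(lambda n, y: n * y, 5)
    print(x5(7))   # 35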
On Mon, Jul 9, 2012 at 7:47 PM, Stefan Behnel <stefan_ml@behnel.de> wrote:
From an innocent look, I have no idea what the syntax is supposed to mean. Clearly doesn't hint at a factory for me.
I should also mention that I have a different proposal that affects the way one would write functions-that-return-functions. I've been messing around with the idea of statement local namespaces for years (see PEP 3150) trying to find something that I consider better than the status quo, and PEP 403's statement local function and class definitions (http://www.python.org/dev/peps/pep-0403/) are the current incarnation.

With those, the number of statements in a simple wrapping decorator factory doesn't change, but the return statements can be moved above their respective function definitions:

    def notify_on_call(callback, *cb_args, **cb_kwds):
        in return decorator
        def decorator(f):
            in return wrapped
            @functools.wraps(f)
            def wrapped(*args, **kwds):
                callback(*cb_args, **cb_kwds)
                return f(*args, **kwds)

Rather than the current out-of-order:

    def notify_on_call(callback, *cb_args, **cb_kwds):
        def decorator(f):
            @functools.wraps(f)
            def wrapped(*args, **kwds):
                callback(*cb_args, **cb_kwds)
                return f(*args, **kwds)
            return wrapped
        return decorator

(Note: I haven't updated the PEP in a while, so it currently still disallows combining the in statement with decorators - I think that's a mistake, and will be updating it some time post 3.3)

Cheers,
Nick.

--
Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
Nick Coghlan wrote:
On Mon, Jul 9, 2012 at 7:47 PM, Stefan Behnel <stefan_ml@behnel.de> wrote:
From an innocent look, I have no idea what the syntax is supposed to mean. Clearly doesn't hint at a factory for me.
I should also mention that I have a different proposal that affects the way one would write functions-that-returns-functions. I've been messing around with the idea of statement local namespaces for years (see PEP 3150) trying to find something that I consider better than the status quo, and PEP 403's statement local function and class definitions (http://www.python.org/dev/peps/pep-0403/) are the current incarnation.
With those, the number of statements in a simple wrapping decorator factory doesn't change, but the return statements can be moved above their respective function definitions:
    def notify_on_call(callback, *cb_args, **cb_kwds):
        in return decorator
        def decorator(f):
            in return wrapped
            @functools.wraps(f)
            def wrapped(*args, **kwds):
                callback(*cb_args, **cb_kwds)
                return f(*args, **kwds)
I *really* don't like the way that the "in return" statement reads like it creates a new block, but the following lines are not indented. If I were writing this as pseudo-code for a human reader, I would surely indent the following lines. Even if the part following the "in return" were limited to a single expression, I would prefer to indent it if it appears on another line. Using lambda as an example:

    # Best
    lambda x: expression

    # Acceptable
    lambda x:\
        expression

    # Unacceptable
    lambda x:\
    expression

Looking back at "in return", here's a simple example which doesn't use nested functions. Compare:

    def function(arg):
        in return value
        value = process(arg)
        print("Value is", value)

versus:

    def function(arg, verbose=False):
        in return value:
            value = process(arg)
            print("Value is", value)

The lack of indentation (and trailing colon) makes the first extremely unpythonic -- everything else that creates a new block is indented. Consequently I hate the first one and am merely cold to the second.
Rather than the current out-of-order:
    def notify_on_call(callback, *cb_args, **cb_kwds):
        def decorator(f):
            @functools.wraps(f)
            def wrapped(*args, **kwds):
                callback(*cb_args, **cb_kwds)
                return f(*args, **kwds)
            return wrapped
        return decorator
I would not describe that as "out-of-order". Seems to me that it is precisely in order: first you create the object (a function), then you return it. You can't return something before it exists.

It seems to me that this proposal, and the older PEP 3150, are the ones which are out-of-order: you use things before they are defined.

For what it's worth, I have slightly warmed to PEP 3150 and would give it a very tentative +0.125:

    def notify_on_call(callback, *cb_args, **cb_kwds):
        return decorator given:
            def decorator(f):
                @functools.wraps(f)
                def wrapped(*args, **kwds):
                    callback(*cb_args, **cb_kwds)
                    return f(*args, **kwds)
                return wrapped

I don't think the above is any improvement at all on the status quo, but 3150 would allow you to write maths expressions more mathematically:

    def func(x):
        return a**2 + 3*a given:
            a = 1/(sin(x*pi)) + (cos(x)-1)/2

which I'm not sure will help people used to reading code, but it should at least be familiar territory to mathematicians and maths geeks.

(Aside: I actually prefer that bikeshed to be called "where" rather than "given".)

--
Steven
On Thu, Jul 12, 2012 at 11:37 AM, Steven D'Aprano <steve@pearwood.info> wrote:
Looking back at "in return", here's a simple example which doesn't use nested functions. Compare:
    def function(arg):
        in return value
        value = process(arg)
        print("Value is", value)
Like a decorator line, the "in" clause is effectively part of the subsequent function/class definition - it's not a separate statement. That's why there's no ending colon. However, I'm now thinking a leading @ might be appropriate in order to make the decorator parallel crystal clear.
I would not describe that as "out-of-order". Seems to me that it is precisely in order: first you create the object (a function), then you return it. You can't return something before it exists.
No, that's implementation order - it's not "pattern of thought" order. It's why English permits phrases like "this statement is true, given that this particular assumption is true". You can forward reference incidental details and fill them in later.

Sorting is generally a better example:

    x = sorted(container)

OK, this is sorted smallest to largest. Now, I want to sort according to the second item. I can do this out of order by introducing a key function before it's clear why I need it:

    def _key(v):
        return v[1]
    x = sorted(container, key=_key)

Or, we can use the awkward itemgetter construct to get things back in the right order and restore the assignment and sorting operation to its rightful place of prominence:

    from operator import itemgetter
    x = sorted(container, key=itemgetter(1))

Using a lambda expression achieves the same goal:

    x = sorted(container, key=lambda v: v[1])

Those are both ugly though, and are restricted to a single expression. More complicated sorting logic is forced back into using the out of order form.

PEP 403 is designed to let you pull the sort key out into a trailing full-fledged function definition:

    @in x = sorted(container, key=f)
    def f(v):
        return v[1]

The important part here is that the container is being sorted and the result assigned to 'x' - the exact sort key used isn't important to the overall flow of the algorithm.
It seems to me that this proposal, and the older PEP 3150, are the ones which are out-of-order: you use things before they are defined.
Yes, both PEP 403 and 3150 are out of order from a code *execution* point of view - but they're designed to match *patterns of thought* that work that way. It's especially prevalent in callback based programming - you end up with a lot of one-shot functions that aren't actually all that important in their own right, what really matters is the call you're handing them over to. Sometimes you're only defining a separate function because that's the only syntax Python has for passing a piece of code to a function, even though the code is only used once. Decorators eliminated this problem for the "f = deco(f)" and "C = deco(C)" cases by baking the ability to make such calls into function and class definitions. PEP 403 is solely about extending that same capability to arbitrary simple statements.
For what it's worth, I have slightly warmed to PEP 3150 and would give it a very tentative +0.125:
PEP 3150 died mainly due to "If the implementation is hard to explain, it's a bad idea", but also due to the fact that you regularly ended up with double-indents for callback based code. PEP 403 only has to support a statement-local reference to a single name, which is comparatively easy.
(Aside: I actually prefer that bikeshed to be called "where" rather than "given".)
PEP 3150 has an entire section devoted to the reasons "where" isn't a viable choice (not that that PEP stands any realistic chance of ever being resurrected). Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
The way I understand this proposal is similar to how MATLAB defines functions where the names of the variables that are returned are in/near the function declaration. It's useful on the occasions where you decide another variable is needed by the caller. Also, it's nice to have consistency in whatever's returned. The "return" keyword would be strange though. There are a few unobvious edge cases here. So I'd have to say -0.1 Yuval
Nick Coghlan writes:
On Thu, Jul 12, 2012 at 11:37 AM, Steven D'Aprano <steve@pearwood.info> wrote:
Looking back at "in return", here's a simple example which doesn't use nested functions. Compare:
    def function(arg):
        in return value
        value = process(arg)
        print("Value is", value)
Like a decorator line, the "in" clause is effectively part of the subsequent function/class definition - it's not a separate statement. That's why there's no ending colon. However, I'm now thinking a leading @ might be appropriate in order to make the decorator parallel crystal clear.
Indeed I would prefer the leading "@", because the in clause (syntactically) "decorates" the following definition. But isn't Steven's example just plain invalid on two counts? That is, in the PEP (1) the "in" modifies a function or class definition, not an arbitrary statement, and (2) it does not have suite scope, it has statement scope. Another way of expressing (2) is that the print statement above is dead code. With respect to the idea itself, I'm a little concerned that in the case of the decorator, the idea is to deemphasize the decoration in favor of focusing on the decorated definition, while in the case of the "in" clause, the idea is to deemphasize the definition in favor of focusing on the "decorating" statement. Perhaps in practice this won't be a problem, though. For the "in" clause, the order of presentation will provide the emphasis on the "decorating" statement. OTOH, for the decorators, the order of presentation is no big deal compared to the avoidance of DRY violation.
On Thu, Jul 12, 2012 at 7:32 PM, Stephen J. Turnbull <stephen@xemacs.org> wrote:
With respect to the idea itself, I'm a little concerned that in the case of the decorator, the idea is to deemphasize the decoration in favor of focusing on the decorated definition, while in the case of the "in" clause, the idea is to deemphasize the definition in favor of focusing on the "decorating" statement.
There are many reasons PEP 403 is something I pick up and improve as the urge strikes me rather than something I treat with any urgency :) Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
On Sun, Jul 8, 2012 at 4:22 PM, Mike Graham <mikegraham@gmail.com> wrote:
A common stumbling block for new users is writing decorators that take arguments. To create a decorator like
    @timesn(n)
    def f(y): ...
We write code like
    def timesn(n):
        def decorator(f):
            def inner(y):
                return n * f(y)
            return inner
        return decorator
which confuses many users and can be a handful to type. I wonder if it would be clearer for people to write
    def timesn(n)(f):
        def inner(y):
            return n * f(y)
        return inner
which is more concise and looks a lot more like a non-parametric decorator someone might have written already. The syntax is mostly self-explaining and could potentially be useful in other contexts.
There exist tools like the decorator library to try to simplify this already, but in my experience they mostly serve to confuse people using decorators for the first time more.
One thing I didn't specify was whether `n` was nonlocal or not and the behavior of something that keeps and reuses timesn(some_specific_n) multiple times.
Does anyone think a feature like this may be useful?
While I'm a bit taken aback by the syntax, I recognize this is mostly because I'm not used to it. Objectively, it makes enough sense that I think it fits. I think even if the only use case is decorator factories, they are a useful enough thing that this is okay. We have a specific syntax to support using decorators, why not better support writing them? Still, I'd like to see any ideas people have for alternative uses for this syntax.
Regards,
Mike
-- Read my blog! I depend on your acceptance of my opinion! I am interesting! http://techblog.ironfroggy.com/ Follow me if you're into that sort of thing: http://www.twitter.com/ironfroggy
On 9 July 2012 09:28, Calvin Spealman <ironfroggy@gmail.com> wrote:
On Sun, Jul 8, 2012 at 4:22 PM, Mike Graham <mikegraham@gmail.com> wrote:
A common stumbling block for new users is writing decorators that take arguments. To create a decorator like
    @timesn(n)
    def f(y): ...
-1 for the new syntax, since a decorator for trivial decorator factories (rather, parameterized decorators) can be written in a total of 5 lines:

    from functools import partial

    def parameterized(decorator):
        def part_decorator(*args, **kw):
            return partial(decorator, *args, **kw)
        return part_decorator
    >>> @parameterized
    ... def times(n, func):
    ...     def new_func(*args, **kw):
    ...         return n * func(*args, **kw)
    ...     return new_func
    ...
    >>> @times(3)
    ... def add(x, y):
    ...     return x + y
    ...
    >>> add(1, 1)
    6
For more complex cases, requiring pre processing, post processing, and so on, the normal syntax can cut it. And, certainly, a more complex such decorator could be written, so that it accepts some arguments itself. (I would be +1 for such a decorator in functools) js -><-
On 7/8/2012 4:22 PM, Mike Graham wrote:
A common stumbling block for new users is writing decorators that take arguments. To create a decorator like
    @timesn(n)
    def f(y): ...
We write code like
    def timesn(n):
        def decorator(f):
            def inner(y):
                return n * f(y)
            return inner
        return decorator
which confuses many users and can be a handful to type. I wonder if it would be clearer for people to write
    def timesn(n)(f):
        def inner(y):
            return n * f(y)
        return inner
which is more concise and looks a lot more like a non-parametric decorator someone might have written already. The syntax is mostly self-explaining and could potentially be useful in other contexts.
As others have noted, this is a proposal for introducing optional automatic function currying to Python. Parametric decorators are an obvious use case, because they require double function nesting, but neither the interpreter nor the function, such as timesn, would know or care how the function will be used. If you are really concerned specifically about decorator expressions, I think someone's suggestion of a new decorator function in functools would be the better way to go.

For people 'raised' with imperative languages without nested functions and currying, function currying is definitely *not* 'self-explaining'. The difficulty is conceptual, not syntactical. I am 99% sure it would result in more confusion, not less.
There exist tools like the decorator library to try to simplify this already, but in my experience they mostly serve to confuse people using decorators for the first time more.
One thing I didn't specify was whether `n` was nonlocal or not and the behavior of something that keeps and reuses timesn(some_specific_n) multiple times.
Does anyone think a feature like this may be useful?
Just about every proposal would be 'useful' to someone in some situations. Is it useful enough to justify the costs? My brief answer is that the syntactic sugar of automatic currying only applies to a few situations and does not add any new functionality. It would be another thing to learn, which most people would not get until they learned the more general explicit nesting. For some people, say from always-curried languages, it might be an attractive nuisance in that they would use it and incur the added overhead when it is pointless. -- Terry Jan Reedy
On Jul 11, 2012, at 7:19 PM, Terry Reedy wrote:
For people 'raised' with imperative languages without nested functions and currying, function currying is definitely *not* 'self-explaining'. The difficulty is conceptual, not syntactical. I am 99% sure it would result in more confusion, not less.
FWIW, I've taught many Python classes this year and have found the current style easy to teach. I agree with Terry's assessment that introducing new syntactic trickery would cause more problems than it would solve. I also agree with Nick's sentiment that we would be better off with a vocabulary that clearly distinguishes between decorators and decorator factories.

Raymond
On Wed, Jul 11, 2012 at 10:19:01PM -0400, Terry Reedy wrote:
For people 'raised' with imperative languages without nested functions and currying, function currying is definitely *not* 'self-explaining'. The difficulty is conceptual, not syntactical. I am 99% sure it would result in more confusion, not less.
I was raised on Pascal, Fortran and Hypertalk, with some Forth-like languages (Forth and HP's RPL), none of which are functional, and I had never even heard of currying until *long* after starting with Python. But the idea of a function creating and returning a function was easy to understand once it was suggested to me. Once you have the right language to talk about a concept, some pretty complex concepts become simple.

I suspect that most of the difficulty people have is because they don't clearly distinguish between a decorator and a decorator-factory. Once you make the conceptual leap to a function making a function (a decorator, which is a kind of function-factory), it is easy to think about a decorator-factory, a decorator-factory-factory, and so on. But if you don't cleanly distinguish them, in my opinion it becomes confusing and unclear whether you need one inner function or two.

Of course, the book-keeping needed to make this all work is not necessarily trivial, but it's just book-keeping. New syntax won't make that easier.

Singling out decorator-factories for special treatment is, in my opinion, a no-go. If this idea is to be sold, decorator-factories may be given as just one use-case for syntax for currying. But frankly, I don't think we need it, or should want it.

Python is not Haskell, and while it's great to borrow ideas from Haskell (e.g. list comprehensions), it is notable that Python hasn't used the exact same syntax as Haskell. Compare:

    [toUpper c | c <- s]      -- Haskell
    [c.upper() for c in s]    # Python

and tell me that "for c in s" isn't more readable than "| c <- s".

I think syntax for currying

    def func(a)(b)(c)(d):
        ...

    f = func(1)(2)(3)
    f(4)

is too Haskell-ish for my taste. I prefer the Python way:

    def func(a, b, c, d):
        ...

    f = functools.partial(func, 1, 2, 3)
    f(4)

even though it is less concise. I prefer a stronger tool set in functools, including a way to use partial on arguments from the right.

Perhaps there is a clean API for simplifying decorator factories. That could start as a published recipe on (say) ActiveState, and then move to functools if there was demand for it. But as usual, the bar to justify new syntax is rightly set much higher than the bar to justify new functionality.

--
Steven
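A sketch of the kind of "partial on arguments from the right" helper mentioned above; rpartial is an illustrative name, not an existing functools function:

    def rpartial(func, *bound):
        # bind the rightmost positional arguments of func
        def wrapper(*args, **kwargs):
            return func(*(args + bound), **kwargs)
        return wrapper

    square = rpartial(pow, 2)   # binds the exponent
    print(square(5))            # pow(5, 2) == 25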
On Thu, Jul 12, 2012 at 5:05 PM, Steven D'Aprano <steve@pearwood.info> wrote:
I prefer a stronger tool set in functools, including a way to use partial on arguments from the right.
FWIW, my hope is that the provision of PEP 362 in Python 3.3 will encourage third parties to explore some richer options for manipulating callables now that it's much easier to get a clear definition of their signature. Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
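PEP 362's signature objects arrived as inspect.signature in Python 3.3; a small sketch of the kind of introspection it enables:

    import inspect

    def timesn(n, f, y):
        return n * f(y)

    sig = inspect.signature(timesn)
    print(list(sig.parameters))     # ['n', 'f', 'y']
    bound = sig.bind_partial(3)     # binds n=3, leaving f and y unbound
    print(bound.arguments)          # maps 'n' to 3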
participants (16)
- Antoine Pitrou
- Calvin Spealman
- Devin Jeanpierre
- Greg Ewing
- Haoyi Li
- Joao S. O. Bueno
- Mathias Panzenböck
- Mike Graham
- Mike Meyer
- Nick Coghlan
- Raymond Hettinger
- Stefan Behnel
- Stephen J. Turnbull
- Steven D'Aprano
- Terry Reedy
- Yuval Greenfield