Paul Moore wrote:
Personally, I'd rather see partial as it stands, with its current limitations, included. The alternative seems to be a potentially long discussion, petering out without conclusion, and the whole thing missing Python 2.5. (I know that's a long way off, but this already happened with 2.4...)
-0 My preference is that it not go in as-is. It is better to teach how to write a closure than to introduce a new construct that has its own problems and doesn't provide a real improvement over what we have now. Despite my enthusiasm for functional programming and the ideas in the PEP, I find the PFA implementation annoying. I had tried out the implementation that was being pushed for Py2.4 and found it wanting. Hopefully, it has improved since then. Here are a few thoughts based on trying to apply it to my existing code.
* The PFA implementation proposed for Py2.4 ran slower than an equivalent closure. If the latest implementation offers better performance, then that may be a reason for having it around.
* Having PFA only handle the first argument was a PITA with Python. In Haskell and ML, function signatures seem to have been designed with argument ordering better suited for left curries. In contrast, Python functions seem to have adverse argument ordering, where the variables you want to freeze appear toward the right. This is subjective and may just reflect the way I was aspiring to use "partial" to freeze options and flags rather than the first argument of a binary operator. Still, I found closures to be more flexible in that they could handle any argument pattern and could freeze more than one variable or keyword at a time.
* The instance method limitation never came up for me. However, it bites to have a tool working in a way that doesn't match your mental model. We have to document the limitations, keep them in mind while programming, and hope to remember them as possible causes if bugs ever arise. It would be great if these limitations could get ironed out.
* Using the word "partial" instead of "lambda" traded one bit of unreadability for another. The "lambda" form was often better because it didn't abstract away its mechanism and because it supported more general expressions.
I found that desk-checking code was harder because I had to mentally undo the abstraction to check that the argument signature was correct.
* It is not clear that the proposed implementation achieves one of the principal benefits laid out in the PEP: "I agree that lambda is usually good enough, just not always. And I want the possibility of useful introspection and subclassing."
If we get a better implementation, it would be nice if the PEP were updated with better examples. The TkInter example is weak because we often want to set multiple defaults at the same time (foreground, background, textsize, etc.) and often those values are config options rather than hardwired constants. Raymond
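The "config options rather than hardwired constants" point can be sketched in a few lines: a closure reads the current option defaults at call time, whereas a partial() built earlier has frozen them. The names here (config, make_widget, themed_widget) are illustrative only, not from any real module.

```python
# Sketch of the pattern Raymond describes: several option defaults at
# once, read from configuration at call time rather than frozen as
# constants.  All names are illustrative, not from any real module.

config = {'foreground': 'red', 'background': 'white'}

def make_widget(text, **options):
    return (text, sorted(options.items()))

# Closure version: picks up the *current* config on every call.
def themed_widget(text, **kw):
    opts = dict(config)
    opts.update(kw)      # call-time options win over config defaults
    return make_widget(text, **opts)

assert themed_widget('OK') == ('OK', [('background', 'white'),
                                      ('foreground', 'red')])

# A partial() built while foreground was 'red' would keep it frozen;
# the closure sees the updated value:
config['foreground'] = 'blue'
assert themed_widget('OK')[1] == [('background', 'white'),
                                  ('foreground', 'blue')]
```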
On Sat, 26 Feb 2005 13:20:46 -0500, Raymond Hettinger
It is better to teach how to write a closure than to introduce a new construct that has its own problems and doesn't provide a real improvement over what we have now.
You make some good points. But this all reminds me of the discussion over itemgetter/attrgetter. They also special-case particular uses of lambda, and in those cases the stated benefits were speed and (arguably) readability (I still dislike the names, personally). I think partial hits a similar spot - it covers a fair number of common cases, and the C implementation is quoted as providing a speed advantage over lambda. On the minus side, I'm not sure it covers as many uses as {item,attr}getter, but on the plus side, I like the name better :-) Seriously, not needing to explicitly handle *args and **kw is a genuine readability benefit of partial. Of course, optimising Python function calls, and optimising lambda to death, would remove the need for any of these discussions. But there's no real indication that this is likely in the short term... This got me thinking, so I did a quick experiment:
python -m timeit -s "from operator import itemgetter; l=range(8)" "itemgetter(1)(l)" 1000000 loops, best of 3: 0.548 usec per loop
python -m timeit -s "l=range(8)" "(lambda x:x[1])(l)" 1000000 loops, best of 3: 0.597 usec per loop
That's far less of a difference than I expected from itemgetter! The quoted speed improvement in the C implementation of partial is far better... So I got worried, and tried a similar experiment with the C implementation of the functional module:
python -m timeit -s "import t" "t.partial(t.f, 1, 2, 3, a=4, b=5)(6, 7, 8, c=9, d=10)" 100000 loops, best of 3: 3.91 usec per loop
python -m timeit -s "import t" "(lambda *args, **kw: t.f(1, 2, 3, a=4, b=5, *args, **kw))(6, 7, 8, c=9, d=10)" 100000 loops, best of 3: 3.6 usec per loop
[Here, t is just a helper which imports partial, and defines f as def f(*args, **kw): return (args, kw)] Now I wonder. Are my tests invalid, did lambda get faster, or is the "lambda is slow" argument a myth? Hmm, I'm starting to go round in circles here. I'll post this as it stands, with apologies if it's incoherent. Blame it on a stinking cold :-( Paul.
But this all reminds me of the discussion over itemgetter/attrgetter. They also special-case particular uses of lambda, and in those cases the stated benefits were speed and (arguably) readability (I still dislike the names, personally).
I wouldn't use those as justification for partial(). The names suck and the speed-up is small. They were directed at a specific and recurring use case related to key= arguments.
I think partial hits a similar spot - it covers a fair number of common cases,
Are you sure about that? Contriving examples is easy, but download a few modules, scan them for use cases, and you may find, as I did, that partial() rarely applies. The argument order tends to be problematic. Grepping through the standard library yields no favorable examples. In inspect.py, you could replace "formatvarkw=lambda name: '**' + name" with "partial(operator.add, '**')" but that would not be an improvement. Looking through the builtin functions also provides a clue:
cmp(x,y)        # partial(cmp, refobject) may be useful.
coerce(x,y)     # not suitable for partial().
divmod(x,y)     # we would want a right curry.
filter(p,s)     # partial(filter, p) might be useful.
getattr(o,n,d)  # we would want a right curry.
hasattr(o,n)    # we would want a right curry.
int(x,b)        # we would want a right curry.
isinstance(o,c) # we would want a right curry.
issubclass(a,b) # we would want a right curry.
iter(o,s)       # we would want a right curry.
long(x,b)       # we would want a right curry.
map(f,s)        # partial(map, f) may be useful.
pow(x,y,z)      # more likely to want to freeze y or z.
range([a],b,[c])# not a good candidate.
reduce(f,s,[i]) # could work for operator.add and .mul
round(x, n)     # we would want a right curry.
setattr(o,n,v)  # more likely to want to freeze n.
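The recurring "we would want a right curry" entries above could be served by a right-binding variant. A minimal sketch follows; rpartial is a hypothetical name, not part of PEP 309 or the standard library.

```python
def rpartial(fn, *frozen):
    """Hypothetical right curry: freeze the rightmost positional
    arguments, leaving the earlier ones for call time."""
    def inner(*args, **kw):
        return fn(*(args + frozen), **kw)
    return inner

# Covers several of the cases in the list above:
hex_to_int = rpartial(int, 16)        # int(x, b) with b frozen
assert hex_to_int('ff') == 255

is_str = rpartial(isinstance, str)    # isinstance(o, c) with c frozen
assert is_str('abc') and not is_str(3)

round2 = rpartial(round, 2)           # round(x, n) with n frozen
assert round2(3.14159) == 3.14
```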
the C implementation is quoted as providing a speed advantage over lambda.
Your recent timings and my old timings show otherwise.
Seriously, not needing to explicitly handle *args and **kw is a genuine readability benefit of partial.
I hope that is not the only real use case. How often do you need to curry a function with lots of positional and keyword arguments? Even when it does arise, it may be a code smell indicating that subclassing ought to be used.
Now I wonder. Are my tests invalid, did lambda get faster, or is the "lambda is slow" argument a myth?
The test results are similar to what I got when I tested the version proposed for Py2.4. The lambda version will win by an even greater margin if you put it on an equal footing by factoring out the attribute lookup with something like f=t.f. Calling Python functions (whether defined with lambda or def) is slower than C function calls because of the time to set up the stack frame. While partial() saves that cost, it has to spend some time building the new argument tuple and forwarding the call. Your timings show that to be a net loss. Sidenote: some C methods with exactly zero or one argument have optimized paths that save time spent constructing, passing, and unpacking an argument tuple. Since partial() is aimed at multi-arg functions, that part of "lambda is slower" is not relevant to the comparison.
Hmm, I'm starting to go round in circles here.
I also wish that partial() ran faster than closures, that it didn't have limitations, and that it applied in more situations. C'est la vie. Raymond
On Sat, 26 Feb 2005 19:26:11 -0500, Raymond Hettinger
Are you sure about that? Contriving examples is easy, but download a few modules, scan them for use cases, and you may find, as I did, that partial() rarely applies. The argument order tends to be problematic.
Grepping through the standard library yields no favorable examples.
I also didn't find many the last time I looked through: http://mail.python.org/pipermail/python-list/2004-December/257990.html
In inspect.py, you could replace "formatvarkw=lambda name: '**' + name" with "partial(operator.add, '**')" but that would not be an improvement.
Yeah, I remember thinking that the nicer way to write this was probably formatvarkw='**%s'.__mod__ Steve -- You can wordify anything if you just verb it. --- Bucky Katt, Get Fuzzy
Raymond Hettinger wrote:
Are you sure about that? Contriving examples is easy, but download a few modules, scan them for use cases, and you may find, as I did, that partial() rarely applies. The argument order tends to be problematic.
So would you like to see the decision to accept PEP 309 reverted? Regards, Martin
On Sun, 27 Feb 2005 09:31:26 +0100, "Martin v. Löwis"
Raymond Hettinger wrote:
Are you sure about that? Contriving examples is easy, but download a few modules, scan them for use cases, and you may find, as I did, that partial() rarely applies. The argument order tends to be problematic.
So would you like to see the decision to accept PEP 309 reverted?
Either revert the decision, or apply the patch. I don't feel comfortable advocating that the decision be reverted - according to the CVS log, PEP 309 was accepted by Guido on 31 March 2004, so I don't think it's for me to argue against that decision... (I've already stated my position - I don't have any problem with the function as it stands, and speed is not crucial to me so I have no preference between the C and Python implementations. But it's not the end of the world whatever happens...). Paul.
Are you sure about that? Contriving examples is easy, but download a few modules, scan them for use cases, and you may find, as I did, that partial() rarely applies. The argument order tends to be problematic.
So would you like to see the decision to accept PEP 309 reverted?
I would like for the principal advocates to reach a consensus that the proposed implementation is a winner. Ideally, that decision should be informed by trying it out on their own, real code and seeing whether it offers genuine improvements. Along the way, they should assess whether it is as applicable as expected, whether the existing limitations are problematic, and whether performance is an issue. All four issues are in question. My concern is that with Guido having approved the idea in abstract form, the actual implementation has escaped scrutiny. Also, if the API is different from the PEP, acceptance should not be automatic. If functional.partial() isn't a clear winner, it may be reasonable to ask that it be released in the wild and evolve before being solidified in the standard library. My sense is that the current implementation is far from its highest state of evolution. Raymond
Raymond Hettinger wrote:
I would like for the principal advocates to reach a consensus that the proposed implementation is a winner.
That I cannot understand. Do you want the advocates to verify that the implementation conforms to the specification? or that the implementation of the PEP is faster than any other existing implementation of the PEP? These two hold, I believe.
Ideally, that decision should be informed by trying it out on their own, real code and seeing whether it offers genuine improvements.
Performance-wise, or usability-wise? Because usability-wise, all implementations of the PEP are identical, so all implementations of the PEP should offer the precisely same improvements.
Along the way, they should assess whether it is as applicable as expected, whether the existing limitations are problematic, and whether performance is an issue.
Ah, so you question the specification, not the implementation of it.
My concern is that with Guido having approved the idea in abstract form, the actual implementation has escaped scrutiny. Also, if the API is different from the PEP, acceptance should not be automatic.
AFAICT, the proposed patch implements the behaviour of the PEP exactly.
If functional.partial() isn't a clear winner, it may be reasonable to ask that it be released in the wild and evolve before being solidified in the standard library. My sense is that the current implementation is far from its highest state of evolution.
Again, this I cannot understand. I do believe that there is no better way to implement the PEP. The PEP very explicitly defines what precisely functional.partial is, and the implementation follows that specification very closely. Regards, Martin
On Sun, 27 Feb 2005 19:05:18 +0100, "Martin v. Löwis"
Again, this I cannot understand. I do believe that there is no better way to implement the PEP. The PEP very explicitly defines what precisely functional.partial is, and the implementation follows that specification very closely.
This is where I get confused as well. PEP 309 specifies a function, and this specification has been accepted for inclusion in Python. The discussion seems to centre around whether that acceptance was correct. While I'm not saying that it's too late to attempt to persuade Guido to reverse himself, it does seem to me to be a lot of fuss over a fairly small function - and no-one said anything like this at the time. When I put up 5 reviews to get Martin to look at this, I honestly believed that it was a simple case of an accepted PEP with a complete implementation (admittedly scattered over a couple of SF patches), and would simply be a matter of committing it. IMHO, the burden is on those who want the "Accepted" status revoked to persuade Guido to pronounce to that effect. Otherwise, based on the standard PEP workflow process, it's time to move on, ensure that the patches provide a complete implementation, and, assuming they do, commit them. (But I don't want to put myself up as a big "champion" of PEP 309 - I like it, and I'd like to get the "accepted and there's a patch, but not yet implemented" status resolved, but that's all. I'm not going to switch to Perl if the patch isn't accepted :-)) Paul
Paul Moore wrote:
While I'm not saying that it's too late to attempt to persuade Guido to reverse himself, it does seem to me to be a lot of fuss over a fairly small function - and no-one said anything like this at the time.
I would probably fuss much less if it would not simultaneously introduce a new module as well.
When I put up 5 reviews to get Martin to look at this, I honestly believed that it was a simple case of an accepted PEP with a complete implementation (admittedly scattered over a couple of SF patches), and would simply be a matter of committing it.
That was a fair assumption. However, it turned out that
a) people still have doubts about the proposed functionality of the PEP. For some, it does too much; for others, too little. Changing the PEP now would be much cheaper than first committing the changes and then redoing the PEP again, as we might need to deprecate functional.partial first. So as part of the review, I need to confirm that there still is no opposition to the PEP (which now appears to be the case)
b) it is not obvious that the patch is complete. It probably is, but I would have committed a single patch much quicker than collecting bits and pieces from multiple patches, only to find out that they won't integrate properly.
c) it appears that the implementation of the PEP is incorrect (as Raymond just discovered). Again, it is better to require a perfect implementation before committing the changes, instead of pushing the contributor afterwards to add the missing changes.
IMHO, the burden is on those who want the "Accepted" status revoking to persuade Guido to pronounce to that effect.
Most certainly. So far, nobody has stepped forward and requested that this status be revoked, so no persuading is necessary. However, as part of the review process, it *is* necessary to check again whether somebody would prefer that the PEP be revoked - at least when the acceptance of the PEP is many months old.
Otherwise, based on the standard PEP workflow process, it's time to move on, and ensure that the patches provide a complete implementation, and assuming they do to commit them.
Correct. I would have done so more readily if I knew how the "Accepted" status got into the document. I could have researched that (going through old email archives), or I could just ask whether people agree that the status is indeed "Accepted".
(But I don't want to put myself up as a big "champion" of PEP 309 - I like it, and I'd like to get the "accepted and there's a patch, but not yet implemented" status resolved, but that's all. I'm not going to switch to Perl if the patch isn't accepted :-))
It seems to me that the patch will be committed shortly, assuming somebody corrects the remaining flaws in the implementation. I could do that, but I would prefer if somebody contributed an updated patch. Regards, Martin
Along the way, they should assess whether it is as applicable as expected, whether the existing limitations are problematic, and whether performance is an issue.
Ah, so you question the specification, not the implementation of it.
I do believe that there is no better way to implement the PEP. The PEP very explicitly defines what precisely functional.partial is, and the implementation follows that specification very closely.
My only issue with the PEP is that it seemed much more promising when reading it than when looking for real code that could benefit from it. I liked it much better until I tried to use it. My hope is that the advocates will try it for themselves before pushing this one in on faith.
My reading of the PEP did not include making the structure members public. This complicates and slows the implementation. The notion of introducing mutable state to the PFA is at odds with the driving forces behind functional programming (i.e. statelessness). If necessary for introspection, the structure members can be made read-only.
Also, there may be room to improve the implementation by building on the passed-in dictionary rather than creating a copy of the one in the partial object. The current choice may be the correct one because it has the outer call override the defaults in the event of a keyword conflict -- if so, that should be documented.
The test for callability is redundant and can be removed. The traverse() function can be simplified with the Py_VISIT macro.
Overall, I have no major objections to the PEP or the patch. Before it goes in on auto-pilot, it would be darned nice if the proponents said that they've found it helpful in real code and that they are satisfied with the timings.
partial(str.__add__, 'Ray')('mond')
Raymond Hettinger wrote:
My reading of the PEP did not include making the structure members public. This complicates and slows the implementation. The notion of introducing mutable state to the PFA is at odds with the driving forces behind functional programming (i.e. statelessness).
Notice that the C code is (or at least is meant as) a faithful implementation of the "Example Implementation" in the PEP, including the literal spelling of the class attributes. Now, it is not clear what is meant as normative in the PEP; I would agree that these member names are not meant to be exposed.
If necessary for introspection, the structure members can be made read-only.
This issue is not discussed in the PEP. If exposed, I think I would prefer different names. Starting all names with p_ might be appropriate, and I would rename "fn" to "func" (following method objects). Not sure what names would be appropriate for arguments and keywords. Notice that the proposed documentation says this: """Partial objects are callable objects that are created, and mostly behave, like the functions created by \function{partial}. The main difference is that because they are Python objects, they have attributes that can be inspected or modified.""" So it was at least the intention of the PEP author that partial functions are mutable.
Also, there may be room to improve the implementation by building on the passed-in dictionary rather than creating a copy of the one in the partial object.
Couldn't this cause the modifications to be passed to the caller? This would not be acceptable, but I could not figure out whether CALL_FUNCTION* will always create a new kwdict, or whether it might pass through a dict from the original caller.
The current choice may be the correct one because it has the outer call override the defaults in the event of a keyword conflict -- if so, that should be documented.
Notice that the "Example Implementation" specifies this:
if kw and self.kw:
    d = self.kw.copy()
    d.update(kw)
else:
    d = kw or self.kw
In any case - the fine points of the semantics primarily need to go into the documentation, which currently says """and keyword arguments override those provided when the new function was created."""
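The merge rule quoted above can be exercised directly. This is a self-contained sketch built around the PEP's pure-Python example implementation, showing that call-time keywords win over the frozen ones:

```python
# Keyword-merge semantics per the PEP's "Example Implementation":
# call-time keywords override those frozen at construction time.

class partial(object):
    def __init__(self, fn, *args, **kw):
        self.fn, self.args, self.kw = fn, args, kw

    def __call__(self, *args, **kw):
        if kw and self.kw:
            d = self.kw.copy()    # frozen keywords first...
            d.update(kw)          # ...then call-time keywords win
        else:
            d = kw or self.kw
        return self.fn(*(self.args + args), **d)

def f(**kw):
    return kw

p = partial(f, a=1, b=2)
assert p(b=3, c=4) == {'a': 1, 'b': 3, 'c': 4}   # b overridden at call time
assert p.kw == {'a': 1, 'b': 2}                  # frozen dict left untouched
```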
Overall, I have no major objections to the PEP or the patch. Before it goes in on auto-pilot, it would be darned nice if the proponents said that they've found it helpful in real code and that they are satisfied with the timings.
I guess "darned nice" is the best you can hope for. Not sure if Peter Harris is still around. Regards, Martin
I did a quick experiment:
python -m timeit -s "from operator import itemgetter; l=range(8)" "itemgetter(1)(l)" 1000000 loops, best of 3: 0.548 usec per loop
python -m timeit -s "l=range(8)" "(lambda x:x[1])(l)" 1000000 loops, best of 3: 0.597 usec per loop
That's far less of a difference than I expected from itemgetter!
You've timed how long it takes to both construct and apply the retrieval function. The relevant part is only the application: C:\pydev>python -m timeit -r9 -s "from operator import itemgetter; s=range(8); f=itemgetter(1)" "f(s)" 1000000 loops, best of 9: 0.806 usec per loop C:\pydev>python -m timeit -r9 -s "s=range(8); f=lambda x:x[1]" "f(s)" 100000 loops, best of 9: 1.18 usec per loop So the savings is about 30% which is neither astronomical, nor negligible. Raymond
Raymond Hettinger wrote:
* The PFA implementation proposed for Py2.4 ran slower than an equivalent closure. If the latest implementation offers better performance, then that may be a reason for having it around.
Not having done the timing, I'll defer to Paul and yourself here. However, one of the proposed enhancements is to automatically flatten out nested partial calls - this won't speed up the basic cases, but will allow incremental construction in two or more stages without a speed loss at the final call.
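The proposed flattening can be sketched in pure Python. This is a rough illustration only; the actual enhancement concerns the C implementation, and the attribute names (fn, args, kw) follow the PEP's example implementation.

```python
# Sketch: when a partial wraps another partial, merge the frozen
# arguments at construction time so the final call forwards directly
# to the innermost function, with no speed loss from nesting.

class partial(object):
    def __init__(self, fn, *args, **kw):
        if isinstance(fn, partial):          # flatten nested partials
            args = fn.args + args
            merged = fn.kw.copy()
            merged.update(kw)
            fn, kw = fn.fn, merged
        self.fn, self.args, self.kw = fn, args, kw

    def __call__(self, *args, **kw):
        d = self.kw.copy()
        d.update(kw)
        return self.fn(*(self.args + args), **d)

def f(*args):
    return args

inner = partial(f, 1, 2)
outer = partial(inner, 3)        # built incrementally in two stages...
assert outer.fn is f             # ...but flattened to a single level
assert outer(4) == (1, 2, 3, 4)
```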
flags rather than the first argument of a binary operator. Still, I found closures to be more flexible in that they could handle any argument pattern and could freeze more than one variable or keyword at a time.
I'm not sure what this one is about - the PEP 309 implementation allows a single partial call to freeze an arbitrary number of positional arguments (starting from the left), and an arbitrary number of keyword arguments at any position. (This is why the name was changed from curry to partial - it was general purpose partial function application, rather than left currying)
* The instance method limitation never came up for me. However, it bites to have a tool working in a way that doesn't match your mental model. We have to document the limitations, keep them in mind while programming, and hope to remember them as possible causes if bugs ever arise. It would be great if these limitations could get ironed out.
The 'best' idea I've come up with so far is to make partial a class factory instead of a straight class, taking an argument that states how many positional arguments to prepend at call time. A negative value would result in the addition of (len(callargs)+1) to the position at call time.
Then, partial() would return a partial application class which appended all positional arguments at call time, partial(-1) a class which prepended all positional arguments. partial(1) would be the equivalent of partialmethod, with the first argument prepended, and the rest appended.
In the general case, partial(n)(fn, *args1)(*args2) would give a call that looked like fn(args2[:n] + args1 + args2[n:]) for positive n, and fn(args2[:len(args2)+n+1] + args1 + args2[len(args2)+n+1:]) for negative n. n==0 and n==-1 are obvious candidates for tailored implementations that avoid the unneeded slicing. The presence of keyword arguments at any point wouldn't affect the positional arguments.
With this approach, it may be necessary to ditch the flattening of nested partials in the general case. For instance, partial(partial(-1)(fn, c), a)(b) should give an ultimate call that looks like fn(a, b, c). Simple cases where the nested partial application has the same number of prepended arguments as the containing partial application should still permit flattening, though.
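A rough sketch of the class-factory idea, restricted to non-negative n and using a plain closure in place of a class; partial_factory is a hypothetical name chosen to avoid clashing with the PEP's partial.

```python
# Sketch: partial_factory(n)(fn, *frozen)(*args) calls
# fn(*(args[:n] + frozen + args[n:])), i.e. the frozen arguments are
# inserted at position n among the call-time positional arguments.

def partial_factory(n=0):
    def make(fn, *frozen, **frozen_kw):
        def call(*args, **kw):
            d = frozen_kw.copy()
            d.update(kw)          # call-time keywords win
            return fn(*(args[:n] + frozen + args[n:]), **d)
        return call
    return make

def f(*args):
    return args

# n == 0: call-time args appended after the frozen ones (left curry)
assert partial_factory(0)(f, 'a', 'b')('c') == ('a', 'b', 'c')

# n == 1: first call-time arg prepended, rest appended
# (the partialmethod-like case described above)
assert partial_factory(1)(f, 'b')('a', 'c') == ('a', 'b', 'c')
```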
* Using the word "partial" instead of "lambda" traded one bit of unreadability for another.
The class does do partial function application though - I thought the name fit pretty well.
* It is not clear that the proposed implementation achieves one of the principal benefits laid out in the PEP: "I agree that lambda is usually good enough, just not always. And I want the possibility of useful introspection and subclassing."
I think it succeeds on the introspection part, since the flattening of nested partials relies on the introspection abilities. Not so much on the subclassing - partialmethod wasn't able to reuse too much functionality from partial.
If we get a better implementation, it would be nice if the PEP were updated with better examples. The TkInter example is weak because we often want to set multiple defaults at the same time (foreground, background, textsize, etc) and often those values are config options rather than hardwired constants.
Hmm - the PEP may give a misleading impression of what the current implementation can and can't do. It's already significantly more flexible than what you mention here. For instance, most of the examples you give below could be done using keyword arguments. That's likely to be rather slow though, since you end up manipulating dictionaries rather than tuples, so I won't pursue that aspect. Instead, I'm curious how many of them could be implemented using positional arguments and the class factory approach described above:
cmp(x,y)        # partial()(cmp, refobject)
divmod(x,y)     # partial(-1)(divmod, y)
filter(p,s)     # partial()(filter, p)
getattr(o,n,d)  # partial(-1)(getattr, n, d)
hasattr(o,n)    # partial(-1)(hasattr, n)
int(x,b)        # partial(-1)(int, b)
isinstance(o,c) # partial(-1)(isinstance, c)
issubclass(a,b) # partial(-1)(issubclass, b)
iter(o,s)       # partial(-1)(iter, s)
long(x,b)       # partial(-1)(long, b)
map(f,s)        # partial()(map, f)
pow(x,y,z)      # partial(1)(pow, y) OR partial(-1)(pow, z) OR partial(-1)(pow, y, z)
range([a],b,[c])# partial(-1)(range, c) (default step other than 1; always need to specify start, though)
reduce(f,s,[i]) # partial()(reduce, f)
round(x, n)     # we would want a right curry.
setattr(o,n,v)  # partial(1)(setattr, n)
Essentially, partial() gives left curry type behaviour, partial(-1) gives right curry behaviour, and partial(n) lets you default an argument in the middle. For positive n, the number is the index of the first argument locked; for negative n, it is the index of the last argument that is locked. An argument could be made for providing the names 'leftpartial' (fixing left-hand arguments) and 'rightpartial' (fixing right-hand arguments) as aliases for partial() and partial(-1) respectively. Regards, Nick. -- Nick Coghlan | ncoghlan@email.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.skystorm.net
Nick Coghlan
Raymond Hettinger wrote:
* The instance method limitation never came up for me. However, it bites to have a tool working in a way that doesn't match your mental model. We have to document the limitations, keep them in mind while programming, and hope to remember them as possible causes if bugs ever arise. It would be great if these limitations could get ironed out.
The 'best' idea I've come up with so far is to make partial a class factory instead of a straight class, taking an argument that states how many positional arguments to prepend at call time. A negative value would result in the addition of (len(callargs)+1) to the position at call time.
Other dynamic languages, like Lisp, are in the same boat in this respect--real currying (and things that look like it) doesn't work too well because the API wasn't designed with it in mind (curried languages have functions that take less-frequently-changing parameters first, but most other languages take them last). Scheme has a nice solution for this in SRFI 26 (http://srfi.schemers.org/srfi-26/). It looks like this:
(cut vector-set! x <> 0)
That produces a function that takes one argument. The <> is an argument slot; for every <> in the cut form, the resultant callable takes another argument. (This explanation is incomplete, and there are some other features; read the SRFI for details.) I've been using something similar in Python for a while, and I really like it. It doesn't look as good because the slot has to be a real object and not punctuation, but it works just as well. For example:
cut(islice, cutslot, 0, 2)
That's pretty readable to me. My version also allows the resultant callable to take any number of parameters after the slots have been satisfied, so partial is just the special case of no explicit slots. Perhaps a full example will make it clearer:
>>> def test(a, b, c):
...     print 'a', a, 'b', b, 'c', c
...
>>> f = cut(test, cutslot, 'bravo')
>>> f('alpha', 'charlie')
a alpha b bravo c charlie
Here, b is specialized at cut time, a is passed through the slot, and c is passed through the implicit slots at the end. The only thing this can't do is a generic right-"curry"--where we don't know how many parameters come before the one we want to specialize. If someone wants to do that, they're probably better off using keyword arguments. So far, my most common use for this is to specialize the first argument to map, zip, or reduce. Very few cases actually need an explicit cutslot, but those that do (like the islice example above) look pretty good with it.
My reasons for using cut instead of lambda are usually cosmetic--the cut form is shorter and reads better when what I'm doing would be a curry in a language designed for that. Throw in a compose function and I almost never need to use lambda in a decorator <ducks and runs from the anti-lambda crowd>.
Dima Dorfman wrote:
Here, b is specialized at cut time, a is passed through the slot, and c is passed through the implicit slots at the end. The only thing this can't do is a generic right-"curry"--where we don't know how many parameters come before the one we want to specialize. If someone wants to do that, they're probably better off using keyword arguments.
I think Raymond posted some decent examples using the builtins where having binding of the last few arguments on the right with decent performance would be desirable. As you yourself said - Python functions tend to have the arguments one is most likely to want to lock down on the right of the function signature, rather than on the left. The current PEP 309 certainly supports that in the form of keyword arguments, but anyone interested in performance is going to revert back to the lambda solution. The class factory approach relies on shaping the partial application of the arguments by specifying where the call time positional arguments are to be placed (with 'all at the start', 'all at the end' and 'one at the start, rest at the end' being the most common options). Regards, Nick. -- Nick Coghlan | ncoghlan@email.com | Brisbane, Australia --------------------------------------------------------------- http://boredomandlaziness.skystorm.net
participants (7)
- "Martin v. Löwis"
- Dima Dorfman
- Nick Coghlan
- Paul Moore
- Raymond Hettinger
- Raymond Hettinger
- Steven Bethard