
A few months back there was a discussion of how code like this gives "surprising" results because of the scoping rules:
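(The code in question is not preserved in this excerpt; based on the example Aahz restates later in the thread, it was presumably along these lines:)

def func_maker():
    fs = []
    for i in range(10):
        def f():
            return i      # i is looked up when f is called, after the loop has finished
        fs.append(f)
    return fs

# every function in func_maker() returns 9, not the value i had when f was appended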
Various syntax changes were proposed to get around this, but nothing ever came of it. Also recently, I tried to propose a new syntax to allow Ruby-like blocks in Python without sacrificing Python's indenting rules. My idea was that "@" would mean "placeholder for a function to be defined on the next line" like so:
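(The example is not preserved here, and the exact form that was proposed is not known; purely as an illustrative guess at the shape of syntax being described, it might have looked something like:)

# hypothetical, not valid Python: "@" stands in for the function defined on the next line
result = sorted(items, key=@)
def @(item):
    return item.lower()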
Thinking about it some more, I've realized that the change I proposed is unnecessary, since we already have the decorator syntax. So, for example, the original function can be made to act with the "expected" scoping by using an each_in function defined as follows:
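(The definition is missing from this excerpt; one plausible reconstruction, consistent with the remark below that each_in "produces a list and not a function", would be:)

def each_in(seq):
    def decorator(f):
        # call the decorated function once per item and collect the results
        return [f(x) for x in seq]
    return decorator

# used like a scoped for-loop: fs becomes a list of closures, one per value of i
@each_in(range(10))
def fs(i):
    def g():
        return i
    return g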
On the one hand, I can imagine some people thinking this is decorator abuse, since the each_in decorator produces a list and not a function. If so, I suppose the lambda might be changed to
In either version the important thing is that it provides a scoped version of a for-loop, which those from a C++ background might be more conditioned to expect. Possibly, such a function could be added to functools or itertools. It would be useful when scoping issues arise, for example when adding a bunch of properties or attributes to a class. Thinking about it some more though, it's hard to see why such a trivial function is needed in the library. There's no reason it couldn't just be done as an inline lambda instead:
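(The inline version is not preserved here; presumably it was the same each_in body written as a lambda decorator, something like:)

@lambda f: [f(i) for i in range(10)]
def fs(i):
    def g():
        return i
    return g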
OK, actually there is one reason this couldn't be done as a lambda:
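(Presumably the elided part showed the interpreter rejecting that decorator line:)

@lambda f: [f(i) for i in range(10)]   # raises SyntaxError: invalid syntax
def fs(i):
    ...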
This is because the decorator grammar asks for a name, not an expression, as one (well, OK, me a couple months ago) might naively expect.

decorator: '@' dotted_name [ '(' [arglist] ')' ] NEWLINE
decorators: decorator+
decorated: decorators (classdef | funcdef)

and not

decorator: '@' test NEWLINE
decorators: decorator+
decorated: decorators (classdef | funcdef)

Changing the grammar would also allow for the rewriting of the sorted example given earlier:
Of course, the use of a lambda decorator is, strictly speaking, unnecessary,
But I think that allowing lambda decorators would be convenient in a number of situations, and it would give a simple answer to those asking for a multiline lambda or Ruby-like blocks. What do other people think? -- Carl

On Sun, Feb 8, 2009 at 4:41 PM, Carl Johnson <carl@carlsensei.com> wrote:
I'm sorry, but you are using two nested lambdas plus a list comprehension, and three nested functions here, plus one more list comprehension for showing the result. My brain hurts trying to understand all this. I don't think this bodes well as a use case for a proposed feature. I'm not trying to be sarcastic here -- I really think this code is too hard to follow for a motivating example.
-- --Guido van Rossum (home page: http://www.python.org/~guido/)

On 2009/02/08, at 5:29 pm, Guido van Rossum wrote:
I will admit that this is getting a bit too functional-language-like for its own good, but (ignoring my proposed solution for a while) at least in the case of the nested scope problem, what other choice is there but to nest functions in order to keep the variable from varying? I myself was bitten by the variable thing when I wanted to write a simple for-loop to add methods to a class, but because of the scoping issue, it ended up that all of the methods were equivalent to the last in the list. At present, there's no way around this but to write a slightly confusing series of nested functions. So, that's my case for an each_in function.

Going back to the @lambda thing, in general, people are eventually going to run into different situations where they have to write their own decorators. We already have a lot of different situations taken care of for us by functools and itertools, but I don't think the library will ever be able to do everything for everyone. I think that the concept of writing decorators (as opposed to using them) is definitely confusing at first, since you're nesting one thing inside of another and then flipping it all around, and it doesn't entirely make sense, but the Python community seems to have adapted to it. For that matter, if I think too much about what happens in a series of nested generators, I can confuse myself, but each individual generator makes sense as a kind of "pipe" through which data is flowing and being transformed. Similarly, metaclasses are hard to understand but make things easy to use.

So, I guess the key is to keep the parts easy to understand even if how the parts work together is a bit circuitous. Maybe @lambda isn't accomplishing that, but I'm not sure how much easier to understand an equivalent solution written without it would be. Easy things easy, hard things possible? -- Carl

On Sun, 08 Feb 2009 18:08:25 -1000, Carl Johnson wrote: <snip> I can't really comprehend the feature you're describing (TLDR), but knowing that a decorator is merely syntactic sugar for a function call, so that

@lambdafunc
def func(foo): pass

is equal to

def func(foo): pass
func = lambdafunc(func)

why don't you do this instead:

lambdafunc(lambda foo: -foo)

It is perfectly readable and is a more functional approach than a decorator. Also for the first example you gave:
why not? (untested)
Closures are already confusing enough for most people. Dynamic creation of closures, which is allowed because of Python's way of defining functions, is an abuse of closures IMHO.

On Sun, Feb 8, 2009 at 8:08 PM, Carl Johnson <carl@carlsensei.com> wrote:
Are you unaware of or rejecting the solution of using a default argument value? [(lambda x, _i=i: x+i) for i in range(10)] is a list of 10 functions that add i to their argument, for i in range(10).
I don't see what @lambda does that you can't already do with several other forms of syntax. The reason for adding decorators to the language is to have easier syntax for common manipulations/modifications of functions and methods. A decorator using lambda would be a one-off, which kind of defeats the purpose. For example, instead of this:
I would write this:

def func_maker():
    def fi(i):
        def f():
            return i
        return f
    fs = [fi(i) for i in range(10)]
    return fs

In regard to the proposal of "bind i" syntax, I have a counter-proposal (as long as we're in free association mode :-): Define new 'for' syntax so that you can write

[lambda: i for new i in range(10)]

or e.g.

fs = []
for new i in range(10):
    def f():
        return i
    fs.append(f)

The rule would be that "for new <name> in ..." defines a new "cell" each time around the loop, whose scope is limited to the for loop. So e.g. this wouldn't work:

for new i in range(10):
    if i%7 == 6:
        break
print i  # NameError

I'm not saying I like this all that much, but it seems a more Pythonic solution than "bind i", and it moves the special syntax closer to the source of the problem. -- --Guido van Rossum (home page: http://www.python.org/~guido/)

On Mon, Feb 09, 2009 at 09:09:18AM -0800, Guido van Rossum wrote:
[(lambda x, _i=i: x+i) for i in range(10)]
Either [(lambda x, i=i: x+i) for i in range(10)] or [(lambda x, _i=i: x+_i) for i in range(10)] :-) Oleg. -- Oleg Broytmann http://phd.pp.ru/ phd@phd.pp.ru Programmers don't die, they just GOSUB without RETURN.

On Mon, Feb 09, 2009, Guido van Rossum wrote:
Nice! This does seem like it would work to quell the complaints about for loop scoping while still maintaining current semantics for those who rely on them. Anyone wanna write a PEP for this?... Anyone who does tackle this should make sure the PEP addresses this:

for new x, y in foo:

versus

for new x, new y in foo:

(I'll channel Guido and say that the second form will be rejected; we don't want to allow multiple loop targets to have different scope. IOW, ``for new`` becomes its own construct, functionally speaking.) -- Aahz (aahz@pythoncraft.com) <*> http://www.pythoncraft.com/ Weinberg's Second Law: If builders built buildings the way programmers wrote programs, then the first woodpecker that came along would destroy civilization.

Aahz wrote:
Personally, I think whenever state or data is to be stored, objects should be used instead of trying to get functions to do what objects already do. Here's something that looks like it works, but...
It only worked because I reused i as a variable name.
I've always thought closures cause more problems than they solve, but I doubt closures can be removed because of both how many programs are already written that use them and the number of programmers who like them. One of the reasons I dislike closures is that a function that uses a closure looks exactly like a function that uses a global value and, as you see above, is susceptible to side effects depending on how and when it is used. Guido's 'new' syntax would help, but I'm undecided on whether it's a good feature or not. I'd prefer to have an optional feature to turn on closures myself. Maybe an "__allow_closures__ = True" at the beginning of a program? Shrug, Ron

Le Mon, 9 Feb 2009 09:09:18 -0800, Guido van Rossum <guido@python.org> a écrit :
The difference I see between such an iteration-specific loop variable and a "declarative" version is that in the latter case it is possible to choose which name(s), among the ones that depend on the loop var, will actually get one "cell" per iteration -- or not. Hem, maybe it's not clear... For instance, using a declaration, it may be possible to write the following loop (I do not pretend 'local' to be a good lexical choice ;-):

funcs = []
for item in seq:
    local prod              # this name only is iteration-specific
    prod = product(item)    # ==> one prod per item
    def f():
        return prod
    funcs.append(f)
    if test(item):
        final = whatever(item)  # non-local name
        break
print "func results:\n%s\nend result:%s" \
    % ([f() for f in funcs], final)

I do not like global & non-local declarations (they do not fit well with the overall python style, imo). So I would not like such a proposal either. But I like the idea of selecting names rather than a loop-level on/off switch. Denis ------ la vida e estranya

On Mon, Feb 9, 2009 at 10:05 AM, spir <denis.spir@free.fr> wrote:
You could use "new prod" for this. Or you could use "var prod" and change the "for" proposal to "for var". It's a slippery slope though -- how often do you really need this vs. how much more confusion does it cause. You would need rules telling the scope of such a variable: with "for new" that is easy, but what if "var foo" is not nested inside a loop? Then it would essentially become a no-op. Or would it introduce a scope out of the nearest indented block? This quickly becomes too hairy to be attractive. -- --Guido van Rossum (home page: http://www.python.org/~guido/)

Guido van Rossum wrote:
In my version of it, 'new' doesn't affect scoping at all -- the scope of the name that it qualifies is the same as it would be if the 'new' wasn't there. It's not necessarily true that it would be a no-op outside of a loop:

x = 17

def f():
    ...  # sees x == 17

new x = 42

def g():
    ...  # sees x == 42

although practical uses for that are probably rather thin on the ground.
This quickly becomes too hairy to be attractive.
It all seems rather clear and straightforward to me, once you understand what it does. Some RTFMing on the part of newcomers would be required, but nothing worse than what we already have with things like generators, decorators and with-statements. -- Greg

Le Tue, 10 Feb 2009 18:02:27 +1300, Greg Ewing <greg.ewing@canterbury.ac.nz> a écrit :
Yes, this is different indeed. Since Guido's example was to tag the loop variable itself with the kw 'new', all names inside the loop body depending on it would also be "iteration specific". So it would not be possible anymore to retrieve final state data outside the loop, as is commonly done in python. Your proposal solves the problem. Yet, as Guido noted, it is probably not a big enough issue. ------ la vida e estranya

Arnaud Delobelle wrote:
Before you get too carried away with that, I'd like to make it clear that in my version of the semantics of 'new', it would *not* work that way -- 'a' would remain bound to the last cell, and therefore the last value, that it was given by the for-loop. I considered that a feature of my proposal, i.e. it preserves the ability to use the loop variable's value after exiting the loop, which some people seem to like. -- Greg

On Sun, Feb 08, 2009, Guido van Rossum wrote:
Maybe I've been corrupted by the functional mindset (I hope not!), but I can follow this. As Carl says, the key is to focus on the scoping problem:

def func_maker():
    fs = []
    for i in range(10):
        def f():
            return i
        fs.append(f)
    return fs

This creates a list of function objects, but the way scoping works in Python, every single function object returns 9 because that's the final value of ``i`` in func_maker(). Living with this wart is an option; changing Python's scoping rules to fix the wart is probably not worth considering. Carl is suggesting something in-between, allowing the use of lambda combined with decorator syntax to create an intermediate scope that hides func_maker()'s ``i`` from f(). I think that's ugly, but it probably is about the best we can do -- the other options are uglier. What I don't know is whether it's worth considering; abusing scope is not generally considered Pythonic, but we've been slowly catering to that market over the years. Given my general antipathy to decorators as obfuscating code, I don't think Carl's suggestion causes much damage. Carl, let me know if I've accurately summarized you. -- Aahz (aahz@pythoncraft.com) <*> http://www.pythoncraft.com/ Weinberg's Second Law: If builders built buildings the way programmers wrote programs, then the first woodpecker that came along would destroy civilization.
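(Illustration, not part of the message above -- the wart shows up as soon as the functions are called:)

fs = func_maker()
print [f() for f in fs]   # prints [9, 9, 9, 9, 9, 9, 9, 9, 9, 9]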

On Sun, Feb 8, 2009 at 9:40 PM, Aahz <aahz@pythoncraft.com> wrote: <much snippage>
This will likely get shot down in 5 minutes somehow, but I just want to put it forward since I personally didn't find it ugly: Has a declaration been considered (like with `nonlocal`)? Seems a relatively elegant solution to this problem, imho. I know it was mentioned in a thread once (with the keyword as "instantize" or "immediatize" or something similar), but I've been unable to locate it (my Google-chi must not be flowing today). The basic idea was (using the keyword "bind" for the sake of argument):

def func_maker():
    fs = []
    for i in range(10):
        def f():
            bind i  # this declaration tells Python to grab the value of i as it is *right now* at definition-time
            return i
        fs.append(f)
    return fs

The `bind` declaration would effectively tell Python to apply an internal version of the `lambda x=x:` trick/hack/kludge automatically, thus solving the scoping problem. I now expect someone will point me to the thread I couldn't find and/or poke several gaping holes in this idea. Cheers, Chris -- Follow the path of the Iguana... http://rebertia.com

On Sun, Feb 8, 2009 at 11:02 PM, Chris Rebert <pyideas@rebertia.com> wrote:
How about it's icky? It feels like it's evaluating part of the inner function when the function is defined. There's a simple solution, which is to put it in the argument list:

def f(bind i):
    return i

However, when compared to the existing

def f(i=i):

it just seems too petty to be worth the increased language complexity. That's where it dies: too little benefit with too large a cost. The only way I'd reconsider it is if default values were changed to reevaluate with each call, but nobody should be proposing that again until at least Python 4000, and only then if they come up with a much better argument than we have today. I doubt it'll ever happen. -- Adam Olsen, aka Rhamphoryncus
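(For reference -- illustration only, not from the message above -- the existing default-argument idiom applied to the running func_maker example:)

def func_maker():
    fs = []
    for i in range(10):
        def f(i=i):   # the default is evaluated here, at definition time, so each f keeps its own i
            return i
        fs.append(f)
    return fs

print [f() for f in func_maker()]   # prints [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]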

On Mon, Feb 9, 2009 at 1:36 AM, Adam Olsen <rhamph@gmail.com> wrote:
That much I completely agree with, and I speak from personal experience :-) Cheers, Chris -- Follow the path of the Iguana... http://rebertia.com

On Mon, Feb 9, 2009 at 07:02, Chris Rebert <pyideas@rebertia.com> wrote:
When is "now"? The bind, whatever it does, is not executed until the function is called, which is too late. Letting it influence the interpretation of f itself is confusing. The problem stems from the fact that local variables are only local to the enclosing function, without the possibility of finer scopes. If variable declarations were explicit and scoped to the enclosing block, this would work:

def func_maker():
    var fs = []
    for i in range(10):
        var j = i
        fs.append(lambda: j)
    return fs

or possibly even the original, depending on the semantics of for (whether it binds a new variable on each iteration, scoped over the body, or it rebinds the same variable each time). -- Marcin Kowalczyk qrczak@knm.org.pl http://qrnik.knm.org.pl/~qrczak/

On Mon, Feb 9, 2009 at 2:21 AM, Marcin 'Qrczak' Kowalczyk <qrczak@knm.org.pl> wrote:
Well, it's a declaration of sorts, so it would hypothetically be taken into account at definition-time rather than call-time (that's its entire raison d'être!); surely we can agree that 'global' can also be viewed as having definition-time rather than call-time effects? It is somewhat confusing, but then so is the problem it's trying to solve (the unintuitive behavior of functions defined in a loop). But yes, I agree the idea has been thoroughly thrashed (as I predicted), and at least now the discussion is preserved for posterity should similar ideas ever resurface. :) Cheers, Chris -- Follow the path of the Iguana... http://rebertia.com

On Mon, Feb 9, 2009 at 6:27 AM, Arnaud Delobelle <arnodel@googlemail.com> wrote:
Yes that is, thanks! As the starter of the thread said, "immanentize" really is a terrible name. :) http://mail.python.org/pipermail/python-ideas/2008-October/002149.html Cheers, Chris -- Follow the path of the Iguana... http://rebertia.com

Carl Johnson wrote:
A few months back there was a discussion of how code like this gives "surprising" results because of the scoping rules:
And solutions were given.
Because a) there is already a trivial way to get the result wanted; b) proposals are wrapped in trollish claims; c) perhaps no proposer is really serious. To me, one pretty obvious way to define default non-parameters would be to follow the signature with "; <name = expr>+". Where is the PEP, though? Enough already. ...
Such a mess to avoid using the current syntax. Like Guido, my head hurt trying to read it, so I quit.
What do other people think?
-1 Terry Jan Reedy

Le Mon, 09 Feb 2009 01:03:39 -0500, Terry Reedy <tjreedy@udel.edu> a écrit :
To me, one pretty obvious way to define default non-parameters would be to follow the signature with "; <name = expr>+". Where is the PEP, though?
I guess you mean the following?
As I understand it, the issue only happens when yielding functions. For instance, the following works as expected:

class C(object):
    def __init__(self, n):
        self.n = n

def obj_maker():
    objs = []
    for i in range(10):
        obj = C(i)
        objs.append(obj)
    return objs

The pseudo-parameter trick used to generate funcs is only a workaround, but it works fine and is not overly complicated (I guess). Maybe this case should be documented in introductory literature; not only as a possible trap, but also because it helps understanding python's (non-)scoping rules, which allow the following (this is not obvious when expecting an iteration-specific scope):

for item in container:   # the name 'item' is silently introduced here
    if test(item):
        break
print item

Denis ------ la vida e estranya
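(Illustration only, not from the message above -- the object version behaves as expected:)

print [obj.n for obj in obj_maker()]   # prints [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]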

Carl Johnson wrote:
This isn't really a scoping rule problem. It occurs because the value of i in f() isn't evaluated until f is called, which is after the loop is finished.
Here i's default value is set at the time f() is defined, so it avoids the issue.

ra@Gutsy:~$ python
Python 2.5.2 (r252:60911, Oct 5 2008, 19:29:17)
[GCC 4.3.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> def f():
...     return i
...
Notice that we can define a function with an i variable even before i exists. The requirement is that i exists when the function f is called. Another way to do this is to use a proper objective approach.
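(The session showing this is missing from the excerpt; a sketch of the kind of class being described -- written in the cleaned-up form mentioned below, with __init__ taking i as an argument -- might be:)

class F(object):
    def __init__(self, i):
        self.i = i        # the value is stored on the object when it is created
    def __call__(self):
        return self.i

fs = [F(i) for i in range(10)]
print [f() for f in fs]   # prints [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]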
Because the value is stored on the object at the time the object is created, vs when it is called, this works as expected. For clarity and readability, I would change the __init__ line to accept a proper argument for i, but as you see it still works. Cheers, Ron

Le Mon, 09 Feb 2009 08:32:46 -0600, Ron Adam <rrr@ronadam.com> a écrit :
I see this formulation as equivalent to "def f(i=i0): return i", I mean conceptually. From an OO point of view, a default/keyword arg is indeed an (implicit) attribute of a function, evaluated at definition/creation time. Actually, isn't the above code an excellent explanation of what a default argument is in python? Denis ------ la vida e estranya
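(A minimal illustration of the point about default arguments being evaluated at definition time:)

x = 1
def f(a=x):   # the default is evaluated now, while the def statement runs
    return a
x = 2
print f()     # prints 1: the default kept the value x had when f was defined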

participants (14)
- Aahz
- Adam Olsen
- Arnaud Delobelle
- Bruce Frederiksen
- Carl Johnson
- Chris Rebert
- Greg Ewing
- Guido van Rossum
- Lie Ryan
- Marcin 'Qrczak' Kowalczyk
- Oleg Broytmann
- Ron Adam
- spir
- Terry Reedy