For-loop variable scope: simultaneous possession and ingestion of cake

Why does the evil default args hack work? Because it immediately evaluates the argument and stores the result into the lambda object's default values list. So, what we need is a keyword that means "immediately evaluate the argument and store the result, instead of looking up the name again later." Since I can't think of a good name for this, I will use a terrible name for it, "immanentize."
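For context, this is the loop/lambda problem and the default-args hack in question, as a minimal sketch in current Python:

    lst = []
    for i in range(10):
        lst.append(lambda: i)          # every lambda looks i up later
    [f() for f in lst]                 # [9, 9, 9, 9, 9, 9, 9, 9, 9, 9]

    lst = []
    for i in range(10):
        lst.append(lambda i=i: i)      # the hack: i is evaluated now, stored as a default
    [f() for f in lst]                 # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]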
Behind the scenes, this would be syntactic sugar for something like: create a new hidden variable name, evaluate the expression at initial compile time, bind that name to the result, and replace the immanentize expression with the name. So for a loop like for i in range(10): lst.append(lambda: immanentize i), the first iteration effectively does random_variable_name_0 = 0 and lst.append(lambda: random_variable_name_0), the second does random_variable_name_1 = 1 and lst.append(lambda: random_variable_name_1), and so on up through random_variable_name_9.
>>> [f() for f in _]
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
I think this proposal is better than using "local:" or "scope:" because it doesn't create nested blocks when you just want to freeze out one particular value. One interesting side effect of having an immanentize keyword is that in Python 4000, we could (if we wanted to) get rid of the supposed "wart" of having x=[] as a default arg leading to unexpected results for Python newbies. Just make it so that to get the current behavior you type something like def f(x=immanentize []): ...
Whereas, without immanentize we can do what newbies expect and evaluate the defaults afresh each time.
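What newbies expect can already be had explicitly with the usual sentinel idiom; a sketch:

    def f(x=None):
        if x is None:
            x = []        # a fresh list on every call
        x.append(1)
        return x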
Obviously, "immanentize" is a terrible name, and only just barely a word of English. Perhaps we could call it an "anonymous variable" instead? -- Carl

Oh, I thought of something. What if you try to immanentize a variable in a function when that variable exists only in the function's own scope? I think that should cause an error to be raised. So, this should work:
But this should not:
-- Carl

Um, I think this is more complicated. Consider:

    i = 0
    def f():
        i += 1
        return lambda: i

Now lambda: i is bound to i, so every time I call f it will return a function that returns the current value of i, not the value at the time f was called. So I can fix this with immanentization:

    i = 0
    def f():
        i += 1
        return lambda: immanentize i

This will return lambda: 1, then lambda: 2, etc., right? No. It returns lambda: 0, lambda: 0, etc. --- Bruce

On 2008/10/05, at 6:47 pm, Bruce Leban wrote:
I'm not sure I see what you're getting at. In Python 2.6 and 3.0rc1 this raises "UnboundLocalError: local variable 'i' referenced before assignment." If you want to do what it looks like you want to do, you have to use "nonlocal i" or "global i".
To me, it is transparently clear that this will return lambda: 0 every time. That's what immanentization does. If you want lambda: 1, etc., use "nonlocal i". -- Carl
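For reference, one way to spell what Carl is suggesting in today's Python is to combine global (or nonlocal) with the default-arg trick; a sketch, where the i=i default plays the role of the immanentization:

    i = 0
    def f():
        global i
        i += 1
        return lambda i=i: i   # freezes the value of i as of this call to f

    a, b = f(), f()
    print(a(), b())            # 1 2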

I noticed something based on some code in the other thread, but it's not really related to it. Is there a reason for this not working:
Versus:
This was probably already discussed at the time "nonlocal" was invented, but is there a specific reason that "nonlocal" can't be used in cases where the next scope out is the same as "global"? I naively assumed that you could use them almost interchangeably if you were at the top level of a module. ("Almost" because "global" adds the variable to the module namespace if it's not already there, whereas "nonlocal" doesn't blithely add variables to other scopes, but just goes looking for existing ones.) Why force me to switch to "global" when I cut and paste a function out of a class or whatever and put it at the top level of my module? Is it just so that TOOOWTDI? Or was it an oversight? Or is there some other reason? Thanks, -- Carl
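Presumably the contrast is something like the following (a hypothetical reconstruction); current Python rejects the nonlocal version at compile time:

    x = 0

    def f():
        nonlocal x     # SyntaxError: no binding for nonlocal 'x' found
        x += 1

    def g():
        global x       # accepted: rebinds the module-level x
        x += 1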

On Sun, Oct 5, 2008 at 11:26 PM, Carl Johnson <carl@carlsensei.com> wrote:
Because nonlocal is not global. The whole point of nonlocal is that it falls between local and global.
You said it already: "cut and paste". Having nonlocal != global helps catch some potential bugs. Plus, having nonlocal act like global wouldn't gain you anything worth losing that assistance in finding coding errors. -Brett

On Sun, Oct 5, 2008 at 10:16 PM, Carl Johnson <carl@carlsensei.com> wrote:
Yup. I wrote that a bit too quickly.
Consider this:

    i = 0
    def f():
        global i
        i += 1
        return lambda: immanentize i

When does immanentize get evaluated? When f is defined, or when the lambda is evaluated? From what you wrote, it sounds like you think it's evaluated when f is defined. OK, so how do I get the equivalent of:

    def f():
        global i
        i += 1
        return lambda i=i: i

using immanentize?

On 2008/10/05, at 9:08 pm, Bruce Leban wrote:
OK, now I see what you're getting at. That makes more sense. The question is how do we deal with nested scopes with an immanentize in the innermost scope. Off the top of my head, I think the most sensible way to do it is that the immanentization happens when the innermost scope it's in is turned into a real function by the next scope out. But I may need to do some more thinking about what would happen here:

    def f():
        xs = []
        for i in range(10):
            xs.append(lambda: immanentize i)
        return xs

Clearly, we want this immanentize to be held off until f is finally called. The more I think about it, the more I realize that an immanentize always needs to be inside of some kind of function declaration, whether it's a def or a lambda or (maybe) a class (but I haven't thought about the details of that yet…), and evaluated just one level "up" from that.
In which case, for your original example, the immanentization wouldn't happen until f is called.
I think that makes sense, right? :/ Does anyone know how Lisp does this? I know that they write '(blah) to delay evaluation, but what do they do when they want to force evaluation someplace in the middle of a non-evaluated list? -- Carl

Hmm, it's a nice new keyword doing interesting stuff, but I would prefer solving the scoping problems (I do consider it a problem) without introducing new keywords and such. But the right time to do the immanentization would be as late as possible, when the innermost scope is turned into a function by its parent scope, like Carl proposes.

In Lisp, '(blah) can be evaluated by doing (eval '(blah)). In order to 'immanentize' the value of some variable into a quoted piece of code, you can do two things. Since code is just lists, you can use the standard list slice-and-dice functions. They also have some syntax for this, which I think goes like this:

    `(blah foo ,bar baz)

This returns a list that looks like this:

    (blah foo <the value of bar at the time the list was created> baz)

But I don't think we should go all the way to quasiquoting (as this is called) unless we want to support full macros, and that isn't going to get past Guido. Jan

2008/10/6 Carl Johnson <carl@carlsensei.com>:

Carl Johnson wrote:
Actually, until the lambda is executed. What you are saying is that you want immanentized expressions to be evaluated at the same time the default arg expressions are, which is when the def/lambda is executed to create a function object. The only difference between them and default args is that they could not be replaced by the function call, which is precisely the problem with default pseudo-args. Call them constants defined at definition time rather than at compilation time. Do you want the constants to be named or anonymous -- which is to say, would their values appear in locals()? If named, their expressions could/should appear in the header with a syntax similar to but somehow different from default args. A possibility:

    ...lambda $i=i: i

To implement this, the constants would have to be stored in the function object and not in the compiled code object that is shared by all functions generated by repeated execution of the definition.
Immanentize would have to be a no-op at top level, as global is. Whether it should be illegal at top level or not is a different question. Global is legal even though it is redundant. If the constants are named and their definitions are in the function header, there would be no question of top-level appearance. Terry Jan Reedy
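Such definition-time constants can be emulated today with default args or functools.partial, which is roughly what a $i=i spelling would make official; a sketch (the names here are only illustrative):

    import functools

    i = 41
    f = lambda _i=i: _i                      # _i is evaluated once, at definition time
    g = functools.partial(lambda i: i, i)    # the value is stored on the partial object

    i = 0
    print(f(), g())                          # 41 41 -- both captured the definition-time value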

On 2008/10/6 Terry Reedy <tjreedy@udel.edu> wrote:
I have some vague memory that these might be called &aux variables in Common Lisp.
Would giving them a __name and marking them as keyword-only meet the goal? (As best I can tell, it does for methods, but not for top-level functions, because of the way __mangling works.) -jJ

&aux is described here: http://www.lispworks.com/documentation/HyperSpec/Body/03_dae.htm which says it's equivalent to let*, described here: http://www.lispworks.com/documentation/HyperSpec/Body/s_let_l.htm

In short, &aux and let* evaluate each expression, assign it to a variable, and then evaluate the next, and so on. Default values in Python are evaluated like Lisp's let, not let*. --- Bruce

On Tue, Oct 7, 2008 at 5:39 PM, Jim Jewett <jimjjewett@gmail.com> wrote:

On Tue, Oct 7, 2008 at 9:57 PM, Bruce Leban <bruce@leapyear.org> wrote:
How do you figure? As nearly as I can tell, the only difference is that let* is evaluated in order (left to right) instead of in parallel. Python parameters are also evaluated left to right, as nearly as I can tell.

>>> def f():
...     global var
...     var = "from f"
>>> var = "base"
>>> def h():
...     print "var is", var
>>> def g(a=f(), b=h()):
...     print b
var is from f

This shows that the side effect of binding a was already present when b was bound. -jJ

Lisp's let: evaluate, evaluate, evaluate, assign, assign, assign.
Lisp's let*: evaluate, assign, evaluate, assign, evaluate, assign.

In Python as in Lisp, the side effects of the first evaluation are visible to the second, but in Python and in Lisp's let (vs. let*) the assignment of the first variable doesn't happen until after all the expressions have been evaluated.
>>> def f(i=0, j=i+1): pass

Traceback (most recent call last):
  File "<pyshell#2>", line 1, in <module>
    def f(i=0, j=i+1):
NameError: name 'i' is not defined

On Wed, Oct 8, 2008 at 9:33 AM, Jim Jewett <jimjjewett@gmail.com> wrote:

Jim Jewett wrote:
That's not quite right. The difference between let and let* is that each expression in a let* is evaluated in a scope that includes the names bound by the previous expressions. In other words, it's equivalent to a nested sequence of let statements.
It's not about side effects, it's about name visibility. If the binding of function arguments worked like let*, then you would be able to refer to the name a in the expression being assigned to b, i.e. this would be legal:

    def g(a=f(), b=h(a)): ...

But it's not -- you would get a NameError on a if you tried that. In that respect it's like let, not let*. -- Greg
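A small demonstration of both points -- default expressions are evaluated left to right with side effects visible, but earlier parameter names are not in scope for later defaults (a sketch):

    log = []
    def first():
        log.append("first")
        return 1
    def second():
        log.append("second")
        return 2

    def g(a=first(), b=second()):
        pass

    print(log)                   # ['first', 'second']: evaluated left to right

    # def h(a=1, b=a + 1): ...   # NameError: name 'a' is not defined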

Carl Johnson wrote:
'@' would mostly work, '$' is available, and is used for string substitution in other languages. ...
As you said above, immanentize means evaluate immediately, just as with default arg expressions, so immanentize can hardly mean anything extra when applied to default arg expressions. So you really need a new 'calltime' keyword for not-immediate execution. Unless, of course, you are proposing that *all* default arg expressions be, by default, re-evaluated at each call (thereby breaking all code that depends on define-time evaluation).
Only some newbies expect this. The ones like me who get that default args are evaluated just *once* never post complaints to c.l.p. It gives a completely biased sample. In any case, I don't see how you expect this to actually work. What object would you have put into the default arg tuple? What about:

    a = []
    def f(x=a):
        x.append(1)
        return x

Would you have this magically modified also? Suppose instead of 'a = []' we have 'from mod import a'. What about other mutable objects? Terry Jan Reedy

Carl Johnson wrote:
Yep, that is terrible! ;-) I'm really starting to see this as a non-problem after reading all these posts. This particular problem is solved much more nicely with a class than a function. I have two reasons for this: one is that classes are the natural structure to use if you want to save state, and the other is that I would actually prefer that functions never save state, or even closures (but that would break a lot of decorators). Here is an example that works as expected with no magic or hidden behaviors:

    class Caller(object):
        def __init__(self, f, *args, **kwds):
            self.f = f
            self.args = args
            self.kwds = kwds
        def __call__(self):
            return self.f(*self.args, **self.kwds)

    def id(i):
        return i

    L = []
    for n in range(10):
        L.append(Caller(id, n))

    for f in L:
        f()

You could go one step further and make the class more specific to the situation it is being used in by defining the __call__ method to do what you want instead of using a stored function reference. Something I think would be more beneficial to solve is being able to pack and unpack entire function arguments into one signature object easily and automatically:

    def foo(***aks): ...

Ron
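Ron's Caller class is essentially what functools.partial already provides; the same loop using the standard library, as a sketch:

    from functools import partial

    def ident(i):
        return i

    L = [partial(ident, n) for n in range(10)]
    print([f() for f in L])        # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]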

participants (8): Brett Cannon, Bruce Leban, Carl Johnson, Greg Ewing, Jan Kanis, Jim Jewett, Ron Adam, Terry Reedy