On 2020-11-25 22:07, Guido van Rossum wrote:
> Hmm... In the end I think the language design issue is with functions (and how they capture variable references rather than values), and fixing it by changing the for-loop is still just a band-aid, with other problems and inconsistencies. Agreed that the fix 'x=x' essentially always works, and that having to write 'for new x in ...' is no better, because you'd still forget it.
>
> Maybe we should consider introducing a new kind of function that captures values instead? That would be a much more principled fix.
Something along those lines makes a lot more sense to me.
After following this thread I'm still not convinced that changing the for loop is really a good idea. I basically agree with Steven D'Aprano that the supposed benefits of "protecting" against accidental name shadowing are marginal at best and don't justify complicating matters. I also agree that the for loop isn't really the issue. For instance, if we "fix" for loops, the name-binding issue will still exist for functions defined in while loops.
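To make that point concrete, here is a minimal sketch of the same late-binding problem in a while loop, where no change to the for statement could help:

```python
# Any closure created in a loop captures the *name*, not the value
# the name held when the closure was created.
funcs = []
i = 0
while i < 3:
    funcs.append(lambda: i)  # all three lambdas share the same i
    i += 1

print([f() for f in funcs])  # every closure sees the final value: [3, 3, 3]
```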
For me the only problem with the `x=x` default argument "fix" is that... well, it's in the argument signature. That means it effectively becomes part of the function's public API, and people can pass in other values, which maybe you don't want. Plus it just seems very unclean to me to have to shoehorn "I want to capture these variables from the enclosing scope at definition time" into the mechanism for "these are the arguments you can pass at call time". What we really want is a way to separate these into orthogonal choices.
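For reference, here is a minimal sketch of the `x=x` idiom and the API leak it creates:

```python
funcs = []
for i in range(3):
    funcs.append(lambda i=i: i)  # freeze the current i via a default

print([f() for f in funcs])  # [0, 1, 2] -- the fix works

# But i is now part of each lambda's public signature, so a caller
# can (perhaps accidentally) override the "captured" value:
print(funcs[0](42))  # 42
```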
So what I've been wondering is, could we provide a syntax *other than default arguments* to explicitly indicate that a function wants to capture specific variables from the enclosing scope? Something like:
    def f(a, b, c) with x, y:
        # x and y here have whatever values they had in the
        # enclosing scope at definition time
        ...
The advantages I see to this are:
1) It makes explicit the separation between "capture this value at definition time" and "allow passing this at call time".

2) It maintains the convention that we don't "peek" inside the function body at definition time (i.e., all the variables to be captured are "outside" in the def line).

3) It narrowly targets specific variables. It's probably not a good idea for a function to just capture ALL variables from outer scopes (for instance, you usually don't want to capture values of global config variables that might change before call time).

4) It is backwards compatible and doesn't change any existing behavior.
It is new syntax, but by reusing "with" we could avoid grabbing a new keyword.
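For comparison, the proposed capture semantics can be roughly approximated today with a decorator. This is just a sketch: `capture` is a hypothetical helper, not anything in the stdlib, and unlike the proposal it still leaves x and y in the wrapped function's parameter list:

```python
import functools

def capture(**frozen):
    """Inject values frozen at decoration time as keyword arguments."""
    def decorate(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # frozen holds the values as they were at definition time
            return func(*args, **frozen, **kwargs)
        return wrapper
    return decorate

x, y = 1, 2

@capture(x=x, y=y)
def f(a, b, c, x, y):
    return (a + b + c, x, y)

x, y = 100, 200  # rebinding after definition has no effect on f
print(f(1, 2, 3))  # (6, 1, 2)
```

One nice side effect: a caller who tries to pass x or y explicitly gets a TypeError ("multiple values for keyword argument") rather than silently overriding the captured value, which is closer to the intent than the `x=x` trick.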
I don't know enough about the internals of Python to know how this would be implemented, but my idea is that it would effectively create new local variables for x and y without putting them in the argument list (so they'd appear in co_varnames but not be counted in co_argcount).
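That co_varnames/co_argcount distinction already exists for ordinary locals, which is what makes the idea plausible; the difference is that a plain local is bound at call time rather than at definition time:

```python
def g(a, b):
    x = 10  # an ordinary local: in co_varnames, but not an argument
    return a + b + x

print(g.__code__.co_argcount)   # 2 -- only a and b count as arguments
print(g.__code__.co_varnames)   # ('a', 'b', 'x')
```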
Basically what I'm saying here is that I don't want block scopes. Adding block scopes would just complicate the language, since block-scoped variables could then be used for all manner of things besides "freezing" them into functions. For me the issue is only about how function scopes capture variables from enclosing scopes; we don't need new scopes, we just need new ways to control that capturing behavior.