
Hi all -- i've been reading the enormous thread on nested scopes with
some concern, since i would very much like Python to support "proper"
lexical scoping, yet i also care about compatibility.

There is something missing from my understanding here:

  - The model is, each environment has a pointer to the
    enclosing environment, right?

  - Whenever you can't find what you're looking for, you
    go up to the next level and keep looking, right?

  - So what's the issue with not being able to determine
    which variable binds in which scope?  With the model
    just described, it's perfectly clear.

Is all this breakage only caused by the particular optimizations
for lookup in the implementation (fast locals, etc.)?  Or have i
missed something obvious?

I could probably go examine the source code of the nested-scoping
changes to find the answer to my own question, but in case others
share this confusion with me, i thought it would be worth asking.

                    *       *       *

Consider for a moment the following simple model of lookup:

    1.  A scope maps names to objects.
    2.  Each scope except the topmost also points to a parent scope.
    3.  To look up a name, first ask the current scope.
    4.  When lookup fails, go up to the parent scope and keep looking.

I believe the above rules are common among many languages and are
commonly understood.  The only Python-specific parts are then:

    5.  The current scope is determined by the nearest enclosing 'def'.
    6.  These statements put a binding into the current scope:
        assignment (=), def, class, for, except, import

And that's all.

                    *       *       *

Given this model, all of the scoping questions that have been raised
have completely clear answers:

Example I

    >>> y = 3
    >>> def f():
    ...     print y
    ...
    >>> f()
    3

Example II

    >>> y = 3
    >>> def f():
    ...     print y
    ...     y = 1
    ...     print y
    ...
    >>> f()
    3
    1
    >>> y
    3

Example III

    >>> y = 3
    >>> def f():
    ...     exec "y = 2"
    ...     def g():
    ...         return y
    ...     return g()
    ...
    >>> f()
    2

Example IV

    >>> m = open('foo.py', 'w')
    >>> m.write('x = 1')
    >>> m.close()
    >>> def f():
    ...     x = 3
    ...     from foo import *
    ...     def g():
    ...         print x
    ...     g()
    ...
    >>> f()
    1

In Example II, the model addresses even the current situation that
sometimes surprises new users of Python.  Examples III and IV are the
current issues of contention about nested scopes.

                    *       *       *

It's good to start with a simple model for the user to understand;
the implementation can then do funky optimizations under the covers
so long as the model is preserved.  So for example, if the compiler
sees that there is no "import *" or "exec" in a particular scope, it
can short-circuit the lookup of local variables using fast locals.
But the ability of the compiler to make this optimization should only
affect performance, not the Python language model.

The model described above is approximately the one available in
Scheme.  It exactly reflects the environment-diagram model of scoping
as taught to most Scheme students, and i would argue that it is the
easiest to explain.

Some implementations of Scheme, such as STk, do what is described
above.  UMB Scheme does what Python does now: the use-before-binding
of 'y' in Example II would cause an error.  I was surprised that these
gave different behaviours; it turns out that the Scheme standard
actually forbids internal defines that are not at the beginning of a
function body, thus sidestepping the issue.  But we can't do this in
Python; assignment must be allowed anywhere.

Given that internal assignment has to have some meaning, the above
meaning makes the most sense to me.


-- ?!ng
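P.S.  For the curious, the four lookup rules can be sketched in a few
lines of Python.  This is a toy illustration of the environment-chain
model (the `Scope` class and its methods are invented for this sketch;
it is not how the interpreter actually implements scoping), replaying
Example II to show the behaviour the model predicts:

```python
# A toy model of lookup rules 1-4: scopes as dicts chained by a
# parent pointer.  Illustration only, not CPython's implementation.

class Scope:
    """Rule 1: a scope maps names to objects."""
    def __init__(self, parent=None):
        self.bindings = {}
        self.parent = parent  # Rule 2: each scope but the topmost has a parent.

    def bind(self, name, value):
        self.bindings[name] = value

    def lookup(self, name):
        if name in self.bindings:        # Rule 3: ask the current scope first.
            return self.bindings[name]
        if self.parent is not None:      # Rule 4: on failure, go up and keep looking.
            return self.parent.lookup(name)
        raise NameError(name)

# Example II replayed under this model:
module = Scope()
module.bind('y', 3)
f_locals = Scope(parent=module)       # the scope of the nearest enclosing 'def'
print(f_locals.lookup('y'))           # 3 -- found in the enclosing (module) scope
f_locals.bind('y', 1)                 # the assignment binds in the current scope
print(f_locals.lookup('y'))           # 1 -- the local binding now shadows it
print(module.lookup('y'))             # 3 -- the outer binding is untouched
```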