Title: Statically Nested Scopes
Author: Jeremy Hylton
[pre-PEP] This will break code... I'm not sure whether it's worth going down this path just for the sake of being able to define functions within functions.

Wouldn't it be a better idea to somehow add native acquisition to Python's objects? We already have a slot which implements the "contains" relationship. All we'd need is a way for a contained object to register itself with the container in a way that doesn't produce cycles.

-- Marc-Andre Lemburg
Business: http://www.lemburg.com/
Python Pages: http://www.lemburg.com/python/
"MAL" == M -A Lemburg
writes:
MAL> [pre-PEP] This will break code... I'm not sure whether it's
MAL> worth going down this path just for the sake of being able to
MAL> define functions within functions.

How will this break code? Any code written to use the new scoping rules will not work today.

Python already allows programs to define functions within functions. That's not at issue. The issue is how hard it is to use nested functions, including lambdas.

MAL> Wouldn't it be a better idea to somehow add native acquisition
MAL> to Python's objects ?

No.

Seriously, I don't see how acquisition addresses the same issues at all. Feel free to explain what you mean.

MAL> We already have a slot which implements the "contains"
MAL> relationship.  All we'd need is a way for a contained object to
MAL> register itself with the container in a way that doesn't
MAL> produce cycles.

The contains relationship has to do with container objects and their contents. A function's environment is not a container in the same sense, so I don't see how this is related.

As I noted in the PEP, I don't see a compelling reason to avoid cycles.

Jeremy
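The lambda awkwardness Jeremy alludes to is the classic default-argument workaround. A minimal sketch (runnable under modern Python, which later adopted nested scopes; the function names are mine):

```python
# Under Python 2.0 rules a lambda cannot see the enclosing function's
# variables, so the default-argument hack is needed:
def adder_today(n):
    return lambda x, n=n: x + n

# With statically nested scopes (the behavior modern Python adopted),
# the free variable n is simply found in the enclosing scope:
def adder_nested(n):
    return lambda x: x + n

print(adder_today(3)(4))   # 7
print(adder_nested(3)(4))  # 7
```

Without nested scopes, the second variant raises a NameError at call time, which is exactly the usability complaint.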
Jeremy Hylton wrote:
"MAL" == M -A Lemburg
writes: MAL> [pre-PEP] This will break code... I'm not sure whether it's MAL> worth going down this path just for the sake of being able to MAL> define functions within functions.
How will this break code? Any code written to use the new scoping rules will not work today.
Python already allows programs to define functions within functions. That's not at issue. The issue is how hard it is to use nested functions, including lambdas.
The problem is that with nested scoping, a function defined within another function will suddenly reference the variables of the enclosing function as globals and not the module globals... this could break code.

Another problem is that you can't reach out for the defining module's globals anymore (at least not in an easy way like today).
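A sketch of the kind of code MAL is worried about (names are mine). Under the Python 2.0 rules the inner function's free variable resolves to the module-level binding; under the proposed nested-scope rules (and in modern Python, which adopted them) it resolves to the enclosing function's local:

```python
x = "module"

def outer():
    x = "enclosing"
    def inner():
        # Python 2.0 rules: resolves to the module-level x ("module").
        # Nested-scope rules (and modern Python): resolves to outer's x.
        return x
    return inner()

print(outer())  # "enclosing" under nested scopes
```

Code that relied on the old resolution would silently change behavior, which is the compatibility concern.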
MAL> Wouldn't it be a better idea to somehow add native acquisition
MAL> to Python's objects ?
No.
Seriously, I don't see how acquisition addresses the same issues at all. Feel free to explain what you mean.
It's not related to *statically* nested scopes, but is to dynamically nested ones. Acquisition is basically about the same thing: you acquire attributes from containers. The only difference here is that the containment relationships are defined at run-time.
MAL> We already have a slot which implements the "contains"
MAL> relationship.  All we'd need is a way for a contained object to
MAL> register itself with the container in a way that doesn't
MAL> produce cycles.
The contains relationship has to do with container objects and their contents. A function's environment is not a container in the same sense, so I don't see how this is related.
As I noted in the PEP, I don't see a compelling reason to avoid cycles.
Ok, we have cycle GC, but why create cycles when you don't have to? (Python can also be run without GC, and then you'd run into problems...)

-- Marc-Andre Lemburg
"MAL" == M -A Lemburg
writes:
MAL> [pre-PEP] This will break code... I'm not sure whether it's
MAL> worth going down this path just for the sake of being able to
MAL> define functions within functions.
How will this break code? Any code written to use the new scoping rules will not work today.
Python already allows programs to define functions within functions. That's not at issue. The issue is how hard it is to use nested functions, including lambdas.
MAL> The problem is that with nested scoping, a function defined
MAL> within another function will suddenly reference the variables
MAL> of the enclosing function as globals and not the module
MAL> globals... this could break code.

That's right, it could -- in the unlikely case that someone has existing code today using nested functions where an intermediate function defines a local variable that shadows a variable defined in an enclosing scope. It should be straightforward to build a tool that would detect this case. It would be pretty poor programming style, so I think it would be fine to break backwards compatibility here.

MAL> Another problem is that you can't reach out for the defining
MAL> module globals anymore (at least not in an easy way like
MAL> today).

I think we would want to keep globals implemented just the way they are. The compiler would need to determine exactly which variables are accessed from enclosing scopes and which are globals.

MAL> Wouldn't it be a better idea to somehow add native acquisition
MAL> to Python's objects ?
Seriously, I don't see how acquisition addresses the same issues at all. Feel free to explain what you mean.
MAL> It's not related to *statically* nested scopes, but is to
MAL> dynamically nested ones.  Acquisition is basically about the
MAL> same thing: you acquire attributes from containers.  The only
MAL> difference here is that the containment relationships are
MAL> defined at run-time.

Static scoping and dynamic scoping are entirely different beasts, useful for different things. I want to fix, among other things, lambdas. That's a static issue.

MAL> We already have a slot which implements the "contains"
MAL> relationship.  All we'd need is a way for a contained object to
MAL> register itself with the container in a way that doesn't
MAL> produce cycles.
The contains relationship has to do with container objects and their contents. A function's environment is not a container in the same sense, so I don't see how this is related.
As I noted in the PEP, I don't see a compelling reason to avoid cycles.
MAL> Ok, we have cycle GC, but why create cycles when you don't have
MAL> to (Python can also be run without GC and then you'd run into
MAL> problems...) ?

If we can avoid cycles, sure. I would prefer a simple design that allowed cycles to a complicated design that avoided them. Exactly where to draw the line between simple and complex is a matter of taste, of course.

Jeremy
"M.-A. Lemburg"
The problem is that with nested scoping, a function defined within another function will suddenly reference the variables of the enclosing function
This could be avoided by requiring that variables which are to be visible in an inner scope be marked somehow in the scope where they are defined. I don't think it's a serious enough problem to be worth fixing that way, though.

Greg Ewing, Computer Science Dept,   +--------------------------------------+
University of Canterbury,            | A citizen of NewZealandCorp, a       |
Christchurch, New Zealand            | wholly-owned subsidiary of USA Inc.  |
greg@cosc.canterbury.ac.nz           +--------------------------------------+
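As it happens, the marker Python eventually grew works in the opposite direction from Greg's suggestion: the much later `nonlocal` statement (Python 3) is declared in the *inner* scope rather than in the defining one, and is only required for rebinding. A minimal sketch under modern semantics:

```python
def counter():
    n = 0
    def bump():
        nonlocal n  # the inner scope marks which n it means
        n += 1
        return n
    return bump

c = counter()
print(c(), c())  # 1 2
```

Plain reads of an enclosing variable need no marker at all; only assignment does.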
Greg Ewing wrote:
"M.-A. Lemburg"
: The problem is that with nested scoping, a function defined within another function will suddenly reference the variables of the enclosing function
This could be avoided by requiring that variables which are to be visible in an inner scope be marked somehow in the scope where they are defined.
I don't think it's a serious enough problem to be worth fixing that way, though.
It may not look serious, but changing the Python lookup scheme is, since many inspection tools rely on and reimplement exactly that scheme. With nested scopes, there would be next to no way to emulate the lookups using these tools.

To be honest, I don't think static nested scopes buy us all that much. You can do the same now by using keyword arguments, which isn't all that nice, but works great and makes the scope clearly visible.

Dynamic nested scopes is another topic... those are *very* useful; especially when it comes to avoiding global variables and implementing programs which work using control objects instead of global function calls.

-- Marc-Andre Lemburg
[MAL]
Dynamic nested scopes is another topic... those are *very* useful; especially when it comes to avoiding global variables and implementing programs which work using control objects instead of global function calls.
Marc-Andre, what are Dynamic nested scopes?

--Guido van Rossum (home page: http://www.python.org/~guido/)
On Wed, 1 Nov 2000, Guido van Rossum wrote:
[MAL]
Dynamic nested scopes is another topic... those are *very* useful; especially when it comes to avoiding global variables and implementing programs which work using control objects instead of global function calls.
Marc-Andre, what are Dynamic nested scopes?
If MAL means dynamic scoping (which I understood he does), then this
simply means:
when looking for a variable "foo", you first search for it in the local
namespace. If not there, the *caller's* namespace, and so on. In the
end, the caller is the __main__ module, and if not found there, it is
a NameError.
--
Moshe Zadka
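Moshe's caller-by-caller lookup can be simulated with an explicit stack of environments. A toy illustration only (this is NOT how Python resolves names): dynamic scoping resolves a free variable by walking the chain of callers' environments instead of the source text.

```python
# A stack of environments, one per active call; __main__ is at the bottom.
env_stack = [{"foo": "bound in __main__"}]

def lookup(name):
    # Search the innermost (most recent) caller's environment first.
    for frame in reversed(env_stack):
        if name in frame:
            return frame[name]
    raise NameError(name)

def callee():
    return lookup("foo")        # resolved in whoever called us

def caller():
    env_stack.append({"foo": "bound in caller"})
    try:
        return callee()
    finally:
        env_stack.pop()

print(caller())   # "bound in caller"
print(callee())   # "bound in __main__"
```

The same call to callee() returns different values depending on the call chain, which is exactly why dynamic scoping makes modular reasoning hard.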
Moshe Zadka wrote:
On Wed, 1 Nov 2000, Guido van Rossum wrote:
[MAL]
Dynamic nested scopes is another topic... those are *very* useful; especially when it comes to avoiding global variables and implementing programs which work using control objects instead of global function calls.
Marc-Andre, what are Dynamic nested scopes?
If MAL means dynamic scoping (which I understood he does), then this simply means:
when looking for a variable "foo", you first search for it in the local namespace. If not there, the *caller's* namespace, and so on. In the end, the caller is the __main__ module, and if not found there, it is a NameError.
That would be one application, yes.

With dynamic scoping I meant that the context of a lookup is defined at run-time, by explicitly or implicitly hooking together objects which then define the nesting.

Environment acquisition is an example of such dynamic scoping: attribute lookups are passed on to the outer scope in case they don't resolve on the inner scope. Say you have an object a with a.x = 1, and no other object defines .x. Then a.b.c.d.x will result in the lookups

   1. a.b.c.d.x
   2. a.b.c.x
   3. a.b.x
   4. a.x -> 1

This example uses attribute lookup -- the same can be done for other nested objects by explicitly specifying the nesting relationship.

Jim's ExtensionClasses allow the above by using a lot of wrappers around objects -- would be nice if we could come up with a more general scheme which then also works for explicit nesting relationships (e.g. dictionaries which get hooked together -- Jim's MultiMapping does this).

-- Marc-Andre Lemburg
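MAL's a.b.c.d.x walk can be sketched in a few lines with `__getattr__`. This is a toy (it is not Jim Fulton's actual ExtensionClass API): attribute lookups that fail locally are retried on the container, walking outward.

```python
class Acquirer:
    """Toy environment acquisition: unresolved attributes are
    acquired from the container, recursively."""
    def __init__(self, container=None):
        self._container = container

    def __getattr__(self, name):
        # Only called when normal attribute lookup fails on self.
        if self._container is not None:
            return getattr(self._container, name)
        raise AttributeError(name)

a = Acquirer()
a.x = 1
a.b = Acquirer(a)
a.b.c = Acquirer(a.b)
a.b.c.d = Acquirer(a.b.c)
print(a.b.c.d.x)  # 1, found by walking d -> c -> b -> a
```

Because the containment links are ordinary run-time attributes, the "scope" of the lookup is whatever the objects happen to be hooked to when the access occurs.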
[MAL]
Dynamic nested scopes is another topic... those are *very* useful; especially when it comes to avoiding global variables and implementing programs which work using control objects instead of global function calls.
Marc-Andre, what are Dynamic nested scopes?
[Moshe]
If MAL means dynamic scoping (which I understood he does), then this simply means:
when looking for a variable "foo", you first search for it in the local namespace. If not there, the *caller's* namespace, and so on. In the end, the caller is the __main__ module, and if not found there, it is a NameError.
Ah, yuck. For variable namespaces, this is a really bad idea. For certain other things (e.g. try/except blocks, Jeremy's example) it is of course OK.

[MAL]
That would be one application, yes.
With dynamic scoping I meant that the context of a lookup is defined at run-time, by explicitly or implicitly hooking together objects which then define the nesting.
But certainly you're not thinking of doing this to something as basic to Python's semantics as local/global variable lookup!
Environment acquisition is an example of such dynamic scoping: attribute lookups are passed on to the outer scope in case they don't resolve on the inner scope, e.g. say you have object a with a.x = 1; all other objects don't define .x. Then a.b.c.d.x will result in lookups

   1. a.b.c.d.x
   2. a.b.c.x
   3. a.b.x
   4. a.x -> 1
This seems only remotely related to dynamic scopes. There are namespaces here, but not "scopes" as I think of them: a scope defines the validity of an unadorned variable. Static scoping means that this is determined by the program source structure. Dynamic scoping means that this is determined by run-time mechanisms. Python uses static scoping for local variables, but a dynamic scoping mechanism for non-local variable references: first it does a module-global lookup, then it looks for a built-in name. When we're not talking about simple name lookup, we should speak of namespaces (which can also have lifetimes).
This example uses attribute lookup -- the same can be done for other nested objects by explicitly specifying the nesting relationship.
Jim's ExtensionClasses allow the above by using a lot of wrappers around objects -- would be nice if we could come up with a more general scheme which then also works for explicit nesting relationships (e.g. dictionaries which get hooked together -- Jim's MultiMapping does this).
I think we're getting way off track here. --Guido van Rossum (home page: http://www.python.org/~guido/)
I don't think I buy your explanation that Python uses dynamic scope for resolving globals. As I understand the mechanism, the module namespace and builtins namespace are those for the module in which the function was defined. If so, this is still static scope.

Here's a quick example that illustrates the difference:

module foo:
-----------------------
a = 12

def add(b):
    return a + b
-----------------------

module bar:
-----------------------
from foo import add

a = -1
print add(1)
-----------------------

If Python used static scope, "python bar.py" should print 13 (it does). If it used dynamic scope, I would expect the answer to be 0.

Jeremy
I don't think I buy your explanation that Python uses dynamic scope for resolving globals.
That's not what I meant, but I expressed it clumsily. Perhaps the terminology is just inadequate. I simply meant that builtins can be overridden by module-globals -- but only by the globals of the module whose namespace is being searched, so that part is still static. Let's drop this thread...

--Guido van Rossum (home page: http://www.python.org/~guido/)
[Jeremy]
I don't think I buy your explanation that Python uses dynamic scope for resolving globals.
It's dynamic in the shallow (but real!) sense that it can't be determined until runtime to which of {module scope, builtin scope} a global name resolves today. Indeed, in

    def f():
        print len

the resolving scope for "len" may even change with each invocation of f(), depending on who's playing games with the containing module's __dict__. That's not "dynamic scoping" in the proper sense of the term, but it's sure dynamic!

words-lead-to-more-words-so-become-one-with-the-essence-instead<wink>-ly y'rs - tim
Moshe's explanation of "dynamic scope" is the definition I've seen in every programming language text I've ever read. The essence of the defintion, I believe, is that a free variable is resolved in the environment created by the current procedure call stack. I think it muddles the discussion to use "dynamic scope" to describe acquistion, though it is a dynamic feature. Python using dynamic scope for exceptions. If any exception is raised, the exception handler that is triggered is determined by the environment in which the procedure was called. There are few languages that use dynamic scoping for normal name resolution. Many early Lisp implementations did, but I think all the modern ones use lexical scoping instead. It is hard to write modular code using dynamic scope, because the behavior of a function with free variables can not be determined by the module that defines it. Not saying it isn't useful, just that it makes it much harder to reason about how a particular modular or function works in isolation from the rest of the system. Jeremy
Moshe's explanation of "dynamic scope" is the definition I've seen in every programming language text I've ever read. The essence of the definition, I believe, is that a free variable is resolved in the environment created by the current procedure call stack.
Ah. The term "free variable" makes sense here.
I think it muddles the discussion to use "dynamic scope" to describe acquisition, though it is a dynamic feature.
Python uses dynamic scope for exceptions. If an exception is raised, the exception handler that is triggered is determined by the environment in which the procedure was called.
Then I think this also muddles the discussion, since the search for exception handlers has nothing to do with free variable lookup.
There are few languages that use dynamic scoping for normal name resolution. Many early Lisp implementations did, but I think all the modern ones use lexical scoping instead. It is hard to write modular code using dynamic scope, because the behavior of a function with free variables can not be determined by the module that defines it. Not saying it isn't useful, just that it makes it much harder to reason about how a particular modular or function works in isolation from the rest of the system.
I think Python 3000 ought to use totally static scoping. That will make it possible to optimize code using built-in names!

--Guido van Rossum (home page: http://www.python.org/~guido/)
On Thu, 2 Nov 2000, Guido van Rossum wrote:
I think Python 3000 ought to use totally static scoping. That will make it possible to optimize code using built-in names!
Isn't that another way of saying you want the builtin names to be
part of the language definition? Part of the advantage of today's
method is that new builtins can be added without any problems.
no-clear-win-either-way-ly y'rs, Z.
--
Moshe Zadka
Moshe Zadka wrote:
On Thu, 2 Nov 2000, Guido van Rossum wrote:
I think Python 3000 ought to use totally static scoping. That will make it possible to optimize code using built-in names!
Isn't that another way of saying you want the builtin names to be part of the language definition? Part of the advantage of today's method is that new builtins can be added without any problems.
+1.

Wouldn't it be more Python-like to provide the compiler with a set of known-to-be-static global name bindings? A simple way of avoiding hand-written optimizations like this one:

    def f(x, str=str):
        return str(x) + '!'

would then be to have the compiler look up "str" in the globals() passed to it and assign the found value to the constants of the function, provided that "str" appears in the list of known-to-be-static global name bindings (perhaps as an optional additional parameter to compile() with some reasonable default in sys.staticsymbols).

-- Marc-Andre Lemburg
On Thu, 2 Nov 2000, Guido van Rossum wrote:
I think Python 3000 ought to use totally static scoping. That will make it possible to optimize code using built-in names!
[Moshe]
Isn't that another way of saying you want the builtin names to be part of the language definition? Part of the advantage of today's method is that new builtins can be added without any problems.
The built-in names have always been part of the language definition in my mind. The way they are implemented doesn't reflect this, but that's just an implementation detail. How would you like it if something claimed to be Python but didn't support len()? Or map()? That doesn't mean you can't add new built-ins, and I don't think that the new implementation will prevent that -- but it *will* assume that you don't mess with the definitions of the existing built-ins. Of course you still will be able to define functions whose name overrides a built-in -- in that case the compiler can see that you're doing that (because it knows the scope rules and can see what you are doing). But you won't be able to confuse someone else's module by secretly sticking a replacement built-in into their module's __dict__. --Guido van Rossum (home page: http://www.python.org/~guido/)
"GvR" == Guido van Rossum
writes:
GvR> The built-in names have always been part of the language
GvR> definition in my mind.  The way they are implemented doesn't
GvR> reflect this, but that's just an implementation detail.  How
GvR> would you like it if something claimed to be Python but
GvR> didn't support len()?  Or map()?

GvR> That doesn't mean you can't add new built-ins, and I don't
GvR> think that the new implementation will prevent that -- but it
GvR> *will* assume that you don't mess with the definitions of the
GvR> existing built-ins.

GvR> Of course you still will be able to define functions whose
GvR> name overrides a built-in -- in that case the compiler can
GvR> see that you're doing that (because it knows the scope rules
GvR> and can see what you are doing).  But you won't be able to
GvR> confuse someone else's module by secretly sticking a
GvR> replacement built-in into their module's __dict__.

I'm a little confused. I've occasionally done the following within an application:

    ----------driver.py----------
    # need to override built-in open() to do extra debugging
    def debugging_open(filename, mode, bufsize):
        # ...

    if EXTRA_DEBUGGING:
        import __builtin__
        __builtin__.__dict__['open'] = debugging_open
    -------- snip snip ----------

Would this be illegal? Would other modules in my application (even if imported from the standard library!) automatically get debugging_open() for open() like they do now?

-Barry
"GvR" == Guido van Rossum
writes:
GvR> Of course you still will be able to define functions whose GvR> name overrides a built-in -- in that case the compiler can GvR> see that you're doing that (because it knows the scope rules GvR> and can see what you are doing). But you won't be able to GvR> confuse someone else's module by secretly sticking a GvR> replacement built-in into their module's __dict__.
[A Craven Dog overrides builtin open...]
Would this be illegal? Would other modules in my application (even if imported from the standard library!) automatically get debugging_open() for open() like they do now?
I have always found it very elegant & powerful that keyword "import" calls builtin __import__, and that well-behaved C code goes through the same hook.

In a similar vein, I have for quite a while wanted to experiment with mountable virtual file systems, so that I can "mount" a (for example) MetaKit database at /cheesewhiz, and other modules in my app, when navigating into /cheesewhiz, will, unbeknownst to them, be reading from the MetaKit DB (these ideas leaked from tcl-land by Jean-Claude).

I most definitely appreciate that these facilities are dangerous and this type of stuff tends to be abused gratuitously (e.g., most import hacks), but disallowing them might be considered a gratuitous limitation (heck -- Barry knows how to bypass the governor on every cheezewhizzer ever manufactured).

- Gordon
[Barry]
I'm a little confused. I've occasionally done the following within an application:
----------driver.py----------
# need to override built-in open() to do extra debugging
def debugging_open(filename, mode, bufsize):
    # ...

if EXTRA_DEBUGGING:
    import __builtin__
    __builtin__.__dict__['open'] = debugging_open
-------- snip snip ----------
Would this be illegal? Would other modules in my application (even if imported from the standard library!) automatically get debugging_open() for open() like they do now?
That's up for discussion. Note that the open() function is special in this respect -- I don't see you doing the same to range() or hash(). If this is deemed a useful feature (for open()), we can make a rule about which built-ins you cannot override like this and which ones you can. --Guido van Rossum (home page: http://www.python.org/~guido/)
"GvR" == Guido van Rossum
writes:
GvR> That's up for discussion.  Note that the open() function is
GvR> special in this respect -- I don't see you doing the same to
GvR> range() or hash().

Me neither, but special-casing overridability seems like a fragile hack.

GvR> If this is deemed a useful feature (for open()), we can make
GvR> a rule about which built-ins you cannot override like this
GvR> and which ones you can.

Hmm, maybe we need __open__() and an open-hook? ;)

-Barry
On Fri, Nov 03, 2000 at 10:36:22AM -0500, Guido van Rossum wrote:
[Barry]
Would this [replacing builtins] be illegal?
That's up for discussion. Note that the open() function is special in this respect -- I don't see you doing the same to range() or hash().
Eep. I don't care much about scoping, so I'm not going to say much about
that, but I certainly do care about Python's flexibility. One of the great
things is that there are so few special cases that (nearly) everything
is delightfully consistent. Being able to override open() but not hash() or
range() sounds directly contrary to that, at least to me. Being able to
change __builtins__ *just like any other dict* strikes me as terribly
Pythonic, though I realize this is probably contrary to Guido's view of
Python :-) It also makes for instance 'exec' and 'eval' terribly obvious.
I understand where the wish to restrict replacement and shadowing of
builtins comes from, but I'm hoping here that this is going to be
'optional', like Perl's -w and 'use strict'. Defaulting to maximum
simplicity and maximum flexibility (in that order:) but with optional
warnings (when shadowing/modifying a builtin) and optimizations (using
constants for builtins, when not modifying or shadowing them, for instance.)
Just-my-fl.0,04-ly y'rs,
--
Thomas Wouters
Guido> If this is deemed a useful feature (for open()), we can make a
Guido> rule about which built-ins you cannot override like this and
Guido> which ones you can.

I thought we were all adults... For Py3k I think it should be sufficient to define the semantics of the builtin functions so that if people want to override them they can, but that overriding them in incompatible ways is likely to create some problems. (They might have to run with a "no optimize" flag to keep the compiler from assuming semantics, for instance.) I see no particular reason to remove the current behavior unless there are clear instances where something important is not going to work properly.

Modifying builtins seems to me to be akin to linking a C program with a different version of malloc. As long as the semantics of the new functions remain the same as the definition, everyone's happy. You can have malloc leave a logfile behind or keep histograms of allocation sizes. If someone links in a malloc library that only returns a pointer to a region that's only half the requested size, though, you're likely to run into problems.

Skip
[Making builtins static...]

What about the idea to have the compiler make the decision whether a global symbol may be considered static based on a dictionary? In non-optimizing mode this dictionary would be empty, with -O it would include all builtins which should never be overloaded, and with -OO even ones which can be overloaded, such as open(), in addition to some standard modules which are known to only contain static symbols.

Perhaps this needs some additional help from the "define" statement we planned as a dynamic compiler interface?! ...

    define static_symbols = *             # all globals in this module
    define static_symbols = func1, func2  # only these two

etc.

-- Marc-Andre Lemburg
Guido> If this is deemed a useful feature (for open()), we can make a
Guido> rule about which built-ins you cannot override like this and
Guido> which ones you can.
[Skip]
I thought we were all adults...
And consenting as well... :-)
For Py3k I think it should be sufficient to define the semantics of the builtin functions so that if people want to override them they can, but that overriding them in incompatible ways is likely to create some problems. (They might have to run with a "no optimize" flag to keep the compiler from assuming semantics, for instance.) I see no particular reason to remove the current behavior unless there are clear instances where something important is not going to work properly.
Modifying builtins seems to me to be akin to linking a C program with a different version of malloc. As long as the semantics of the new functions remain the same as the definition, everyone's happy. You can have malloc leave a logfile behind or keep histograms of allocation sizes. If someone links in a malloc library that only returns a pointer to a region that's only half the requested size though, you're likely to run into problems.
Actually, the C standard specifically says you are *not* allowed to override standard library functions like malloc().

I'm thinking of the example of the rules in Fortran for intrinsic functions (Fortran's name for built-ins). Based on what Tim has told me, I believe that Fortran by default assumes that you're not doing anything funky with intrinsics (like sin, cos, tan), so it can use a shortcut, e.g. inline them. But there are also real functions by these names in the Fortran standard library, and you can call those by declaring e.g. "external function sin". (There may also be an explicit way to say that you're happy with the intrinsic one.) I believe that when you use the external variant, they may be overridden by the user.

I'm thinking of something similar here for Python. If the bytecode compiler knows that the builtins are vanilla, it can generate better (== more efficient) code for e.g.

    for i in range(10): ...

Ditto for expressions like len(x) -- the len() operation is typically so fast that the cost is dominated by the two dict lookup operations (first in globals(), then in __builtins__).

Why am I interested in this? People interested in speed routinely use hacks that copy a built-in function into a local variable so that they don't have dict lookups in their inner loop; it's really silly to have to do this, and if certain built-ins were recognized by the compiler it wouldn't be necessary. There are other cases where this is not so easy without much more analysis; but the built-ins (to me) seem low-hanging fruit. (Search the archives for that term, I've used it before in this context.)

I assume that it's *really* unlikely that there are people patching the __builtin__ module to replace the functions that are good inline candidates (range, len, id, hash and so on). So I'm interested in complicating the rules here. I'd be happy to make an explicit list of those builtins that should not be messed with, as part of the language definition.
Programs that *do* mess with these have undefined semantics.

--Guido van Rossum (home page: http://www.python.org/~guido/)
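The micro-optimization hack Guido describes looks like this in practice (still works in modern Python; the function is mine for illustration):

```python
# Copy the builtin len into a local -- here via a default argument --
# so the inner loop pays a fast local access instead of the
# globals()-then-__builtins__ dict lookups on every iteration.
def count_short(strings, _len=len):
    n = 0
    for s in strings:
        if _len(s) < 3:   # _len is a local, resolved without dict lookups
            n += 1
    return n

print(count_short(["a", "abc", "hi", "long"]))  # 2
```

A compiler that could assume len() is never rebound would make this transformation automatic, which is exactly the point of the protected-builtins list.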
Guido> If this is deemed a useful feature (for open()), we can make a
Guido> rule about which built-ins you cannot override like this and
Guido> which ones you can.
I think I would be satisfied with just those builtins which involve interfaces to the external world. Where Java allows such deviance, they tend to provide an API whereby you can supply or register a factory to override or extend the default behavior. In principle this seems less hackish than stomping on builtins; in practice I doubt it makes much difference ;-).
[Skip]
I thought we were all adults...
Snort. One of my kids will be voting on Tuesday, but I *still* don't know what I want to be when I grow up. - Gordon
Guido:
I'd be happy to make an explicit list of those builtins that should not be messed with
There's a precedent for this in Scheme, which has a notion of "integrable procedures". As for the rest, with static scoping it will be possible to make access to builtins just as efficient as locals, while still allowing them to be rebound, so there's no reason why __builtin__.__dict__.open = foo can't continue to work, if so desired. Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg@cosc.canterbury.ac.nz +--------------------------------------+
Guido:
I'd be happy to make an explicit list of those builtins that should not be messed with
[Greg Ewing]
There's a precedent for this in Scheme, which has a notion of "integrable procedures".
Good!
As for the rest, with static scoping it will be possible to make access to builtins just as efficient as locals, while still allowing them to be rebound, so there's no reason why __builtin__.__dict__.open = foo can't continue to work, if so desired.
I'm not sure what you mean. With integrable procedures (whatever they may be :-) I believe this is possible. Without them, the lookup in globals() can be skipped for builtins, but a local is accessed with *zero* dict lookups -- how would you do this while still supporting __builtin__.__dict__.open = foo? have "hookable" dictionaries? (Those would solve a bunch of problems, but are not under consideration at the moment.) --Guido van Rossum (home page: http://www.python.org/~guido/)
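The asymmetry Guido describes is easy to observe with the standard `dis` module (a sketch with names of my choosing): locals compile to LOAD_FAST, a plain array index into the frame, while builtins go through LOAD_GLOBAL, a dict lookup at run time:

```python
import dis
import io

def f(x):
    y = x + 1            # x and y are locals: LOAD_FAST, an array index
    return len(str(y))   # len and str are builtins: LOAD_GLOBAL, dict lookups

buf = io.StringIO()
dis.dis(f, file=buf)
listing = buf.getvalue()
print("LOAD_FAST" in listing, "LOAD_GLOBAL" in listing)  # True True
```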
Guido:
the lookup in globals() can be skipped for builtins, but a local is accessed with *zero* dict lookups -- how would you do this while still supporting __builtin__.__dict__.open = foo? have "hookable" dictionaries?
With fully static scoping, I envisage that all three kinds of scope (local, module and builtin) would be implemented in essentially the same way, i.e. as arrays indexed by integers. That being the case, all you need to do is arrange for the __builtin__ module and the global scope to be one and the same thing, and __builtin__.open = foo will work just fine (assuming open() isn't one of the special inlinable functions). Getting __builtin__.__dict__['open'] = foo to work as well may require some kind of special dictionary-like object. But then you're going to need that anyway if you want to continue to support accessing module namespaces as if they are dictionaries. Whether it's worth continuing to support that in Py3k is something that can be debated separately.
integrable procedures (whatever they may be :-)
In the Revised^n Report, some builtin procedures are declared to be "integrable", meaning that the compiler is allowed to assume that they have their usual definitions and optimise accordingly. (This is quite important in Scheme, even more so than in Python, when you consider that almost every operation in Scheme, including '+', is a procedure call!) Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg@cosc.canterbury.ac.nz +--------------------------------------+
[Jeremy Hylton]
... There are few languages that use dynamic scoping for normal name resolution. Many early Lisp implementations did, but I think all the modern ones use lexical scoping instead.
I believe all early Lisps were dynamically scoped. Scheme changed that, and Common Lisp followed. Many interpreted languages *start* life with dynamic scoping because it's easy to hack together, but that never lasts. REBOL went thru this a couple years ago, switching entirely from dynamic to lexical before its first public release, and breaking most existing programs in the process. Perl also started with dynamic scoping, but, in Perl-like fashion, Perl5 *added* lexical scoping on top of dynamic ("local" vars use dynamic scoping; "my" vars lexical; and all Perl5 gotcha guides stridently recommend never using "local" anymore). Here's some Perl5:

    $i = 1;
    sub f { local($i) = 2; &g(); }
    sub g { return $i; }
    print "i at start is $i\n";
    print "i in g called directly is ", &g(), "\n";
    print "i in g called indirectly via f is ", &f(), "\n";
    print "i at end is $i\n";

Here's what it prints:

    i at start is 1
    i in g called directly is 1
    i in g called indirectly via f is 2
    i at end is 1
It is hard to write modular code using dynamic scope, because the behavior of a function with free variables can not be determined by the module that defines it.
As shown above; dynamic scoping is a nightmare even in the absence of nested functions.
Not saying it isn't useful, just that it makes it much harder to reason about how a particular module or function works in isolation from the rest of the system.
People who spend too much time writing meta-systems <wink> overestimate its usefulness. Most programmers who need this kind of effect would be much better off explicitly passing a dict of name->value mappings explicitly manipulated, or simply passing the values of interest (if I want a function that sucks the value of "i" out of its caller, what clearer way than for the caller to pass i in the arglist?! dynamic scoping is much subtler, of course -- it sucks the value of "i" out of *whatever* function up the call chain happened to define an i "most recently"). Lexical scoping is much tamer, and, indeed, Python is one of the few modern languages that doesn't support it. Last time Guido and I hand-wrung over this, that was the chief reason to implement it: newcomers are endlessly surprised that when they write functions that visually nest, their scopes nevertheless don't nest. There's certainly nothing novel or unexpected about lexical scoping anymore. The problem unique to Python is its rule for determining which vrbls are local; in Lisps, REBOL and Perl, you have to explicitly name all vrbls local to a given scope, which renders "how do you rebind a non-local name?" a non-issue. has-the-feel-of-inevitability-ly y'rs - tim
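Tim's Perl demonstration above can be mimicked in Python with an explicit stack of dynamic bindings. A toy sketch (all names are mine) reproducing the same surprising behavior, where what `g` returns depends on who happened to call it:

```python
# A stack of name->value frames standing in for dynamically scoped variables.
dynamic = [{"i": 1}]

def lookup(name):
    # Dynamic scoping: search the *call-time* binding stack, innermost first.
    for frame in reversed(dynamic):
        if name in frame:
            return frame[name]
    raise NameError(name)

def g():
    return lookup("i")

def f():
    dynamic.append({"i": 2})  # like Perl's local($i) = 2
    try:
        return g()
    finally:
        dynamic.pop()

print("i in g called directly is", g())          # 1
print("i in g called indirectly via f is", f())  # 2
```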
Jeremy Hylton
There are few languages that use dynamic scoping for normal name resolution. Many early Lisp implementations did, but I think all the modern ones use lexical scoping instead. It is hard to write modular code using dynamic scope, because the behavior of a function with free variables can not be determined by the module that defines it.
Correct. Based on my LISP experience, I would be strongly opposed to dynamic scoping. That path has been tried and found wanting. -- <a href="http://www.tuxedo.org/~esr/">Eric S. Raymond</a> "Guard with jealous attention the public liberty. Suspect every one who approaches that jewel. Unfortunately, nothing will preserve it but downright force. Whenever you give up that force, you are inevitably ruined." -- Patrick Henry, speech of June 5 1788
On Thu, 2 Nov 2000, Jeremy Hylton wrote:
There are few languages that use dynamic scoping for normal name resolution. Many early Lisp implementations did, but I think all the modern ones use lexical scoping instead. It is hard to write modular code using dynamic scope, because the behavior of a function with free variables can not be determined by the module that defines it. Not saying it isn't useful, just that it makes it much harder to reason about how a particular module or function works in isolation from the rest of the system.
There's a seminal steele and sussman paper, _The Art of the Interpreter, or the Modularity Complex (Parts Zero, One, and Two)_, which, among other things, introduces closures and describes why dynamic scoping is evil. (The paper sketches out an elementary, lexically scoped lisp interpreter which i think is the precursor to scheme.) It's been over 20 years since i read it, but i recall something known as the funarg problem, where dynamic scoping + recursion (maybe, + functions as first class objects) inherently breaks certain parameter passing arrangements. If you're interested, see: ftp://publications.ai.mit.edu/ai-publications/0-499/AIM-453.ps (I'm working at home today, so can't even glance at it to refresh my memory, and oughtn't at the moment, anyway.) In any case, dynamic scoping breaks locality of reference for just about any definition of "locality". With lexical scoping, you can trace through the text of the program to track down implicit references. Even with acquisition, the containment connections are made explicitly (though it seems magic with web programming, because the zope publisher applies .__of__() for you), and can be nailed down by introspection on the objects (.aq_base, .aq_parent, .aq_inner...). With dynamic scoping, variable references go up the stack to the context of *any* intervening call frame, with no way for the programmer to know where it's being resolved. In practice, even without the funarg breakage, it doesn't scale up - it's too promiscuous a state-communication mechanism. You can work around it - a lot of emacs lisp code does so - but it's a terrible burden, not a benefit. Ken
On Tue, 14 Nov 2000, Ken Manheimer wrote:
In any case, dynamic scoping breaks locality of reference for just about any definition of "locality".
(Huh - i realized that "locality of reference" isn't the right phrase, not sure what is. I was trying to refer to something along the lines of modularity - basically, the virtue of lexical scoping is that you can, eventually, trace the connections by reading the program text. Dynamic scoping is not so well behaved - any routine that calls your routine may contain a variable whose value you're expecting to get from elsewhere. It's too promiscuous...) Ken
"MZ" == Moshe Zadka
writes:
MZ> If MAL means dynamic scoping (which I understood he does),
MZ> then this simply means:
MZ> when looking for a variable "foo", you first search for it in
MZ> the local namespace. If not there, the *caller's* namespace,
MZ> and so on. In the end, the caller is the __main__ module, and
MZ> if not found there, it is a NameError.

This is how Emacs Lisp behaves, and it's used all the time in ELisp programs. On the one hand it's quite convenient for customizing the behavior of functions. On the other hand, it can make documenting the interface of functions quite difficult because all those dynamically scoped variables are now part of the function's API. It's interesting to note that many ELispers really hate dynamic scoping and pine for a move toward lexical scoping. I'm not one of them. I'm not as concerned about "fixing" nested functions because I hardly ever use them, and rarely see them much in Python code. Fixing lambdas would be nice, but since Guido considers lambdas themselves a mistake, and given that lambda use /can/ be a performance hit in some situations, does it make sense to change something as fundamental as Python's scoping rules to fix this eddy of the language? -Barry
[Barry, on dynamic scopes]
This is how Emacs Lisp behaves, and it's used all the time in ELisp programs. On the one hand it's quite convenient for customizing the behavior of functions. On the other hand, it can make documenting the interface of functions quite difficult because all those dynamically scoped variables are now part of the function's API.
It's interesting to note that many ELispers really hate dynamic scoping and pine for a move toward lexical scoping. I'm not one of them.
I don't care what you pine for in ELisp, but for Python this would be a bad idea.
I'm not as concerned about "fixing" nested functions because I hardly ever use them, and rarely see them much in Python code. Fixing lambdas would be nice, but since Guido considers lambdas themselves a mistake, and given that lambda use /can/ be a performance hit in some situations, does it make sense to change something as fundamental as Python's scoping rules to fix this eddy of the language?
I referred to this in our group meeting as "fixing lambda" because that's where others seem to need it most often. But it is a real problem that exists for all nested functions. So let me rephrase that: "fixing nested function definitions" is useful, if it can be done without leaking memory, without breaking too much existing code, and without slowing down code that doesn't use the feature. --Guido van Rossum (home page: http://www.python.org/~guido/)
It may not look serious, but changing the Python lookup scheme is, since many inspection tools rely on and reimplement exactly that scheme. With nested scopes, there would be next to no way to emulate the lookups using these tools.
So fix the tools.
To be honest, I don't think static nested scopes buy us all that much. You can do the same now, by using keyword arguments which isn't all that nice, but works great and makes the scope clearly visible.
Yes. It's a hack that gets employed over and over. And it has certain problems. We added 'import as' to get rid of a common practice that was perceived unclean. Maybe we should support nested scopes to get rid of another unclean common practice? I'm not saying that we definitely should add this to 2.1 (there's enough on our plate already) but we should at least consider it, and now that we have cycle GC, the major argument against it (that it causes cycles) is gone... --Guido van Rossum (home page: http://www.python.org/~guido/)
On Wed, 1 Nov 2000, Guido van Rossum wrote:
I'm not saying that we definitely should add this to 2.1 (there's enough on our plate already) but we should at least consider it, and now that we have cycle GC, the major argument against it (that it causes cycles) is gone...
This is the perfect moment to ask: what do we have on our plates for 2.1?
Shouldn't we have a list of goals for it or something? As a first-order
approximation, what PEPs are expected to be included? And, most the
conspiracy-theory question, what are Digital Creations' goals for Python?
We now return you to our regularily scheduled bug fixing.
--
Moshe Zadka
On Thu, Nov 02, 2000 at 05:29:55PM +0200, Moshe Zadka wrote:
Shouldn't we have a list of goals for [Python 2.1] or something? As a first-order approximation, what PEPs are expected to be included? And,
Stuff I personally want to get done: * Finish PEP 222, "Web Programming Improvements" and implement whatever emerges from it. * Write a PEP on using Distutils to build the modules that come with Python, and implement it if accepted. * Work on something CPAN-like. This may or may not have repercussions for the core; I don't know. --amk
[Andrew]
Stuff I personally want to get done: * Finish PEP 222, "Web Programming Improvements" and implement whatever emerges from it.
I just skimmed PEP 222. I agree that the classes defined by cgi.py are unnecessarily arcane. I wonder if it isn't better to just start over rather than trying to add yet another new class to the already top-heavy CGI module??? Regarding file uploads: you *seem* to be proposing that uploaded files should be loaded into memory only. I've got complaints from people who are uploading 10 Mb files and don't appreciate their process growing by that much. Perhaps there should be more options, so that the caller can control the disposition of uploads more carefully? What's wrong with subclassing? Maybe two example subclasses should be provided? Regarding templating -- what's wrong with HTMLgen as a starting point? Just that it's too big? I've never used it myself, but I've always been impressed with its appearance. :-) --Guido van Rossum (home page: http://www.python.org/~guido/)
On Thu, Nov 02, 2000 at 05:06:16AM -0500, Guido van Rossum wrote:
I just skimmed PEP 222. I agree that the classes defined by cgi.py are unnecessarily arcane. I wonder if it isn't better to just start over rather than trying to add yet another new class to the already top-heavy CGI module???
I've wondered about that, too; writing a neat request class that wraps up field values, cookies, and environment variables, and provides convenience functions for re-creating the current URL. Something like the request classes in Zope and Webware.
Regarding file uploads: you *seem* to be proposing that uploaded files should be loaded into memory only. I've got complaints from people
Mrr...? The only file upload reference simply says you shouldn't have to subclass in order to use them; it's not implying files have to be read into memory. (We have to deal with whackingly large mask layout files at work, after all.)
Regarding templating -- what's wrong with HTMLgen as a starting point? Just that it's too big? I've never used it myself, but I've always been impressed with its appearance. :-)
I, personally, am against including templating, but it was suggested. I'm against it because there are too many solutions with different tradeoffs. Do you want a simple regex search-and-replace, constructing HTML pages as Python objects, or a full-blown minilanguage? HTML/XML-compatible syntax, ASP-compatible syntax, Python-compatible syntax? Much better just to move templating into the "Rejected" category and give the above rationale. --amk
Regarding file uploads: you *seem* to be proposing that uploaded files should be loaded into memory only. I've got complaints from people
Mrr...? The only file upload reference simply says you shouldn't have to subclass in order to use them; it's not implying files have to be read into memory. (We have to deal with whackingly large mask layout files at work, after all.)
Mrrauw...? Do you really have to subclass? I thought it just says that you can subclass if you're not happy with the given make_file() implementation?
Regarding templating -- what's wrong with HTMLgen as a starting point? Just that it's too big? I've never used it myself, but I've always been impressed with its appearance. :-)
I, personally, am against including templating, but it was suggested. I'm against it because there are too many solutions with different tradeoffs. Do you want a simple regex search-and-replace, constructing HTML pages as Python objects, or a full-blown minilanguage? HTML/XML-compatible syntax, ASP-compatible syntax, Python-compatible syntax? Much better just to move templating into the "Rejected" category and give the above rationale.
Sure -- I'm perfectly happy with ad-hoc templating solutions myself (see my FAQ wizard). I've also heard Tom Christiansen complain that Perl is slower to start up than Python for CGI work -- because the templating classes are so big, and are all loaded at startup! I do see a use for a helper or helpers to create tables though -- tables are notoriously tag-intensive and hard to get right. --Guido van Rossum (home page: http://www.python.org/~guido/)
Since today has been busy on the meta-sig, I wonder if we should create a web-sig to thrash out these issues. Jeremy
On Thu, Nov 02, 2000 at 05:07:37PM -0500, Jeremy Hylton wrote:
Since today has been busy on the meta-sig, I wonder if we should create a web-sig to thrash out these issues.
There's already a python-web-modules@egroups.com list, for authors of Python Web frameworks; fairly on-topic for that list, I should think. --amk
On Thu, Nov 02, 2000 at 10:34:42AM -0500, Andrew Kuchling wrote:
* Work on something CPAN-like. This may or may not have repercussions for the core; I don't know.
Probably not, though perhaps a new module would be nice. As for the
CPAN-like thing, I really got a kick out of Greg S's WebDAV session on
Apachecon, and I think it would be suited extremely well as the transmission
protocol for SPAM (or however you want to call the Python CPAN ;). You can
do the uploading, downloading and searching for modules using WebDAV without
too much pain, and there's excellent WebDAV support for Apache ;)
Is anyone working on something like this, or even thinking about it ? I'm
not deep enough into distutils to join that SIG, but I definitely would join
a CPyAN SIG ;)
--
Thomas Wouters
[Andrew]
* Work on something CPAN-like. This may or may not have repercussions for the core; I don't know.
[Thomas]
Probably not, though perhaps a new module would be nice. As for the CPAN-like thing, I really got a kick out of Greg S's WebDAV session on Apachecon, and I think it would be suited extremely well as the transmission protocol for SPAM (or however you want to call the Python CPAN ;). You can do the uploading, downloading and searching for modules using WebDAV without too much pain, and there's excellent WebDAV support for Apache ;)
Is anyone working on something like this, or even thinking about it ? I'm not deep enough into distutils to join that SIG, but I definitely would join a CPyAN SIG ;)
This is a nice thing to have, but I don't see why it should be tied to the 2.1 effort. Let's not couple projects that can be carried out independently! --Guido van Rossum (home page: http://www.python.org/~guido/)
On Thu, 2 Nov 2000, Thomas Wouters wrote:
Is anyone working on something like this, or even thinking about it ? I'm not deep enough into distutils to join that SIG, but I definitely would join a CPyAN SIG ;)
Cries for this sig have already been made in c.l.py.
I'm moving this discussion to meta-sig. Please discuss it there.
I'm willing to champion it, but I'll defer if Andrew or Greg want
to do it.
--
Moshe Zadka
Andrew Kuchling wrote:
On Thu, Nov 02, 2000 at 05:29:55PM +0200, Moshe Zadka wrote:
Shouldn't we have a list of goals for [Python 2.1] or something? As a first-order approximation, what PEPs are expected to be included? And,
Stuff I personally want to get done: * Finish PEP 222, "Web Programming Improvements" and implement whatever emerges from it.
* Write a PEP on using Distutils to build the modules that come with Python, and implement it if accepted.
* Work on something CPAN-like. This may or may not have repercussions for the core; I don't know.
Most important for 2.1 are probably:

 1. new C level coercion scheme
 2. rich comparisons
 3. making the std lib Unicode compatible

-- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/
M.-A. Lemburg
Most important for 2.1 are probably:
 1. new C level coercion scheme
 2. rich comparisons
 3. making the std lib Unicode compatible
I'd certainly like to see rich comparisons go in. I have a "Set" class all ready for addition to the standard library except that it's waiting on this feature in order to do partial ordering properly. -- <a href="http://www.tuxedo.org/~esr/">Eric S. Raymond</a> Every election is a sort of advance auction sale of stolen goods. -- H.L. Mencken
Guido van Rossum wrote:
It may not look serious, but changing the Python lookup scheme is, since many inspection tools rely on and reimplement exactly that scheme. With nested scopes, there would be next to no way to emulate the lookups using these tools.
So fix the tools.
Eek. Are you proposing to break all the Python IDEs that are just appearing out there ?
To be honest, I don't think static nested scopes buy us all that much. You can do the same now, by using keyword arguments which isn't all that nice, but works great and makes the scope clearly visible.
Yes. It's a hack that gets employed over and over. And it has certain problems. We added 'import as' to get rid of a common practice that was perceived unclean. Maybe we should support nested scopes to get rid of another unclean common practice?
I think the common practice mainly comes from the fact that by making globals locals, which benefit from LOAD_FAST, you get a noticeable performance boost. So the "right" solution to these weird-looking hacks would be to come up with a smart way by which the Python compiler itself can do the localizing. Nested scopes won't help eliminate the current keyword practice.
I'm not saying that we definitely should add this to 2.1 (there's enough on our plate already) but we should at least consider it, and now that we have cycle GC, the major argument against it (that it causes cycles) is gone...
Hmm, so far the only argument for changing Python lookups was to allow writing lambdas without keyword hacks. Does this really warrant breaking code ? What other advantages would statically nested scopes have ? -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/
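The keyword-argument localization MAL describes, binding a global or builtin as a default argument so the function body uses a fast local access, looks like this (a sketch; the function and its names are mine):

```python
import math

def norms(points, sqrt=math.sqrt):  # sqrt is bound once at def time and is a local
    out = []
    for x, y in points:
        out.append(sqrt(x * x + y * y))  # no global/builtin lookup per iteration
    return out

print(norms([(3, 4), (5, 12)]))  # [5.0, 13.0]
```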
[MAL]
It may not look serious, but changing the Python lookup scheme is, since many inspection tools rely on and reimplement exactly that scheme. With nested scopes, there would be next to no way to emulate the lookups using these tools.
[GvR]
So fix the tools.
[MAL]
Eek. Are you proposing to break all the Python IDEs that are just appearing out there ?
Yes. If a tool does variable scope analysis, it should be prepared for changes in the rules. Otherwise we might as well have refused the syntax changes in 2.0 because they required changes to tools!
To be honest, I don't think static nested scopes buy us all that much. You can do the same now, by using keyword arguments which isn't all that nice, but works great and makes the scope clearly visible.
Yes. It's a hack that gets employed over and over. And it has certain problems. We added 'import as' to get rid of a common practice that was perceived unclean. Maybe we should support nested scopes to get rid of another unclean common practice?
I think the common practice mainly comes from the fact that by making globals locals, which benefit from LOAD_FAST, you get a noticeable performance boost.
So the "right" solution to these weird looking hacks would be to come up with a smart way by which the Python compiler itself can do the localizing.
Can you elaborate? I don't understand what you are proposing here.
Nested scopes won't help eliminating the current keyword practice.
Why not? I'd say that

    def create_adder(n):
        def adder(x, n=n):
            return x+n
        return adder

is a hack and that nested scopes can fix this by allowing you to write

    def create_adder(n):
        def adder(x):
            return x+n
        return adder

like one would expect. (Don't tell me that it isn't a FAQ why this doesn't work!)
I'm not saying that we definitely should add this to 2.1 (there's enough on our plate already) but we should at least consider it, and now that we have cycle GC, the major argument against it (that it causes cycles) is gone...
Hmm, so far the only argument for changing Python lookups was to allow writing lambdas without keyword hacks. Does this really warrant breaking code ?
What other advantages would statically nested scopes have ?
Doing what's proper. Nested scopes are not a bad idea. They weren't implemented because they were hard to get right (perhaps impossible without creating cycles), and I was okay with that because I didn't like them; but I've been convinced that examples like the second create_adder() above should really work. Just like float+int works (in Python 0.1, this was a type-error -- you had to cast the int arg to a float to get a float result). --Guido van Rossum (home page: http://www.python.org/~guido/)
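For the record, this is what eventually happened: PEP 227 made statically nested scopes available in Python 2.1 via `from __future__ import nested_scopes` and the default in 2.2, so the second create_adder() works as written:

```python
def create_adder(n):
    def adder(x):
        return x + n  # n is found in create_adder's scope; no n=n hack needed
    return adder

add5 = create_adder(5)
print(add5(3))  # 8
```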
Guido van Rossum wrote:
To be honest, I don't think static nested scopes buy us all that much. You can do the same now, by using keyword arguments which isn't all that nice, but works great and makes the scope clearly visible.
Yes. It's a hack that gets employed over and over. And it has certain problems. We added 'import as' to get rid of a common practice that was perceived unclean. Maybe we should support nested scopes to get rid of another unclean common practice?
I think the common practice mainly comes from the fact that by making globals locals, which benefit from LOAD_FAST, you get a noticeable performance boost.
So the "right" solution to these weird looking hacks would be to come up with a smart way by which the Python compiler itself can do the localizing.
Can you elaborate? I don't understand what you are proposing here.
See my other post... I would like to have the compiler do the localization for me in case it sees a global which has been "defined" static.
Nested scopes won't help eliminating the current keyword practice.
Why not? I'd say that
    def create_adder(n):
        def adder(x, n=n):
            return x+n
        return adder
is a hack and that nested scopes can fix this by allowing you to write
    def create_adder(n):
        def adder(x):
            return x+n
        return adder
like one would expect. (Don't tell me that it isn't a FAQ why this doesn't work!)
I know... still, I consider function definitions within a function bad style. Maybe just me, though ;-)
I'm not saying that we definitely should add this to 2.1 (there's enough on our plate already) but we should at least consider it, and now that we have cycle GC, the major argument against it (that it causes cycles) is gone...
Hmm, so far the only argument for changing Python lookups was to allow writing lambdas without keyword hacks. Does this really warrant breaking code ?
What other advantages would statically nested scopes have ?
Doing what's proper. Nested scopes are not a bad idea. They weren't implemented because they were hard to get right (perhaps impossible without creating cycles), and I was okay with that because I didn't like them; but I've been convinced that examples like the second create_adder() above should really work. Just like float+int works (in Python 0.1, this was a type-error -- you had to cast the int arg to a float to get a float result).
Ok, but how does nested scoping mix with class definitions and global lookups then ? -- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/
[MAL]
... Hmm, so far the only argument for changing Python lookups was to allow writing lambdas without keyword hacks. Does this really warrant breaking code ?
*Some* amount of code, sure. Hard to quantify, but hard to believe there's much code at risk.
What other advantages would statically nested scopes have ?
Pythonic obviousness. Virtually everyone coming to Python from other languages *expects* visually nested functions to work this way. That they don't today is a very frequent source of surprises and complaints.
Guido van Rossum
To be honest, I don't think static nested scopes buy us all that much. You can do the same now, by using keyword arguments which isn't all that nice, but works great and makes the scope clearly visible.
Yes. It's a hack that gets employed over and over. And it has certain problems. We added 'import as' to get rid of a common practice that was perceived unclean. Maybe we should support nested scopes to get rid of another unclean common practice?
I'm not saying that we definitely should add this to 2.1 (there's enough on our plate already) but we should at least consider it, and now that we have cycle GC, the major argument against it (that it causes cycles) is gone...
For whatever it's worth, I agree with both these arguments. -- <a href="http://www.tuxedo.org/~esr/">Eric S. Raymond</a> "The power to tax involves the power to destroy;...the power to destroy may defeat and render useless the power to create...." -- Chief Justice John Marshall, 1819.
"MAL" == M -A Lemburg
writes:
MAL> It may not look serious, but changing the Python lookup scheme
MAL> is, since many inspection tools rely on and reimplement exactly
MAL> that scheme. With nested scopes, there would be next to no way
MAL> to emulate the lookups using these tools.

Can you say more about this issue? It sounds like it is worth discussing in the PEP, but I can't get a handle on exactly what the problem is. Any tool needs to implement or model Python's name resolution algorithm, call it algorithm A. If we change name resolution to use algorithm B, then the tools need to implement or model a new algorithm. I don't see where the impossibility of emulation comes in. Jeremy
Jeremy Hylton wrote:
"MAL" == M -A Lemburg
writes: MAL> It may not look serious, but changing the Python lookup scheme MAL> is, since many inspection tools rely and reimplement exactly MAL> that scheme. With nested scopes, there would be next to no way MAL> to emulate the lookups using these tools.
Can you say more about this issue? It sounds like it is worth discussing in the PEP, but I can't get a handle on exactly what the problem is. Any tool needs to implement or model Python's name resolution algorithm, call it algorithm A. If we change name resolution to use algorithm B, then the tools need to implement or model a new algorithm. I don't see where the impossibility of emulation comes in.
Well first you'd have to change all tools to use the new scheme (this includes debuggers, inspection tools, reflection kits, etc.). This certainly is not a smart thing to do since Python IDEs are just starting to appear -- you wouldn't want to break all those.

What gets harder with the nested scheme is that you can no longer be certain that globals() reaches out to the module namespace. But this is needed by some lazy evaluation tools. Writing to globals() would not be defined anymore -- where should you bind the new variable?

Another problem is that there probably won't be a way to access all the different nesting levels on a per-level basis (could be that I'm missing something here, but debugging tools would need some sort of scope() builtin to access the different scopes). I'm not sure whether this is possible to do without some sort of link between the scopes. We currently don't need such links.

-- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/
Well first you'd have to change all tools to use the new scheme (this includes debuggers, inspection tools, reflection kits, etc.). This certainly is not a smart thing to do since Python IDEs are just starting to appear -- you wouldn't want to break all those.
I've seen a Komodo demo. Yes, it does this. But it's soooooooo far from being done that adding this wouldn't really slow them down much, I think. More likely, the toolmakers will have fun competing with each other to be the first to support this! :-)
What gets harder with the nested scheme is that you can no longer be certain that globals() reaches out to the module namespace. But this is needed by some lazy evaluation tools. Writing to globals() would not be defined anymore -- where should you bind the new variable?
I think globals() should return the module's __dict__, and the global statement should cause the variable to reach directly into there. We'll need to introduce a new builtin to do a name *lookup* in the nested scopes.
Another problem is that there probably won't be a way to access all the different nesting levels on a per-level basis (could be that I'm missing something here, but debugging tools would need some sort of scope() builtin to access the different scopes). I'm not sure whether this is possible to do without some sort of link between the scopes. We currently don't need such links.
Correct. That link may have to be added to the frame object. --Guido van Rossum (home page: http://www.python.org/~guido/)
MAL> [pre-PEP] This will break code...

Jeremy> How will this break code?

Suppose you have

    x = 1
    def f1():
        x = 2
        def inner():
            print x
        inner()

Today, calling f1() prints "1". After your proposed changes I suspect it would print "2".

Skip
Thanks. I expect there is very little code that depends on this sort of behavior, since it is confusing to read. Many readers, particularly novices, could reasonably expect Python to print 2 now. As I explained to MAL, I think we would need to provide a code analysis tool that identified these problems. It's probably helpful to generate a warning about this right now, since it's rather obfuscated.

Jeremy
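To make the behavioral change concrete, here is a hedged sketch of Skip's example rewritten with a return instead of a print, showing what the *proposed* rules yield (under the old rules the inner reference skips f1's x and finds the module-level binding, giving 1):

```python
x = 1

def f1():
    x = 2
    def inner():
        # With statically nested scopes, this x resolves to f1's
        # local x (the closest containing binding), not the global.
        return x
    return inner()
```

f1() evaluates to 2 under nested scopes; pre-change Python 2.0 would give 1.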
Jeremy Hylton
Seriously, I don't see how acquisition addresses the same issues at all.
My proposal for nested scopes was actually an acquisition-like mechanism. The idea was to avoid unbreakable cycles by deferring the creation of a closure from when the function is defined to when it is used. Guido rejected my implementation for various good reasons. It could be modified to overcome most of those objections, however. Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg@cosc.canterbury.ac.nz +--------------------------------------+
[Jeremy Hylton]
... Guido once explained that his original reservation about nested scopes was a reaction to their overuse in Pascal. In large Pascal programs he was familiar with, block structure was overused as an organizing principle for the program, leading to hard-to-read code.
Note that this problem will be much worse in Python: in Pascal, you could always "look up" for the closest-containing func/proc that explicitly declares a referenced vrbl. In Python, you have to indirectly *deduce* which vrbls are local to a def, by searching the entire body for an appearance as a binding target. So you have to "look up" and "look down" from the reference point, and it's easy to miss a binding target.

    i = 6
    def f(x):
        def g():
            print i
        # ...
        # skip to the next page
        # ...
        for i in x:  # ah, i *is* local to f, so this is what g sees
            pass
        g()
    def bank_account(initial_balance):
        balance = [initial_balance]
        def deposit(amount):
            balance[0] = balance[0] + amount
        def withdraw(amount):
            balance[0] = balance[0] - amount
        return deposit, withdraw
Unfortunately for proponents, this is exactly the kind of SICP example that is much better done via a class. Not only is the closure version strained by comparison, but as is usual it manages to create a bank account with a write-only balance <0.9 wink>.

    def deposit(amount):
        global bank_account.balance
        balance += amount

is one old suggested way to explicitly declare non-local names and the enclosing block to which they are local (and in analogy with current "global", mandatory if you want to rebind the non-local name, optional if you only want to reference it). There are subtleties here, but explicit is better than implicit, and the subtleties are only subtler if you refuse (like Scheme) to make the intent explicit.

for-real-fun-think-about-"exec"-abuses-ly y'rs - tim
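For comparison, a sketch of the class-based version Tim alludes to; unlike the closure version, the balance is an ordinary readable attribute rather than a write-only cell (class and method names are illustrative):

```python
class BankAccount:
    def __init__(self, initial_balance):
        self.balance = initial_balance

    def deposit(self, amount):
        self.balance = self.balance + amount

    def withdraw(self, amount):
        # Unlike the closure version, the balance stays readable
        # from outside as an ordinary attribute.
        self.balance = self.balance - amount
```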
[Jeremy Hylton]
... Guido once explained that his original reservation about nested scopes was a reaction to their overuse in Pascal. In large Pascal programs he was familiar with, block structure was overused as an organizing principle for the program, leading to hard-to-read code.
[Tim Peters]
Note that this problem will be much worse in Python: in Pascal, you could always "look up" for the closest-containing func/proc that explicitly declares a referenced vrbl. In Python, you have to indirectly *deduce* which vrbls are local to a def, by searching the entire body for an appearance as a binding target. So you have to "look up" and "look down" from the reference point, and it's easy to miss a binding target.
This is a tool problem, and should be solved with good tools.
Of course, installing the correct tools in people's minds will require
some technological discoveries.
--
Moshe Zadka
On Thu, 2 Nov 2000, Moshe Zadka wrote:
[Tim Peters]
Note that this problem will be much worse in Python: in Pascal, you could always "look up" for the closest-containing func/proc that explicitly declares a referenced vrbl. In Python, you have to indirectly *deduce* which vrbls are local to a def, by searching the entire body for an appearance as a binding target. So you have to "look up" and "look down" from the reference point, and it's easy to miss a binding target.
This is a tool problem, and should be solved with good tools. Of course, installing the correct tools in people's minds will require some technological discoveries.
Bleck. Those tools are a crutch to deal with a poor language design / feature. And are those tools portable? Are they part of everybody's standard tool set? Will vi, emacs, and MS DevStudio all have those capabilities? Not a chance. Personally, I'll take Guido's point of view and say they are inherently hard to deal with; therefore, punt them. Cheers, -g -- Greg Stein, http://www.lyra.org/
"GS" == Greg Stein
writes:
GS> On Thu, 2 Nov 2000, Moshe Zadka wrote:
This is a tool problem, and should be solved with good tools. Of course, installing the correct tools in people's minds will require some technological discoveries.
GS> Bleck. Those tools are a crutch to deal with a poor language GS> design / feature. And are those tools portable? Are they part of GS> everybody's standard tool set? Will vi, emacs, and MS DevStudio GS> all have those capabilities? Are you saying that compilers are a crutch and we should get rid of them? I don't think you intend that, but this is a completely straightforward tool to build. It is needed only for backwards compatibility -- to identify scripts that depend on the changed behavior. There is no need for vi, emacs, or devstudio to understand what's going on. Jeremy
On Thu, 2 Nov 2000, Jeremy Hylton wrote:
"GS" == Greg Stein
writes: GS> On Thu, 2 Nov 2000, Moshe Zadka wrote:
This is a tool problem, and should be solved with good tools. Of course, installing the correct tools in people's minds will require some technological discoveries.
GS> Bleck. Those tools are a crutch to deal with a poor language GS> design / feature. And are those tools portable? Are they part of GS> everybody's standard tool set? Will vi, emacs, and MS DevStudio GS> all have those capabilities?
Are you saying that compilers are a crutch and we should get rid of them? I don't think you intend that, but this is a completely straightforward tool to build. It is needed only for backwards compatibility -- to identify scripts that depend on the changed behavior. There is no need for vi, emacs, or devstudio to understand what's going on.
you guys are talking about different things.
Jeremy is talking about a tool to warn against incompatible changes
Greg is talking about a tool to identify, for each variable, what scope
it belongs to.
as-usual-the-answer-is-"you're-both-right"-ly y'rs, Z.
--
Moshe Zadka
"MZ" == Moshe Zadka
writes:
MZ> you guys are talking about different things. Jeremy is
MZ> talking about a tool to warn against incompatible changes. Greg
MZ> is talking about a tool to identify, for each variable, what
MZ> scope it belongs to.

And Greg's point is well taken, because it /will/ be harder to tell at a glance where a name is coming from, so programming tools will have to find ways to help with this.

-Barry
"MZ" == Moshe Zadka
writes:
MZ> you guys are talking about different things. Jeremy is talking
MZ> about a tool to warn against incompatible changes. Greg is
MZ> talking about a tool to identify, for each variable, what scope
MZ> it belongs to.

Not sure we're talking about different things. The compiler will need to determine the scope of each variable. It's a tool. If it implements the specification for name binding, other tools can too.

Jeremy
"TP" == Tim Peters
writes:
TP> [Jeremy Hylton]
... Guido once explained that his original reservation about nested scopes was a reaction to their overuse in Pascal. In large Pascal programs he was familiar with, block structure was overused as an organizing principle for the program, leading to hard-to-read code.
TP> Note that this problem will be much worse in Python: in Pascal,
TP> you could always "look up" for the closest-containing func/proc
TP> that explicitly declares a referenced vrbl. In Python, you have
TP> to indirectly *deduce* which vrbls are local to a def, by
TP> searching the entire body for an appearance as a binding target.
TP> So you have to "look up" and "look down" from the reference
TP> point, and it's easy to miss a binding target.

I agree that visual inspection is a tad harder, but I contend that existing programs that use the same name for a global variable and a local variable -- and intend for the global to be visible within a function nested in the local variable's region -- are confusing. It's too hard for a first-time reader of the code to figure out what is going on.

Incidentally, I have yet to see an example of this problem occurring in anyone's code. All the examples seem a bit contrived. I wonder if anyone has an example in existing code.

[My SICP example omitted]

TP> Unfortunately for proponents, this is exactly the kind of SICP
TP> example that is much better done via a class.

Indeed, the PEP says exactly that: This kind of program is better done via a class. My intent was not to show a compelling use of mutable state. Instead it was to show that with read-only access, people could still modify values on enclosing scopes. The issue is whether the language allows the programmer to express this intent clearly or if she has to jump through some hoops to accomplish it.

TP> Not only is the closure version strained by comparison, but as
TP> is usual it manages to create a bank account with a write-only
TP> balance <0.9 wink>.
TP> def deposit(amount):
TP>     global bank_account.balance
TP>     balance += amount

TP> is one old suggested way to explicitly declare non-local names
TP> and the enclosing block to which they are local (and in analogy
TP> with current "global", mandatory if you want to rebind the
TP> non-local name, optional if you only want to reference it).
TP> There are subtleties here, but explicit is better than implicit,
TP> and the subtleties are only subtler if you refuse (like Scheme)
TP> to make the intent explicit.

I'm still not sure I like it, because it mixes local variables of a function with attribute access on objects. I'll add it to the discussion in the PEP (if Barry approves the PEP <wink>), though.

Do you have any opinion on the subtleties? The two that immediately come to mind are: 1) whether the function's locals are available as attributes anywhere or only in nested scopes and 2) whether you can create new local variables using this notation.

Jeremy
[Jeremy and Tim argue about what to do about write access for variables at intermediate levels of nesting, neither local nor module-global.]

I'll risk doing a pronouncement, even though I know that Jeremy (and maybe also Tim?) disagree. You don't need "write access" (in the sense of being able to assign) for variables at the intermediate scopes, so there is no need for a syntax to express this. Assignments are to local variables (normally) or to module-globals (when 'global' is used). Name references search for a local, then for a local of the containing function definition, then for a local in its container, and so forth, until it hits the module globals, and then finally it looks for a builtin. We can argue over which part of this is done statically and which part is done dynamically: currently, locals are done dynamically and everything else is done statically.

--Guido van Rossum (home page: http://www.python.org/~guido/)
Don't know if you saw the discussion in the PEP or not. I made two arguments for being able to assign to variables bound in enclosing scopes.

1. Every other language that supports nested lexical scoping allows this. To the extent that programmers have seen these other languages, they will expect it to work.

2. It is possible to work around this limitation by using containers. If you want to have an integer that can be updated by nested functions, you wrap the integer in a list and make all assignments and references refer to list[0]. It would be unfortunate if programmers used this style, because it is obscure. I'd rather see the language provide a way to support this style of programming directly.

Jeremy
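A sketch of the container workaround described in point 2, assuming read-only access to enclosing scopes (the style Jeremy argues the language should make unnecessary; names are illustrative):

```python
def make_counter():
    # The integer is wrapped in a one-element list; nested functions
    # can then mutate count[0] without ever rebinding the name count.
    count = [0]
    def increment():
        count[0] = count[0] + 1
    def value():
        return count[0]
    return increment, value
```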
Don't know if you saw the discussion in the PEP or not.
Sorry, I had no time. I have read it now, but it doesn't change my point of view.
I made two arguments for being able to assign to variables bound in enclosing scopes.
1. Every other language that supports nested lexical scoping allows this. To the extent that programmers have seen these other languages, they will expect it to work.
But we have a unique way of declaring variables, which makes the issues different. Your PEP wonders why I am against allowing assignment to intermediate levels. Here's my answer: all the syntaxes that have been proposed to spell this have problems. So let's not provide a way to spell it. I predict that it won't be a problem. If it becomes a problem, we can add a way to spell it later. I expect that the mechanism that will be used to find variables at intermediate levels can also be used to set them, so it won't affect that part of the implementation much.
2. It is possible to work around this limitation by using containers. If you want to have an integer that can be updated by nested functions, you wrap the integer in a list and make all assignments and references refer to list[0]. It would be unfortunate if programmers used this style, because it is obscure. I'd rather see the language provide a way to support this style of programming directly.
I don't expect that programmers will use this style. When they have this need, they will more likely use a class. --Guido van Rossum (home page: http://www.python.org/~guido/)
[Guido]
[Jeremy and Tim argue about what to do about write access for variables at intermediate levels of nesting, neither local nor module-global.]
I'll risk doing a pronouncement, even though I know that Jeremy (and maybe also Tim?) disagree.
You don't need "write access" (in the sense of being able to assign) for variables at the intermediate scopes, so there is no need for a syntax to express this.
I can live with that! Reference-only access to intermediate scopes would address 99% of current gripes. Of course, future gripes will shift to that there's no rebinding access. If we have to support that someday too, my preferred way of spelling it in Python requires explicit new syntax, so adding that later would not break anything.
Assignments are to local variables (normally) or to module-globals (when 'global' is used). Name references search for a local, then for a local of the containing function definition, then for a local in its container,
The Pascal standard coined "closest-containing scope" to describe this succinctly, and I recommend it for clarity and brevity.
and so forth, until it hits the module globals, and then finally it looks for a builtin.
We can argue over which part of this is done statically and which part is done dynamically: currently, locals are done dynamically and everything else is done statically.
I'm not sure what you're trying to say there, but to the extent that I think I grasp it, I believe it's backwards: locals are static today but everything else is dynamic (and not in the sense of "dynamic scoping", but in the operational sense of "requires runtime search to resolve non-local names", while local names are fully resolved at compile-time today (but in the absence of "exec" and "import *")).

what's-in-a-name?-a-rose-in-any-other-scope-may-not-smell-as-sweet-ly y'rs - tim
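As a purely historical aside: rebinding access to intermediate scopes was indeed added much later via explicit new syntax, the nonlocal statement of PEP 3104 (Python 3). A sketch of the eventual spelling:

```python
def bank_account(initial_balance):
    balance = initial_balance
    def deposit(amount):
        # nonlocal rebinds the closest-containing balance directly,
        # making the rebinding intent explicit, as Tim argues for.
        nonlocal balance
        balance += amount
        return balance
    return deposit
```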
[Guido]
[Jeremy and Tim argue about what to do about write access for variables at intermediate levels of nesting, neither local nor module-global.]
I'll risk doing a pronouncement, even though I know that Jeremy (and maybe also Tim?) disagree.
You don't need "write access" (in the sense of being able to assign) for variables at the intermediate scopes, so there is no need for a syntax to express this.
[Tim]
I can live with that! Reference-only access to intermediate scopes would address 99% of current gripes. Of course, future gripes will shift to that there's no rebinding access. If we have to support that someday too, my preferred way of spelling it in Python requires explicit new syntax, so adding that later would not break anything.
Exactly my point.
Assignments are to local variables (normally) or to module-globals (when 'global' is used). Name references search for a local, then for a local of the containing function definition, then for a local in its container,
The Pascal standard coined "closest-containing scope" to describe this succinctly, and I recommend it for clarity and brevity.
Thanks, that's a good term. And the principle is totally uncontroversial.
and so forth, until it hits the module globals, and then finally it looks for a builtin.
We can argue over which part of this is done statically and which part is done dynamically: currently, locals are done dynamically and everything else is done statically.
I'm not sure what you're trying to say there, but to the extent that I think I grasp it, I believe it's backwards: locals are static today but everything else is dynamic (and not in the sense of "dynamic scoping", but in the operational sense of "requires runtime search to resolve non-local names", while local names are fully resolved at compile-time today (but in the absence of "exec" and "import *")).
Oops, yes, I had it backwards. As I said elsewhere, in Python 3000 I'd like to do it all more statically. So perhaps we should look up nested locals based on static information too. Thus:

    x = "global-x"
    def foo():
        if 0:
            x = "x-in-foo"
        def bar():
            return x
        return bar
    print foo()()

should raise UnboundLocalError, not print "global-x".

--Guido van Rossum (home page: http://www.python.org/~guido/)
[Jeremy]
I agree that visual inspection is a tad harder, but I contend that existing programs that use the same name for a global variable and a local variable -- and intend for the global to be visible within a function nested in the local variable's region -- are confusing. It's too hard for a first-time reader of the code to figure out what is going on.
Incidentally, I have yet to see an example of this problem occurring in anyone's code. All the examples seem a bit contrived. I wonder if anyone has an example in existing code.
I wasn't the one making the "will break code" argument (I'm sure it will, but very little, and not at all for most people since most people never nest functions in Python today apart from default-abusing lambdas). Visual inspection stands on its own as a potential problem.
[My SICP example omitted]
TP> Unfortunately for proponents, this is exactly the kind of SICP TP> example that is much better done via a class.
Indeed, the PEP says exactly that: This kind of program is better done via a class. My intent was not to show a compelling use of mutable state. Instead it was to show that with read-only access, people could still modify values on enclosing scopes. The issue is whether the language allows the programmer to express this intent clearly or if she has to jump through some hoops to accomplish it.
Guido said "jump through some hoops", so I'm dropping it, but first noting that "the container" in *idiomatic* Python will most often be "self":

    def yadda(self, ...):
        def whatever(amount):
            self.balance += amount
        return whatever

will work to rebind self.balance. I don't think Guido will ever be interested in supporting idiomatic Scheme.
TP> def deposit(amount):
TP>     global bank_account.balance
TP>     balance += amount
I'm still not sure I like it, because it mixes local variables of a function with attribute access on objects. I'll add it to the discussion in the PEP (if Barry approves the PEP <wink>), though.
Actually, no connection to attribute access was intended there: it was just a backward-compatible way to spell the pair (name of containing scope, name of vrbl in that scope).

    global bank_account:balance

or

    global balance from bank_account

would do as well (and maybe better as they don't imply attribute access; but maybe worse as they don't bring to mind attribute access <wink>).
Do you have any opinion on the subtleties? The two that immediately come to mind are: 1) whether the function's locals are available as attributes anywhere or only in nested scopes
I didn't intend attribute-like access at all (although we had earlier talked about that wrt JavaScript, I didn't have that in mind here).
and 2) whether you can create new local variables using this notation.
Yes, that's the primary one I was thinking about.
"Tim" == Tim Peters
writes:
Tim> [Jeremy]
>> All the examples seem a bit contrived. I wonder if anyone has an
>> example in existing code.

Tim> I wasn't the one making the "will break code" argument ...

Nor was I. Jeremy (I think) asked MAL how it could break code. I posted a (simple, but obviously contrived) example. I have no particular opinion on this subject. I was just trying to answer Jeremy's question.

Skip
Looks like we need to rehash this thread at least enough to determine who is responsible for causing us to rehash it. MAL said it would break code. I asked how. Skip and Tim obliged with examples. I said their examples exhibited bad style; neither of them claimed they were good style. In the end, I observed that while it could break code in theory, I doubted it really would break much code. Furthermore, I believe that the code it will break is already obscure so we needn't worry about it. Jeremy
[Jeremy]
Looks like we need to rehash this thread at least enough to determine who is responsible for causing us to rehash it.
MAL said it would break code. I asked how. Skip and Tim obliged with examples. I said their examples exhibited bad style; neither of them claimed they were good style.
In the end, I observed that while it could break code in theory, I doubted it really would break much code. Furthermore, I believe that the code it will break is already obscure so we needn't worry about it.
That's quite enough rehashing. I don't think we'll have to worry about breaking much code.

--Guido van Rossum (home page: http://www.python.org/~guido/)
[Jeremy]
Looks like we need to rehash this thread at least enough to determine who is responsible for causing us to rehash it.
MAL said it would break code. I asked how. Skip and Tim obliged with examples. I said their examples exhibited bad style; neither of them claimed they were good style.
Ah! I wasn't trying to give an example of code that would break, but, ya, now that you mention it, it would. I was just giving an example of why visual inspection will be harder in Python than in Pascal. I expect that the kind of code I showed *will* be common: putting all the nested "helper functions" at the top of a function, just as was also done in Pascal. The killer difference is that in Pascal, the containing function's locals referenced by the nested helpers can be found declared at the top of the containing function; but in Python you'll have to search all over the place, and if you don't find a non-local var at once, you'll be left uneasy, wondering whether you missed a binding target, or whether the var is truly global, or what.
Jeremy Hylton wrote:
Looks like we need to rehash this thread at least enough to determine who is responsible for causing us to rehash it.
MAL said it would break code. I asked how. Skip and Tim obliged with examples. I said their examples exhibited bad style; neither of them claimed they were good style.
In the end, I observed that while it could break code in theory, I doubted it really would break much code. Furthermore, I believe that the code it will break is already obscure so we needn't worry about it.
That's just what I was trying to say all along: statically nested scopes don't buy you anything except maybe for lambdas and nested functions (which is bad style programming, IMHO too). The only true argument for changing scoping I see is that of gained purity in language design... without much practical use.

Other issues that need sorting out:

    x = 2
    class C:
        x = 1
        C = 'some string'
        def a(self):
            print x
        def b(self):
            global x
            x = 3

    class D(C):
        C = 'some string'
        def a(self):
            C.a(self)
            print C

    o = C()
    o.a()
    o.b()
    o.a()

    o = D()
    o.a()

What would the output look like under your proposal?

-- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/
"M.-A. Lemburg" wrote:
Jeremy Hylton wrote:
Looks like we need to rehash this thread at least enough to determine who is responsible for causing us to rehash it.
MAL said it would break code. I asked how. Skip and Tim obliged with examples. I said their examples exhibited bad style; neither of them claimed they were good style.
In the end, I observed that while it could break code in theory, I doubted it really would break much code. Furthermore, I believe that the code it will break is already obscure so we needn't worry about it.
That's just what I was trying to say all along: statically nested scopes don't buy you anything except maybe for lambdas and nested functions (which is bad style programming, IMHO too).
The only true argument for changing scoping I see is that of gained purity in language design... without much practical use.
Other issues that need sorting out:
    x = 2
    class C:
        x = 1
        C = 'some string'
        def a(self):
            print x
        def b(self):
            global x
            x = 3

    class D(C):
        C = 'some string'
        def a(self):
            C.a(self)
            print C

    o = C()
    o.a()
    o.b()
    o.a()

    o = D()
    o.a()

What would the output look like under your proposal?
[Moshe pointed out to me in private mail that the above would continue to work as it does now due to a difference being made between class and function scoping categories]

More questions: How are you going to explain the different scoping categories to a newbie? What if you define a class within a method? How can you explicitly attach a dynamically defined class to a certain scope?

More problems (?!): Nested scopes will introduce cycles in all frame objects. This means that with GC turned off, frame objects will live forever -- Python will eat up memory at a very fast pace. BTW, Python's GC only works for a few builtin types (frames are not among the supported types): what if a frame finds its way into a user defined type? Will GC still be able to clean up the cycles?

Perhaps I'm just being silly, but I still don't see the benefits of breaking today's easy-to-grasp three-level scoping rules...

-- Marc-Andre Lemburg ______________________________________________________________________ Business: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/
"M.-A. Lemburg"
Nested scopes will introduce cycles in all frame objects.
It doesn't have to be that way. A static link is only needed if a function actually uses any variables from an outer scope. In the majority of cases, it won't. And it's possible to do even better than that. You can separate out variables referred to in an inner scope and store them separately from the rest of the frame, so you only keep what's really needed alive.
This means that with GC turned off, frame objects will live forever
Don't allow GC to be turned off, then! (Presumably this feature would only be considered once GC has become a permanent feature of Python.)
BTW, Python's GC only works for a few builtin types (frames are not among the supported types)
But they could become so if necessary, surely? Greg Ewing, Computer Science Dept, +--------------------------------------+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | greg@cosc.canterbury.ac.nz +--------------------------------------+
[MAL]
That's just what I was trying to say all along: statically nested scopes don't buy you anything except maybe for lambdas and nested functions
That's a tautology -- that's what nested scopes are FOR!
(which is bad style programming, IMHO too).
Not always. The keyword argument hack is so common that it must serve a purpose, and that's what we're trying to fix -- for lambdas *and* nested functions (which are semantically equivalent anyway).
The only true argument for changing scoping I see is that of gained purity in language design... without much practical use.
But there is practical use: get rid of the unintuitive, unobvious and fragile keyword argument hack.
Other issues that need sorting out:
x = 2

class C:
    x = 1
    C = 'some string'
    def a(self):
        print x
    def b(self):
        global x
        x = 3
class D(C):
    C = 'some string'
    def a(self):
        C.a(self)
        print C
o = C()
o.a()
o.b()
o.a()
o = D()
o.a()
What would the output look like under your proposal?
This is a good point! If we considered the class as a nested scope here, I think it might break too much code, plus it would allow a new coding style where you could reference class variables without a self or <classname> prefix. I don't like that prospect, so I'm in favor of ruling this out. --Guido van Rossum (home page: http://www.python.org/~guido/)
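[Editorial note: this is in fact how the rule came out -- class bodies were excluded from the nested-scope chain, so a bare name inside a method skips the class namespace entirely. A demonstration in modern Python:]

```python
x = 2

class C:
    x = 1          # class attribute; NOT visible as a bare name in methods
    def a(self):
        return x   # resolves to the module-level x, skipping C's namespace

print(C().a())  # 2, not 1
print(C.x)      # 1 -- the class attribute is still reachable via the prefix
```

Class variables must be reached through `self` or the class name, which is exactly the coding style Guido wanted to preserve.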
If we get lexical scoping, there should be a fast (built-in) way to get at all the accessible names from Python. I.e. currently I can do

    d = globals().copy()
    d.update(locals())

and know that `d' contains a dictionary of available names, with the right overloading semantics. (PEP 42 now includes a feature request to make vars() do this by default.)

-Barry
If we get lexical scoping, there should be a fast (built-in) way to get at all the accessible names from Python. I.e. currently I can do
d = globals().copy()
d.update(locals())
and know that `d' contains a dictionary of available names, with the right overloading semantics. (PEP 42 now includes a feature request to make vars() do this by default.)
Note that I just deleted that feature request from PEP 42 -- vars() or locals() returns the dictionary containing the variables, and you can't just change the semantics to return a newly copied dictionary (which could be quite expensive too!). I don't think you need to have a mechanism to find all accessible names; I don't see a common use for that. It's sufficient to have a mechanism to look up any specific name according to whatever mechanism we decide upon. This is needed for internal use of course; it can also be useful for e.g. variable substitution mechanisms like the one you recently proposed or Ping's Itmpl. --Guido van Rossum (home page: http://www.python.org/~guido/)
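[Editorial note: Guido's point that the returned dictionary is the namespace itself, not a copy, is easy to demonstrate at module level, where globals() hands back the live module dict:]

```python
g = globals()
g['spam'] = 42       # writing through the returned mapping...
print(spam)          # ...creates a real module-level name: prints 42
del g['spam']        # and deleting the key removes the binding again
```

This is why vars()/locals() can't silently be changed to return a fresh copy: code that mutates the result, and the cost of copying, both matter.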
"GvR" == Guido van Rossum
writes:
>> If we get lexical scoping, there should be a fast (built-in)
>> way to get at all the accessible names from Python.
>> I.e. currently I can do
>>     d = globals().copy()
>>     d.update(locals())
>> and know that `d' contains a dictionary of available names,
>> with the right overloading semantics. (PEP 42 now includes a
>> feature request to make vars() do this by default.)

GvR> Note that I just deleted that feature request from PEP 42 --
GvR> vars() or locals() returns the dictionary containing the
GvR> variables, and you can't just change the semantics to return
GvR> a newly copied dictionary (which could be quite expensive
GvR> too!).

Saw that. I was just thinking that locals() already does what vars()-no-args does, so why have two ways to do the same thing?

GvR> I don't think you need to have a mechanism to find all
GvR> accessible names; I don't see a common use for that. It's
GvR> sufficient to have a mechanism to look up any specific name
GvR> according to whatever mechanism we decide upon. This is
GvR> needed for internal use of course; it can also be useful for
GvR> e.g. variable substitution mechanisms like the one you
GvR> recently proposed or Ping's Itmpl.
Ah, something like this then:

-------------------- snip snip --------------------
import sys
from UserDict import UserDict

class NamesDict(UserDict):
    def __init__(self, frame):
        self.__frame = frame
        UserDict.__init__(self)

    def __getitem__(self, key):
        if self.data.has_key(key):
            return self.data[key]
        locals = self.__frame.f_locals
        if locals.has_key(key):
            return locals[key]
        globals = self.__frame.f_globals
        if globals.has_key(key):
            return globals[key]
        raise KeyError, key

def _(s):
    try:
        raise 'oops'
    except:
        frame = sys.exc_info()[2].tb_frame.f_back
    return s % NamesDict(frame)

theirs = 'theirs'

def give(mine, yours):
    print _('mine=%(mine)s, yours=%(yours)s, theirs=%(theirs)s')
-------------------- snip snip --------------------

Python 2.0 (#128, Oct 18 2000, 04:48:44)
[GCC egcs-2.91.66 19990314/Linux (egcs-1.1.2 release)] on linux2
Type "copyright", "credits" or "license" for more information.
>>> import dict
>>> dict.give('mine', 'yours')
mine=mine, yours=yours, theirs=theirs
-Barry
[Barry]
Saw that. I was just thinking that locals() already does what vars()-no-args does, so why have two ways to do the same thing?
Not clear -- maybe one of them needs to be made obsolete.
GvR> I don't think you need to have a mechanism to find all
GvR> accessible names; I don't see a common use for that. It's
GvR> sufficient to have a mechanism to look up any specific name
GvR> according to whatever mechanism we decide upon. This is
GvR> needed for internal use of course; it can also be useful for
GvR> e.g. variable substitution mechanisms like the one you
GvR> recently proposed or Ping's Itmpl.
Ah, something like this then:
[Example deleted] I'm not sure that the mechanism provided should follow the mapping API. *All* it needs to do is provide lookup capability. Having .keys(), .items(), .has_key() etc. just slows it down. --Guido van Rossum (home page: http://www.python.org/~guido/)
[Barry A. Warsaw]
If we get lexical scoping, there should be a fast (built-in) way to get at all the accessible names from Python. I.e. currently I can do
d = globals().copy() d.update(locals())
and know that `d' contains a dictionary of available names, with the right overloading semantics.
It was long ago agreed (don't you love how I pull this stuff out of thin historical air <wink>?) that if nested lexical scoping was added, we would need also to supply a new mapping object that mimicked the full Python lookup rules (including builtins). Not necessarily a dictionary, though.
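[Editorial note: the mapping Tim describes can be approximated in today's Python with collections.ChainMap, which chains several namespaces and looks keys up in order. This is a sketch of the idea, not the mechanism that was eventually built, and it does not cover the nested function scopes themselves:]

```python
import builtins
from collections import ChainMap

x = 'global-x'

def full_lookup(local_ns, global_ns):
    # locals shadow globals, which shadow builtins -- the standard
    # name lookup order, mimicked as a single read-only-ish mapping
    return ChainMap(local_ns, global_ns, vars(builtins))

def demo():
    y = 'local-y'
    d = full_lookup(locals(), globals())
    return d['y'], d['x'], d['len'] is len

print(demo())  # ('local-y', 'global-x', True)
```

As Tim notes, such an object need not be a dictionary; ChainMap, for instance, performs no copying and simply consults each namespace in turn.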
"TP" == Tim Peters
writes:
TP> It was long ago agreed (don't you love how I pull this stuff
TP> out of thin historical air <wink>?) that if nested lexical
TP> scoping was added, we would need also to supply a new mapping
TP> object that mimicked the full Python lookup rules (including
TP> builtins). Not necessarily a dictionary, though.

Oh yeah, I vaguely remember now <0.5 scratch-head>. Works for me, although I'll point out that we needn't wait for lexical scoping to provide such an object!

-Barry
participants (15)
- Andrew Kuchling
- barry@wooz.org
- Eric S. Raymond
- Fredrik Lundh
- Gordon McMillan
- Greg Ewing
- Greg Stein
- Guido van Rossum
- Jeremy Hylton
- Ken Manheimer
- M.-A. Lemburg
- Moshe Zadka
- Skip Montanaro
- Thomas Wouters
- Tim Peters