
On Wed, 22 Jun 2022 at 08:54, David Mertz, Ph.D. <david.mertz@gmail.com> wrote:
> I haven't gotten to writing that into the PEP yet, but I think the rule has to be to take the scope of evaluation not the scope of definition. I know it needs to be there, and I'm just thinking about helpful examples.
>
> Which is to say the semantics are more like `eval()` than like a lambda closure.
>
> ... and I know this is going to raise the neck hairs of many folks, because effectively I'm proposing a kind of dynamic scoping. Possibly in my defense, I think Carl's PEP 690 can do the same thing. :-)
Okay. So, in order to make your generic deferreds do the same job as late-bound argument defaults, you need:

1) All deferreds to be evaluated in the scope of evaluation, not the scope of definition. This goes completely against all other Python scoping rules.

2) Deferreds used in argument defaults to be re-deferred every time the function is called, to avoid coalescing to the exact same object every time.

3) And then you still have to do "n=n" at the start of the function to un-defer it.

Point #1 is majorly problematic, because *every function* will have to be deoptimized the way that a star import would deoptimize a Python 2 function. Consider:

def f(x):
    from sys import *
    print(x)
    x += y

This doesn't work in Python 3, but in Python 2, it deoptimizes the function and stops it from knowing what *any* name means. In fact, in a closure, this is actually disallowed, because it's impossible to know which variables from the outer function would need to be retained in the closure. Making deferreds work the way you say would have this effect on *every function*. Consider:

y = 1

def f(x):
    print("x is", x)
    print("y is", y)

def g():
    f(later y:=2)

What is the scope of y when it's referenced inside x? If a deferred object retains the scope of its definition, then f is a simple function that behaves sanely (it's possible for arbitrary code to be executed when you refer to a simple name, but that's basically just extending the concept of @property to all namespaces); y will be the global, and upon evaluating x, the local inside g would be reassigned.

But if a deferred object is evaluated in the scope where it's referenced, f has to reassign y. That means f has to be compiled to be compatible with y being reassigned, simply because x could be a deferred object. EVERY function has to assume that EVERY name could be rebound in this way. It becomes impossible to statically analyze anything, even to know what kind of name something is. Is that what you actually want?

Point #2 is also problematic. An early-bound default is evaluated just once, at definition time, not on every call; and a deferred object should likewise be evaluated only once, which means that this will print "Hello" only once:

x = later print("Hello")
x
x

But that means that, unless special magic is done, this will only ever generate a single list:

def f(x=later []):
    x.append(1)
    return x

The deferred object will be undeferred into a single list, and every subsequent call will reuse that same object. So you'd need some way to signal to the compiler that you want this particular deferred object to be re-deferred on every call, but un-deferred within the function. In fact, what you really want is for the deferred object to be created as the function begins, NOT as the function is defined.

In other words, to get PEP 671 semantics with deferred objects, you basically need PEP 671 semantics, plus the unnecessary overhead of packaging it all up and then requiring that the programmer force evaluation at the start of the function.

ChrisA
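
P.S. The closure-versus-eval() distinction David draws above can be seen with nothing but current Python, no deferred objects involved. This is only a rough sketch of the two scoping behaviours: a lambda resolves names in the scope where it was defined, while eval() resolves them in the scope where it is actually called.

y = 1

def g():
    y = 2
    # The lambda is a closure: it captures the scope of *definition* (g's local y).
    # The string, handed to eval() later, is resolved wherever eval() runs.
    return (lambda: y), "y"

closure, source = g()
print(closure())      # 2 -- scope of definition: g's local y
print(eval(source))   # 1 -- scope of evaluation: module scope, where eval() runs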
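
And a sketch of the evaluate-once problem behind point #2, again in plain current Python: today's early-bound defaults already show the single-shared-object behaviour that a never-re-deferred default would reproduce, and re-evaluating the default expression on each call (which is what PEP 671's late-bound defaults do) is the fix; here it's spelled with the usual None idiom.

def f(x=[]):          # the [] is built once, when the def statement runs
    x.append(1)
    return x

print(f())   # [1]
print(f())   # [1, 1] -- the same list object is reused on every call

# The behaviour a late-bound default is meant to give instead:
# a fresh default computed at call time.
def g(x=None):
    if x is None:
        x = []
    x.append(1)
    return x

print(g())   # [1]
print(g())   # [1]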