On Sat, Dec 04, 2021 at 10:50:14PM +1100, Chris Angelico wrote:
> > syntactic sugar for this:
> > def f(b, x=lambda b: a+b): ...
> > except that the lambda has the LB flag set.
> Okay. So the references to 'a' and 'b' here are one more level of function inside the actual function we're defining, which means you're paying the price of nonlocals just to be able to late-evaluate defaults. Not a deal-breaker, but that is a notable cost (every reference to them inside the function will be slower).
How much slower? By my tests:

- access to globals is 25% more expensive than access to locals;
- access to globals is 19% more expensive than nonlocals;
- and nonlocals are 6% more expensive than locals.

Or if you do the calculation the other way (the percentages don't match because the denominators are different):

- locals are 20% faster than globals, and 5% faster than nonlocals;
- nonlocals are 16% faster than globals.

Premature optimization is the root of all evil. We would be better off spending effort making nonlocals faster for everyone than throwing out desirable features and a cleaner design just to save 5% on a microbenchmark.

[...]
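If you want to reproduce that kind of comparison yourself, here is a minimal sketch using the standard timeit module. (This isn't the exact benchmark I ran; the names and loop counts are only illustrative, and the numbers will vary with your machine and Python version.)

from timeit import timeit

g = 1  # module-level name, looked up with LOAD_GLOBAL

def use_global():
    for _ in range(1000):
        y = g

def use_local():
    x = 1  # ordinary fast local, looked up with LOAD_FAST
    for _ in range(1000):
        y = x

def make_nonlocal():
    x = 1  # becomes a cell variable because inner() closes over it
    def inner():
        for _ in range(1000):
            y = x  # free variable, looked up with LOAD_DEREF
    return inner

use_nonlocal = make_nonlocal()

for name, func in [("local", use_local),
                   ("nonlocal", use_nonlocal),
                   ("global", use_global)]:
    print(name, timeit(func, number=10_000))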
> What this does mean, though, is that there are "magic objects" that cannot be used like other objects.
NotImplemented says hello :-)

You are correct that one cannot use a LB function as a standard, early bound default without triggering the "evaluate this at call time" behaviour. If we're happy with this behaviour, it would need to be documented for people to ignore *wink*

There's precedent though. You cannot overload an operator method to return NotImplemented without triggering the special "your object doesn't support this operator" behaviour.

And there are two obvious workarounds:

1. Just pass the LB function in as an explicit argument. The trigger only operates when looking up a default, not on every access to a function.

2. Or you can wrap the LB function you actually want to be the default in a late-bound expression that returns that function.

And if you still think that we should care, we can come up with a more complex trigger condition:

- the parameter was flagged as using a late-default;
- AND the default is a LB function.

Problem solved. Now you can use LB functions as early-bound defaults, and all it costs is to record and check a flag for each parameter. Is it worth it? Dunno.

[...]
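To make the analogy concrete, here is the existing NotImplemented trigger in action (the class is made up for illustration). Returning that particular object from an operator method is not treated as an ordinary return value; it is the signal that makes Python try the reflected method and then raise TypeError:

class Weird:
    def __add__(self, other):
        # Returning the NotImplemented sentinel tells the interpreter
        # "this operand doesn't support the operation", rather than
        # handing NotImplemented back to the caller.
        return NotImplemented

try:
    Weird() + Weird()
except TypeError as err:
    print(err)  # unsupported operand type(s) for +: 'Weird' and 'Weird'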
> > The default expression is just a function (with the new LB flag set). So we can inspect its name, its arguments, its cell variables, etc:
> > >>> default_expression.__closure__
> > (<cell at 0x7fc945de74f0: int object at 0x7fc94614c0f0>,)
> > We can do anything that we could do with any other function object.
> Yup. As long as it doesn't include any assignment expressions, or anything else that would behave differently.
I don't get what you mean here. Functions with the walrus operator are still just functions that we can introspect:
>>> f = lambda a, b: (len(w:=str(a))+b)*w
>>> f('spam', 2)
'spamspamspamspamspamspam'
>>> f.__code__
<code object <lambda> at 0x7fc945e07c00, file "<stdin>", line 1>
What sort of "behave differently" do you think would prevent us from introspecting the function object? "Differently" from what?
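For instance, following on from the session above (exact outputs such as the disassembly will of course differ between CPython versions):

import dis

f = lambda a, b: (len(w := str(a)) + b) * w

# The walrus target 'w' is just another local variable in the lambda's
# code object; nothing about the function becomes opaque to inspection.
print(f.__code__.co_varnames)  # ('a', 'b', 'w')
print(f.__code__.co_argcount)  # 2
dis.dis(f)                     # full bytecode, assignment expression and all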
> Great. So now we have some magnificently magical behaviour in the language, which will have some nice sharp edge cases, but which nobody will ever notice. Totally. I'm sure.
NotImplemented. Document it and move on. There are two work-arounds for those who care. And if you still think it matters, you can record a flag for each parameter indicating whether it actually used a late-bound default or not.
> Plus, we pay a performance price in any function that makes use of argument references, not just for the late-bound default, but in the rest of the code.
Using a late-bound default doesn't turn every local variable in your function into a cell variable. For any function that does a meaningful amount of work, the cost of making one or two parameters into cell variables instead of local variables is negligible.

At worst, if you do *no other work at all*, it's a cost of about 5% on two-fifths of bugger-all. But if your function does a lot of real work, the difference between using cell variables instead of locals is going to be insignificant compared to ~~the power of the Force~~ the rest of the work done in the function.

And if you have some unbelievably critical function that you need to optimize up the wahzoo?

def func(a, b=None):
    if b is None:
        # Look ma, no cell variables!
        b = expression

Python trades off convenience for speed and safety all the time. This will just be another such example. You want the convenience of a late-bound default? Use this feature. You want it to be 3ns faster? Use the old "if arg is None" idiom. Or write your code in C, and make it 5000000000ns faster.
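And if you want to see exactly which names pay the cell-variable price, the code object records them. A small sketch using today's closures, which is essentially the mechanism the lambda-based model above relies on (the function here is made up for illustration):

def outer(a, b, n=10):
    # Only 'b' is referenced by the nested function, so only 'b'
    # becomes a cell variable; 'a', 'n' and 'total' stay ordinary
    # fast locals.
    def helper():
        return b * 2
    total = a + n
    return total, helper

print(outer.__code__.co_cellvars)  # ('b',)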
> We also need to have these special functions that get stored as separate code objects.
That's not a cost, that's a feature.

Seriously. We're doing that so that we can introspect them individually, not just as the source string, but as actual callable objects that can be:

- introspected;
- tested;
- monkey-patched and modified in place (to the degree that any function can be modified, which is not a lot);
- copied or replaced with a new function.

Testing is probably the big one. Test frameworks will soon develop a way to let you write tests to confirm that your late bound defaults do what you expect them to do. That's trivial for `arg=[]` expressions, but for complex expressions in complex functions, being able to isolate them for testing is a big plus.

--
Steve