On Fri, Dec 3, 2021 at 7:54 AM Eric V. Smith <eric@trueblade.com> wrote:
On 12/2/2021 2:21 PM, Brendan Barnwell wrote:
On 2021-12-02 01:35, Steven D'Aprano wrote:
> 4) If "no" to question 1, is there some other spelling or other small change that WOULD mean you would use it? (Some examples in the PEP.)
No. As I mentioned in the earlier thread, I don't support any proposal in which an argument can "have a default" but that default is not a first-class Python object of some sort.

I don't understand this criticism.
Of course the default value will be a first-class Python object of some sort. *Every* value in Python is a first-class object. There are no machine values or unboxed values, and this proposal will not change that.

All that this proposal changes is *when* and *how often* the default will be evaluated, not the nature of the value.
As has happened often in these threads, it seems different people mean different things by "default value".
What you are calling "the default value" is "a thing that is used at call time if no value is passed for the argument". What I am calling "the default value" is "a thing that is noted at definition time to be used later if no value is passed for the argument".
What I'm saying is that I want that "thing" to exist. At the time the function is defined, I want there to be a Python object which represents the behavior to be activated at call time if the argument is not passed. In the current proposal there is no such "thing". The function just has behavior melded with its body that does stuff, but there is no addressable "thing" where you can say "if you call the function and the argument isn't passed we are going to take this default-object-whatchamacallit and 'use' it (in some defined way) to get the default value". This is what we already have for early-bound defaults in the function's `__defaults__` attribute.
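For concreteness, this is roughly what "addressable thing" means for early-bound defaults today (a minimal sketch; `g` is a hypothetical function for illustration):

```python
# Early-bound defaults are first-class, addressable objects: they are
# stored in the function's __defaults__ tuple at definition time.
def g(a, b=10, c="hi"):
    return a, b, c

print(g.__defaults__)   # (10, 'hi')
```

Under PEP 671 as proposed, a late-bound default has no analogous entry anywhere; it exists only as compiled code in the function body.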
I also have this objection to the proposal (among other concerns).
Say I have a function with an early-bound default. I can inspect it and I can change it. One reason to inspect it is so that I can call the function with its default values. This is a form of wrapping the function. I realize "just don't pass that argument when you call the function" will be the response, but I think in good faith you'd have to admit this is more difficult than just passing some default value to a function call.
1) I want to call this function.
2) I may want to not pass this argument.
3) Ah, perfect! I will pass this argument with a value of somemod._SENTINEL.

Or alternatively:

1) I want to call this function.
2) Prepare a dictionary of arguments. Leave out what I don't want.
3) If I want to pass this argument, add it to the dictionary.

This way doesn't require reaching into the function's private information to use a sentinel. Yes, it may be a tad more difficult (though not VERY much), but you're also avoiding binding yourself to what might be an implementation detail.
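The "dictionary of arguments" alternative can be sketched like this (the `connect` function and `want_timeout` flag are hypothetical names for illustration):

```python
# Build a kwargs dict, adding only the arguments we actually want to
# pass; omitted arguments get the callee's default, whatever it is.
def connect(host, timeout=30):
    return (host, timeout)

kwargs = {}
want_timeout = False
if want_timeout:
    kwargs["timeout"] = 10

print(connect("example.com", **kwargs))  # ('example.com', 30)
```

No sentinel object is needed, so the wrapper never depends on the callee's private implementation details.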
As far as changing the defaults, consider:
>>> def f(x=3): return x
...
>>> f()
3
>>> f.__defaults__ = (42,)
>>> f()
42
The current PEP design does not provide for this functionality for late-bound defaults.
Remember, though: the true comparison should be something like this:

_SENTINEL = object()
def f(x=_SENTINEL):
    if x is _SENTINEL:
        x = []
    return x

Can you change that from a new empty list to something else? No. All you can do, by mutating the function's dunders, is change the sentinel, which is actually irrelevant to the function's true behaviour. You cannot change the true default.

Consider also this form:

default_timeout = 500
def connect(s, timeout=default_timeout): ...
def read(s, timeout=default_timeout): ...
def write(s, msg, timeout=default_timeout): ...

You can now, if you go to some effort, replace the default in every function. Or you can do this, and not go to any effort at all:

def read(s, timeout=>default_timeout): ...

The true default is now exactly what the function signature says. And if you really want to, you CAN change read.__defaults__ to have an actual early-bound default, which means it will then never check the default timeout.

Introspection is no worse in this way than writing out the code longhand. It is significantly better, because even though you can't change it from a late-bound default_timeout to a late-bound read_timeout, you can at least see the value with external tools. You can't see that if the default is replaced in the body of the function.
I realize the response will be that code shouldn't need to do these things, but I do not think we should be adding features to Python that limit what introspection and runtime modifications user code can do.
The response is more that the code CAN'T do these things, by definition. To the extent that you already can, you still can. To the extent that you should be able to, you are still able to. (And more. There are things you're capable of with PEP 671 that you definitely shouldn't do in normal code.)
A classic example of this is PEP 362 function signature objects. I don't think we should be adding parameter types that cannot be represented in a Signature, although of course a Signature might need to be extended to support new features. Signature objects were added for a reason (see the PEP), and I don't think we should just say "well, that's not important for this new feature". Also note that over time we've removed restrictions on Signatures (see, for example, Argument Clinic). So I don't think adding restrictions is the direction we want to go in.
Same again. If you consider the equivalent to be a line of code in the function body, then the signature has become MASSIVELY more useful. Instead of simply seeing "x=<object object at 0x7fba1b318690>", you can see "x=>[]" and be able to see what the value would be. It's primarily human-readable (although you could eval it), but that's still a lot better than seeing a meaningless sentinel.

And yes, you can make help() more readable by using a nicely named sentinel, but then you have to go to a lot more effort in your code, worry about pickleability, etc, etc. Using a late-bound default lets you see the true default, not a sentinel.

ChrisA
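What a Signature actually shows for the sentinel idiom today can be seen with the standard inspect module (a minimal sketch of the status quo being criticized):

```python
# inspect.signature faithfully reports the default -- but for the
# sentinel idiom, that default is a meaningless anonymous object.
import inspect

_SENTINEL = object()

def f(x=_SENTINEL):
    if x is _SENTINEL:
        x = []          # the real default never appears in the signature
    return x

print(inspect.signature(f))   # e.g. (x=<object object at 0x7f...>)
```

PEP 362 Signature objects would need extending to represent late-bound defaults, but the information they would then carry ("x=>[]") is far more meaningful than the sentinel shown here.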