
I'll get this out of the way now: this is probably a dumb idea. However, I have found myself working around a use case lately and have a thought of a cleaner solution.

The Situation
-------------

As you know, you can use default arguments when defining a function. Sometimes you want to pass an argument through from a function's arguments to another function that the original ("outer") calls. I run into this during inheritance-related calls and when interfacing between two compatible APIs. Here's an example:

    class X:
        def f(self, name="N/A"):
            ...

    class Y(X):
        def f(self, name="N/A"):
            super().f(name)

    Y().f()

So the name argument is passed through from Y's f() to X's. In this case, we want to keep the default argument the same.

Current Solutions
-----------------

Keeping the default arguments the same can be accomplished in a few different ways right now, but they are less than ideal:

* Manually keep the two synced (like above), a DRY violation.
* Use introspection at definition time (using X.f.__defaults__ and X.f.__kwdefaults__), which is fragile (if the X.f attributes are changed later, or the function definition's signature changes).
* Use a dummy value to indicate that the default should be supplied by X's f(). This requires some clutter in the function and possibly a less obvious default argument.

Here's an example of that last one:

    class DeferredType: pass
    DEFERRED = DeferredType()

    class X1:
        def f(self, name="N/A"):
            if name is DEFERRED:
                name = "N/A"  # DRY
                #name = X1.f.__defaults__[0]  # fragile
            ...

    class Y1(X1):
        def f(self, name=DEFERRED):
            super().f(name)

Of course, the DEFERRED object could be any other specific object, particularly None. Here's another, similar example that reminds me of Jack's example[1] from the recent thread on mutable default arguments. It is the solution I am using currently:

    class X2:
        def f(self, name=None):
            if name is None:
                name = "N/A"
            ...

    class Y2(X2):
        def f(self, name=None):
            super().f(name)

Of course, using DEFERRED instead of None would allow you to use None as the actual default argument, but you get the idea. Notice that the actual default argument is no longer in the argument list, so it is less obvious and no longer introspectable. I have also run into this use case outside of inheritance situations.

A New Solution
--------------

Provide a builtin version of a Deferred singleton, like None or NotImplemented. When resolving arguments for a call, if an argument is this Deferred object, replace it with the default argument for that parameter on the called function, if there is one. If there isn't, act as though that argument was not passed at all. Here's how it would look:

    class X:
        def f(self, name="N/A"):
            ...

    class Y(X):
        def f(self, name=Deferred):
            super().f(name)  [2]

This way, Y's f has a sensible default argument for "name"[3] and implies where to find the actual default argument (on super().f)[4]. X's f is not cluttered up and has a clear, introspectable default argument.

Thoughts?

-eric

p.s. While I usually pursue random flights of fancy (answers looking for questions), this idea was born of an honest-to-goodness practical use case I run into semi-frequently. One thing I've learned about the Python community is that ideas that come from real [and common] use cases have a much better chance of not dying a quick and painful death (not holding my breath here, though <wink>).

[1] http://mail.python.org/pipermail/python-ideas/2011-May/010263.html

[2] As a bonus, an "outer" function could take advantage of different default arguments when calling more than one function:

    def f(name="N/A"): ...
    def g(name="Undefined"): ...

    def h(name=Deferred):
        f(name)
        g(name)

[3] "Deferred" indicates what kind of argument it is in a straightforward way. However, I don't want the name to be confused with anything like Twisted Deferred objects, so whatever name is clear would, of course, be appropriate.

[4] That is true from a people perspective, but not for introspection; e.g. if you want to introspect the actual default argument for Y.f, you would have to be able to programmatically determine how that parameter is used in the function body...
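[Editorial note: since no Deferred builtin exists, the closest runnable approximation today is the sentinel workaround from "Current Solutions". A minimal self-contained sketch (DeferredType/DEFERRED are illustrative names, not real builtins; f returns the resolved value so the behavior is visible):

```python
# A runnable version of the sentinel workaround described above.
class DeferredType:
    def __repr__(self):
        return "DEFERRED"

DEFERRED = DeferredType()

class X:
    def f(self, name="N/A"):
        # Resolve the sentinel in one place; the real default lives here.
        if name is DEFERRED:
            name = "N/A"
        return name

class Y(X):
    def f(self, name=DEFERRED):
        # Pure pass-through: X.f supplies the actual default.
        return super().f(name)
```

Y().f() returns "N/A" and Y().f("spam") returns "spam", which is what the proposed builtin would arrange implicitly.]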

On 7/13/2011 3:26 PM, Eric Snow wrote:
I believe

    class Y(X):
        def f(self, name=None):
            super().f(name)
        f.__defaults__ = X.f.__defaults__

will more or less do what you want. Using 'super()' instead of 'X' does not seem to work. The default replacement might be done with a function or class decorator.
    Y().f()

prints 'N/A'

-- Terry Jan Reedy
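[Editorial note: Terry's suggestion, written out as a runnable sketch. The assignment works inside the class body because f is an ordinary local name there; the methods return name so the result is checkable:

```python
class X:
    def f(self, name="N/A"):
        return name

class Y(X):
    def f(self, name=None):
        return super().f(name)
    # Share X.f's defaults tuple at definition time, so Y.f's default
    # for "name" becomes "N/A" rather than None.
    f.__defaults__ = X.f.__defaults__
```

Y().f() now returns "N/A": the default is substituted when Y.f is called, so nothing special has to happen in the method body.]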

On Wed, Jul 13, 2011 at 3:36 PM, Terry Reedy <tjreedy@udel.edu> wrote:
Yeah, but if the defaults of X.f get changed at runtime, after the definition of Y, the defaults for Y.f will likely be out of sync. Also, usually I want to be selective about which defaults I assume. Following your recommendation would involve calculating a new __defaults__ and a new __kwdefaults__. PEP 362 would make this easier, but probably won't be done for a while.

However, you're right that a decorator could probably handle this. The downside is that it would not be a trivial decorator to get what I am after.

Thanks for the feedback.

-eric
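[Editorial note: the naive version of such a decorator is actually short; the non-trivial part Eric alludes to is selectivity (copying only some defaults, and keeping positional defaults aligned when signatures differ). A hedged sketch, with inherit_defaults as a hypothetical name:

```python
def inherit_defaults(source):
    """Class decorator (hypothetical helper): copy default arguments from
    same-named functions on `source` onto the decorated class's methods.

    Naive: it assumes the overriding method's defaulted parameters line
    up exactly with the parent's, which is the fragile part.
    """
    def decorate(cls):
        for attr, value in vars(cls).items():
            parent = getattr(source, attr, None)
            if callable(value) and callable(parent):
                value.__defaults__ = getattr(parent, "__defaults__", None)
                kwdefaults = getattr(parent, "__kwdefaults__", None)
                if kwdefaults is not None:
                    value.__kwdefaults__ = dict(kwdefaults)
        return cls
    return decorate

class X:
    def f(self, name="N/A"):
        return name

@inherit_defaults(X)
class Y(X):
    def f(self, name=None):
        return super().f(name)
```

With this, Y().f() returns "N/A" even though Y.f was written with a None placeholder; it still goes stale if X.f's defaults change after Y is defined.]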

Eric Snow wrote:
The problem is wider than inheritance of classes. I often have a series of functions that share identical signatures, including defaults:

    def mean(data, missing=False): pass
    def variance(data, missing=False): pass
    def pvariance(data, missing=False): pass
    def stdev(data, missing=False): pass
    def pstdev(data, missing=False): pass

For API consistency, a change to one signature requires corresponding changes to the others. I'm not *entirely* sure that this is a problem that needs solving, although it is a nuisance. (See my final paragraph, below.) What do other languages provide?

Wild idea: if Python had a "signature object", you could do something like this:

    common = signature(data, missing=False)

    def mean*common: pass

where poor old * gets yet another meaning, namely "use this signature as the function parameter list". (I'm not wedded to that syntax; it was just the first idea that came to mind.)
This becomes messy if you only want to "inherit" the default value for one parameter rather than all.
I believe that the usual name for this is a sentinel.
I don't believe that should be a public object. If it's public, people will say "Yes, but what do you do if you want the default to actually be the Deferred singleton?" It's the None-as-sentinel problem all over again... sometimes you want None to stand in for no value, and sometimes you want it to be a first-class value.

Better(?) to make it syntax, and overload * yet again:

    def f(arg, another_arg, name=*): pass

This has the advantage(?) that inside the body of f, name doesn't get a spurious value Deferred. If the caller doesn't supply a value for name, and f tries to use it (except as below), then you get an unambiguous UnboundLocalError. The exception is, calling another function with it as an argument is permitted. This call in the body of the function:

    result = g(1, 3, name, keyword="spam")

behaves something like this:

    try:
        name
    except NameError:
        result = g(1, 3, keyword="spam")
    else:
        result = g(1, 3, name, keyword="spam")

only handled by the compiler.

Seems awfully complicated just to solve a few DRY violations. I'm not sure it is worth the added complexity of implementation, and the added cognitive burden of learning about it.

That's not to say that DRY violations in function signatures aren't a real problem -- I've run into them myself. But I question how large a problem they are, in practice. Language support for solving them seems to me to be a case of using a sledgehammer to crack a peanut. DRY is mostly a problem for implementation, not interface: implementation should remain free to change rapidly, and so repeating yourself makes that harder. But interface (e.g. your library API, including function signatures) should be static for long periods of time, so redundancy in function signatures is less important.

-- Steven

Steven D'Aprano wrote:
If you think you might change your mind about the default value of "missing", you can do:

    default_missing = False

    def mean(data, missing=default_missing): ...
    def variance(data, missing=default_missing): ...

etc.
That just leads to another version of the "None-as-sentinel" problem, though. What happens if you don't want to *directly* pass it to another function, but via some intermediate name or data structure? And how do debuggers deal with it?

-- Greg

On Wed, Jul 13, 2011 at 7:23 PM, Steven D'Aprano <steve@pearwood.info> wrote:
Wild idea: if Python had a "signature object", you could do something like
You mean like PEP 362? I would love it. Brett has talked about taking it from PyPI to 3.3, but I'm guessing he is still pretty busy with life. (There, I just poked the tracker issue)
Yeah, I had meant that, and agree it is messy.
The idea is to have an object that has no meaning in normal code. Sure we use None all over, but the sample size for builtin singletons in Python is pretty small to say that would happen with a new one. But maybe None started as an object that didn't have any meaning in normal code (doubt it). Agreed that if it takes meaning it won't be much different than using None as a sentinel. However, it would still not be None, which would still be worth it.
Yeah, that is neat, but it is a lot more complicated than having a builtin singleton that implicitly triggers a simple(?) behavior when passed as an argument to a function. You can trust me that my way is better! I'm an expert on the inner workings of the CPython implementation! <wink> At least, the singleton *seems* conceptually less complicated to me.
I gotta say, I love how practical Python and its community are. Not only do things have to be useful, but widely useful and substantially more useful than the existing way of doing it. I remember Nick said something along those lines a few months ago. I think that attitude has kept the language awesome.

In this situation, you're probably right that the use case isn't large enough in practice to be important for the language. For now I guess I can do something like this, which is along the lines of what I was imagining would happen implicitly:

    def handle_deferred(f, name, locals_, fname=None):
        # simpler with a function signature object...
        if not fname:
            fname = name
        if locals_[name] is not DEFERRED:
            return locals_[name]
        code = f.__code__
        kwonly_names = code.co_varnames[
                code.co_argcount:code.co_argcount + code.co_kwonlyargcount]
        if fname in kwonly_names:
            try:
                return f.__kwdefaults__[fname]
            except (KeyError, TypeError):
                # TypeError would normally be raised if f were called and a
                # non-default parameter were not passed an argument, so
                # that's what we raise here and below.
                raise TypeError("Can't convert DEFERRED for {}".format(name))
        default_names = code.co_varnames[
                code.co_argcount - len(f.__defaults__):code.co_argcount]
        try:
            return f.__defaults__[default_names.index(fname)]
        except ValueError:
            raise TypeError("Can't convert DEFERRED for {}".format(name))

    def f(x=5):
        ...

    def g(x=DEFERRED):
        x = handle_deferred(f, "x", locals())
        f(x)

Thanks for having a look.

-eric

If you want perfect API compatibility from the sub-class to the super, why not just use *args, **kwargs? It kills the method signature, I guess, but just make your docstring say, "Check my superclass for the exact calling API."
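[Editorial note: one hedged refinement of this suggestion — pairing the pass-through with functools.wraps copies the superclass method's metadata onto the override and records it via __wrapped__, so introspection tools that follow __wrapped__ can recover the real signature and the docstring note becomes less necessary. A sketch:

```python
import functools

class X:
    def f(self, name="N/A", greeting="hello"):
        return "{}, {}".format(greeting, name)

class Y(X):
    @functools.wraps(X.f)  # copies metadata and sets __wrapped__
    def f(self, *args, **kwargs):
        # Perfect pass-through: only X.f declares the defaults.
        return super().f(*args, **kwargs)
```

The trade-off stands: Y.f's own compiled signature is just (*args, **kwargs), so the defaults live in exactly one place at the cost of an opaque override.]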

On Thu, Jul 14, 2011 at 11:23 AM, Steven D'Aprano <steve@pearwood.info> wrote:
+1

A PEP 362 based answer would likely involve assigning to __signature__ on the wrapping function and making good use of Signature.bind() inside it. There are also much bigger problems that PEP 362 will let us solve (e.g. decent introspection of builtin functions, and functions that use pass-through *args and **kwds).

Cheers,
Nick.

--
Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
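[Editorial note: PEP 362 was eventually accepted and landed in Python 3.3 as the inspect.signature API; BoundArguments.apply_defaults() arrived in 3.5. A sketch of the bind-based forwarding Nick describes, with forward_defaults as a hypothetical helper:

```python
import inspect

def f(name="N/A", count=1):
    return (name, count)

def forward_defaults(func, **overrides):
    # Validate the overrides against func's parameters, then fill in
    # func's own declared defaults for everything not overridden.
    sig = inspect.signature(func)
    bound = sig.bind(**overrides)
    bound.apply_defaults()
    return func(*bound.args, **bound.kwargs)
```

forward_defaults(f) returns ("N/A", 1) and forward_defaults(f, count=3) returns ("N/A", 3), without the caller ever restating f's defaults.]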

participants (6):

- Carl Johnson
- Eric Snow
- Greg Ewing
- Nick Coghlan
- Steven D'Aprano
- Terry Reedy