On Thu, Jun 24, 2021 at 7:51 PM Steven D'Aprano wrote:
I'm not sure if you completely understand the use-case I was describing, so let me clarify for you with a concrete example.
Ints have a "bit_length" method, starting from Python 2.7. I needed to use that method going all the way back to version 2.4. I have an implementation that works, so I could backport that method to 2.4 through 2.6, except that you can't monkey-patch builtins in Python.
So monkey-patching is out.
(And besides, I wouldn't want to monkey-patch it: I only need that method in one module. I want to localise the change to only where it is needed.)
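For anyone who hasn't tried it, here's a minimal sketch of why monkey-patching is out: CPython's built-in types reject attribute assignment outright. (The `bit_length_backport` name is just illustrative.)

```python
# Attempting to patch a method onto a builtin fails on CPython,
# because built-in/extension types are immutable.
try:
    int.bit_length_backport = lambda self: len(bin(abs(self))) - 2
    patched = True
except TypeError:
    # CPython raises TypeError here, e.g.
    # "cannot set 'bit_length_backport' attribute of immutable type 'int'"
    patched = False

print(patched)  # False on CPython
```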
Subclassing int wouldn't help. I need it to work on actual ints, and any third-party subclasses of int, not just my own custom subclass.
(And besides, have you tried to subclass int? It's a real PITA. It's easy enough to write a subclass, but every operation on it returns an actual int instead of the subclass. So you have to write a ton of boilerplate to make int subclasses workable. But I digress.)
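A quick sketch of the PITA in question: a plain int subclass loses its type on every operation, so each dunder method has to be wrapped by hand to preserve it. (`MyInt` and `KeepInt` are illustrative names.)

```python
class MyInt(int):
    """An int subclass with no operator overrides."""

x = MyInt(5) + MyInt(3)
print(type(x).__name__)  # 'int' -- the subclass is silently lost

# Keeping the subclass means wrapping every operator by hand:
class KeepInt(int):
    def __add__(self, other):
        return type(self)(int.__add__(self, other))
    # ...and likewise for __sub__, __mul__, __radd__, and the rest.

y = KeepInt(5) + KeepInt(3)
print(type(y).__name__)  # 'KeepInt'
```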
So a subclass is not a good solution either.
That leaves only a function.
But that hurts code readability and maintenance. In 2.7 and above, bit_length is a method, not a function. All the documentation for bit_length assumes it is a method. Every tutorial that uses it has it as a method. Other code that uses it treats it as a method.
Except my code, where it is a function.
Using a function is not a *terrible* solution to the problem of backporting a new feature to older versions of Python. I've done it dozens of times and it's not awful. **But it could be better.**
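The function-style backport described above might look roughly like this: use the real method where it exists, otherwise fall back to a pure-Python implementation. (This is a sketch of the pattern, not Steven's actual code.)

```python
try:
    (0).bit_length  # Python 2.7+ / 3.1+ provide the method

    def bit_length(n):
        return n.bit_length()

except AttributeError:  # older Pythons: pure-Python fallback

    def bit_length(n):
        n = abs(n)
        length = 0
        while n:
            n >>= 1
            length += 1
        return length

print(bit_length(37))  # 6, same as (37).bit_length()
```

Every call site then reads `bit_length(n)` instead of the `n.bit_length()` that all the documentation and tutorials show.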
Why can't the backport be a method, just like in 2.7 and above?
With extension methods, it can be.
You've given some great arguments for why (5).bit_length() should be allowed to be a thing. (By the way - we keep saying "extension METHODS", but should this be allowed to give non-function attributes too?) But not once have you said where getattr(), hasattr(), etc come into this. The biggest pushback against this proposal has been the assumption that getattr(5, "bit_length")() would have to be the same as (5).bit_length(). Why is that necessary? I've never seen any examples of use-cases for that.

Let's tighten this up into a real proposal. (I'm only +0.5 on this, but am willing to be swayed.)

* Each module has a registration of (type, name, function) triples.
* Each code object is associated with a module.
* Compiled code automatically links the module with the code object. (If you instantiate a code object manually, it's on you to pick a module appropriately.)
* Attribute lookups use three values: object, attribute name, and module.
* If the object does not have the attribute, its MRO is scanned sequentially for a registered method. If one is found, use it.

Not mentioned in this proposal: anything relating to getattr or hasattr, which will continue to look only at real methods. There may need to be an enhanced version of PyObject_GetAttr which is able to look up extension methods, but the current one simply wouldn't.

Also not mentioned: ABC registration. If you register a class as a subclass of an ABC and then register an extension method on that class, isinstance() will say that it's an instance of the ABC, but the extension method won't be there. I'm inclined to say "tough luck, don't do that", but if there are strong enough use cases, that could be added.

But otherwise, I would FAR prefer a much simpler proposal, one which changes only the things that need to be changed.

ChrisA
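For illustration, the registry-and-MRO-scan idea above can be sketched in pure Python. This is a toy model, not the proposed implementation: the real thing would hook into attribute lookup itself, the registry would be per-module, and names like `register` and `lookup` are made up here.

```python
# Toy model of a per-module extension-method registry.
_extensions = {}  # (type, name) -> function

def register(tp, name, func):
    _extensions[(tp, name)] = func

def lookup(obj, name):
    # Real attributes win; only on failure is the MRO
    # scanned sequentially for a registered extension method.
    try:
        return getattr(obj, name)
    except AttributeError:
        for cls in type(obj).__mro__:
            if (cls, name) in _extensions:
                # Bind the function as a method via the descriptor protocol.
                return _extensions[(cls, name)].__get__(obj, type(obj))
        raise

register(int, "double", lambda self: self * 2)
print(lookup(5, "double")())   # 10
print(hasattr(5, "double"))    # False: plain hasattr is unaffected
```

Note how the last line models the key point of the proposal: getattr() and hasattr() keep seeing only real methods, while code going through the module-aware lookup sees the extension.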