Having had some time to let this settle, I hope it doesn't get abandoned just because it was too complicated to come to a conclusion.
I'd like to attempt to summarize the main ideas as follows.
1) Currently the "fast call" optimization is implemented by explicitly checking for a set of types (builtin functions, methods, method descriptors, and functions). This is both ugly, as it requires listing several special cases, and restrictive, as it locks any other types out of participating in this protocol. This PEP proposes elevating this to a contract that other types can participate in.
2) inspect and friends perform hard-coded checks against these non-extendable types, again making it difficult for other types to be truly first-class citizens and breaking attempts at duck typing.
3) The current hierarchy of builtin_function_or_method vs. function vs. instancemethod could use some cleanup for consistency and extensibility.
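To make points (1)-(3) concrete, here is a small runnable illustration. This is my own sketch, not code from the PEP; it assumes CPython 3.7+, where `types.MethodDescriptorType` is exposed, and `CallableProxy` is a hypothetical duck-typed callable invented for the example.

```python
import functools
import inspect
import types

# (1) The "fast call" optimization special-cases a closed set of
#     concrete types, roughly:
special_cased = (types.BuiltinFunctionType,   # e.g. len
                 types.MethodDescriptorType,  # e.g. str.join
                 types.FunctionType)          # pure-Python def

# (2) inspect relies on hard-coded type checks, so a perfectly good
#     callable that is not one of the blessed types is locked out:
def plain(x):
    return x

class CallableProxy:
    """A duck-typed wrapper that behaves like a function."""
    def __init__(self, wrapped):
        functools.update_wrapper(self, wrapped)
        self._wrapped = wrapped

    def __call__(self, *args, **kwargs):
        return self._wrapped(*args, **kwargs)

proxy = CallableProxy(plain)
print(callable(proxy))            # True: it quacks like a function
print(inspect.isfunction(proxy))  # False: isinstance() on FunctionType

# (3) The existing classes share no common callable base class;
#     the only ancestor that builtin_function_or_method and
#     function have in common is object itself:
common = set(type(len).__mro__) & set(types.FunctionType.__mro__)
print(common)                     # {<class 'object'>}
```

In other words, there is currently no single base class or protocol that a third-party callable can opt into to get both the fast-call treatment and sensible answers from introspection.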
PEP 575 solves all of these by introducing a common base class, but they are somewhat separable. As for complexity, there are two metrics: the complexity of the delta (more lines of code in trickier places = worse, paid once) and of the final result (less code, less special-casing = better, paid for as long as the code is in use). I tend to think it's a good tradeoff to pay the former to improve the latter.
Jeroen, is this a fair summary? Are they fully separable?
Others, are these three valuable goals? At what cost? (E.g., (3) may have backwards-compatibility concerns if taken as far as possible.)
On Sun, May 20, 2018 at 1:15 PM, Jeroen Demeyer J.Demeyer@ugent.be wrote:
On 2018-05-19 15:29, Nick Coghlan wrote:
> That's not how code reviews work, as their complexity is governed by the number of lines changed (added/removed/modified), not just the number of lines that are left at the end.
Of course, you are right. I didn't mean literally that only the end result matters. But it should certainly be considered.
If you only do small incremental changes, complexity tends to build up because choices which are locally optimal are not always globally optimal. Sometimes you need to do some refactoring to revisit some of that complexity. This is part of what PEP 575 does.
> That said, "deletes more lines than it adds" is typically a point strongly in favour of a particular change.
This certainly won't be true for my patch, because there is a lot of code that I need to keep for backwards compatibility (in particular, all the old code for method_descriptor).
Going back to the review of PEP 575, I see the following possible outcomes:
(A) Accept it as is (possibly with minor changes).
(B) Accept the general idea but split the details up into several PEPs, which can still be discussed individually.
(C) Accept a minimal variant of PEP 575, only changing existing classes but not changing the class hierarchy.
(D) Accept some yet-to-be-written variant of PEP 575.
(E) Don't fix the use case that PEP 575 wants to address.
Petr Viktorin suggests (C). I am personally quite hesitant because that only adds complexity and it wouldn't be the best choice for the future maintainability of CPython. I also fear that this hypothetical PEP variant would be rejected because of that reason. Of course, if there is some general agreement that (C) is the way to go, then that is fine for me.
If people feel that PEP 575 is currently too complex, I think that (B) is a very good compromise. The end result would be the same as what PEP 575 proposes. Instead of changing many things at once, we could handle each class in a separate PEP. But the motivation of those mini-PEPs will still be PEP 575. So, in order for this to make sense, the general idea of PEP 575 needs to be accepted: adding a base_function base class and making various existing classes subclasses of that.
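For readers who haven't followed the PEP closely, the general idea can be sketched in pure Python. This is a hedged illustration only: PEP 575 defines these as C-level classes, and this sketch simplifies the exact names and relationships (for instance, it omits the intermediate defined_function class).

```python
# Hypothetical pure-Python sketch of the shape PEP 575 proposes:
# one common base class that both C-level and Python-level callables
# subclass, so checks can target the base instead of concrete types.

class base_function:
    """Common base class for all fast-callable function-like objects."""

class cfunction(base_function):
    """Would cover what builtin_function_or_method and
    method_descriptor handle today."""

class function(base_function):
    """Pure-Python functions."""

class bound_method(base_function):
    """Methods bound to an instance."""

# A single isinstance()/issubclass() check then covers all of them:
assert all(issubclass(cls, base_function)
           for cls in (cfunction, function, bound_method))
```

The point of the sketch is only the shape of the hierarchy: once a common base exists, both the fast-call machinery and inspect can test for one base class (or protocol) rather than enumerating concrete types.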
Python-Dev mailing list
Python-Dev@python.org
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: https://mail.python.org/mailman/options/python-dev/robertwb%40gmail.com