On Jun 26, 2019, at 07:34, Anders Hovmöller <boxed@killingar.net> wrote:
I 100% agree that this proposal is a bad idea. But I do have to play Devil's advocate here.
The the-code-is-understandable-at-face-value ship has already sailed. + doesn't mean add, it means calling a dunder function that can do anything.
No, + does mean add. But Python doesn’t know what it means to add two Fraction or Decimal or ndarray objects, so if you’re the one writing that class, you have to tell it. It still means add—unless you lie to your readers. And you can always lie to your readers; dunder methods aren’t needed for that. Sure, you could define Fraction.__add__(self, other) to print self to a file named str(other) and return clock(), but you could just as easily store the numerator in an attribute named “denominator”, or name the class “EmployeeRecord” instead of “Fraction”, or store the Fraction 1/2 in a variable named “pathname”. It’s not up to Python to prevent you from lying to your readers.
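To make the point concrete, here's a minimal sketch of what "telling Python what add means" looks like. `Frac` is a deliberately simplified stand-in for a class like the stdlib's fractions.Fraction, not the real thing; its `__add__` doesn't change what + means, it supplies the addition Python can't infer on its own:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Frac:
    num: int
    den: int

    def __add__(self, other):
        if not isinstance(other, Frac):
            return NotImplemented
        # a/b + c/d == (a*d + c*b) / (b*d) -- still genuinely addition.
        return Frac(self.num * other.den + other.num * self.den,
                    self.den * other.den)

print(Frac(1, 2) + Frac(1, 3))  # Frac(num=5, den=6)
```

Nothing stops you from writing an `__add__` that does something else entirely, but then the lie is in your code, not in the + operator.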
Foo.bar = 1 doesn't mean set bar to 1 but calling a dunder method.
No, it does mean setting bar to 1. The only difference between __add__ and __setattr__ is that the latter has default behavior that works for many classes. If type(Foo) has disk-backed attributes, or immutable attributes, or attributes that dir() lists in reverse order of assignment rather than arbitrary order, you have to tell Python how to do that. Unless you’re lying, you’re defining what it means to set the bar attribute to 1, not defining Foo.bar = 1 to mean something different from setting the bar attribute.

The problem isn’t that __setself__ could be used to lie; the problem is that __setself__ can’t be used in a way that isn’t lying. None of the suggested examples are about providing a way to define what binding x to 1 means in the local/classdef/global namespace; they’re all about providing a way to make x = 1 not mean binding x to 1 in that namespace at all. In particular, the best example we’ve seen amounts to “Python doesn’t have a send operator like <-, so instead of adding one, let’s allow people to misuse = to mean send rather than assign”.

The obvious way to justify this is by appeal to descriptors: the __set__ method isn’t there because people want to use descriptors directly, it’s there because people do want to use classes with custom attributes like properties, classmethods, etc., and descriptors make defining those classes easier. Maybe in a better example, we’d see that __setself__ is similarly there to make it easier to define namespaces with custom bindings, and that people do want those namespaces. If so, the reason not everyone is convinced is that we’ve only seen bad examples so far. But then someone needs to give a good example.
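As a sketch of the “immutable attributes” case above: a class whose __setattr__ says that an attribute may be bound once and never rebound. The class name is illustrative, but the point is that __setattr__ still defines what setting an attribute *means* for this class; it doesn’t make assignment mean something other than assignment:

```python
class WriteOnce:
    """Attributes may be set exactly once; rebinding raises AttributeError."""

    def __setattr__(self, name, value):
        if name in self.__dict__:
            raise AttributeError(f"{name} is already set")
        # Delegate to the default machinery for the actual binding.
        object.__setattr__(self, name, value)

w = WriteOnce()
w.bar = 1           # works: w.bar really is set to 1
try:
    w.bar = 2       # refused, by this class's definition of "set"
except AttributeError as e:
    print(e)        # bar is already set
```

Someone reading `w.bar = 1` still understands it at face value: it sets bar to 1, under this class’s (stricter) rules for setting.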
In Python, code basically can't be understood at face value.
This is the Humpty Dumpty argument from Alice. English couldn’t be understood at face value if Humpty could use any word to mean anything he wanted, rather than what Alice expected that word to mean. And yet, among normal speakers—even with slightly different idiolects, even in discourses that explicitly redefine words (as with most math papers, which is probably what Lewis Carroll had in mind)—English actually can be understood; it just can’t prevent Humpty from misusing it.