My proposal is, I believe, exactly the same as the "Reduced special methods set" of PEP 335, that is, the phase-2-only part; the __r*__ methods there match what I proposed. A good portion of PEP 335's complexity goes into the "phase 1" special methods, which only matter if you want to overload the short-circuiting behavior itself on a per-object basis, and I don't think there is much need for that.

Assuming the methods live on the class, as Python usually requires anyway, a class could either be short-circuiting and boolean-only (not defining __and2__ or __sand__ or __xand__ or whatever it ends up being called), or define those methods and thereby get customized non-short-circuiting behavior. (But because the class controls how much calculation happens, this can still limit computation, as in my Plumbum example; the only requirement is that the operand exist so it can be passed to the method.)
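To make that concrete, here is a rough sketch of the kind of class I have in mind, using the __and2__/__rand2__ names from PEP 335's reduced set; the class itself and its details are purely illustrative, and of course nothing dispatches to these methods today:

class ElementwiseBool:
    """Toy array-like class whose `and` would combine elementwise instead of
    short-circuiting, if the reduced-set hooks existed."""

    def __init__(self, values):
        self.values = list(values)

    def __and2__(self, other):
        # Both operands are already evaluated by the time we get here, so we
        # simply combine them elementwise (no short-circuiting).
        other_vals = other.values if isinstance(other, ElementwiseBool) else other
        return ElementwiseBool(a and b for a, b in zip(self.values, other_vals))

    def __rand2__(self, other):
        # Reflected form, for `plain_object and elementwise_object`.
        other_vals = other.values if isinstance(other, ElementwiseBool) else other
        return ElementwiseBool(a and b for a, b in zip(other_vals, self.values))

Under the proposal, ElementwiseBool([1, 0]) and [1, 1] would call __and2__ and give an elementwise ElementwiseBool([1, 0]), while any class that simply leaves these methods out keeps today's bool-driven, short-circuiting `and`.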

If it looks like this would be acceptable to the NumPy folks, then maybe PEP 335 could be changed to make the Reduced Special Methods set the main suggestion, with the extension to the two-phase approach left as a side note?


Henry Schreiner

On Nov 30, 2015, at 2:28 PM, Andrew Barnert <abarnert@yahoo.com> wrote:

PEP 335 uses the standard rules for 'r' methods, so y.__rand2__ will only be used if issubclass(type(y), type(x)) or if x.__and2__ doesn't exist. Wouldn't having different rules for 'r' methods in this case than for all other operators be confusing? (Maybe not; the comparison operators are the closest analogs, and they have special rules too...)
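(For reference, a rough pure-Python model of those standard reflected-method rules as they would apply to __and2__/__rand2__; the helper name and the simplifications around NotImplemented are illustrative, not anything from the PEP:)

def and2_dispatch(x, y):
    """Rough model of the usual reflected-method precedence applied to the
    hypothetical __and2__/__rand2__ pair (the helper name is mine)."""
    x_t, y_t = type(x), type(y)
    # The reflected method is tried first only when y's type is a proper
    # subclass of x's type and provides __rand2__.
    if y_t is not x_t and issubclass(y_t, x_t) and hasattr(y_t, "__rand2__"):
        result = y.__rand2__(x)
        if result is not NotImplemented:
            return result
    if hasattr(x_t, "__and2__"):
        result = x.__and2__(y)
        if result is not NotImplemented:
            return result
    if hasattr(y_t, "__rand2__"):
        result = y.__rand2__(x)
        if result is not NotImplemented:
            return result
    # Neither side handled it: give the plain `x and y` result (the real
    # operator would not even have evaluated y when x is falsy).
    return y if x else x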

Meanwhile, looking at PEP 335 again, it seems like we could very easily add just the "phase-2" part today (which would allow non-short-circuiting overloads, while preserving short-circuiting for types that don't overload), and then add the "phase-1" part (which would add optional short-circuiting overloads) later if needed, or just never add it if not.
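A sketch of what "phase-2 only" could mean in practice, with the right-hand operand passed lazily so that short-circuiting survives for types that don't overload; this is my own modelling of the semantics, not wording from the PEP, and it elides the reflected-precedence and NotImplemented details shown above:

def and_phase2(x, lazy_y):
    """Sketch of `x and y` with only the phase-2 hooks; lazy_y is a thunk for
    the right-hand operand so that short-circuiting stays possible."""
    if not hasattr(type(x), "__and2__") and not x:
        # Left side doesn't overload and is falsy: short-circuit as today,
        # without ever evaluating the right-hand operand.
        return x
    y = lazy_y()
    if hasattr(type(x), "__and2__"):
        return x.__and2__(y)
    if hasattr(type(y), "__rand2__"):
        return y.__rand2__(x)
    return y if x else x  # neither side overloads: today's `and` result

Here `a and b` would behave like and_phase2(a, lambda: b); whether __rand2__ should still be consulted when the left operand is falsy and doesn't overload is exactly the sort of detail the PEP text would need to pin down.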