On Thu, Feb 7, 2019 at 4:03 PM Steven D'Aprano wrote:
> At the risk of causing confusion^1, we could have a "vector call" syntax:
>
>     # apply len to each element of obj, instead of obj itself
>     len[obj]
>
> which has the advantage that it only requires that we give functions a
> __getitem__ method, rather than adding new syntax. But it has the
> disadvantage that it doesn't generalise to operators, without which I
> don't think this is worth bothering with.
Generalizing to operators is definitely going to require new syntax, since both operands can be arbitrary objects. So if that's essential to the idea, we can instantly reject anything that's based on functions (like "make multiplying a function by a tuple equivalent to blah blah blah"). In that case, we come straight to a few key questions:

1) Is this feature even worth adding syntax for? (My thinking: "quite possibly", based on matmul's success despite having an even narrower field of use than this.)

2) Should it create a list? A generator? Something that depends on the type of the operand? (Me: "no idea")

3) Does the Julia-like "x." syntax pass the grit test? (My answer: "nope")

4) If not, what syntax would be more appropriate?

This is a general-purpose feature akin to comprehensions (and, in fact, can be used in place of some annoyingly verbose comprehensions). It needs to be easy to type and read.

Pike's automap syntax is to subscript an array with [*], implying "subscript this with every possible value". It's great if you want to do just one simple thing:

    f(stuff[*])    # [f(x) for x in stuff]
    stuff[*][1]    # [x[1] for x in stuff]

but clunky for chained operations:

    (f(stuff[*])[*] * 3)[*] + 1    # [f(x) * 3 + 1 for x in stuff]

That might not be a problem in Python, since you can always just use a comprehension if vectorized application doesn't suit you.

I kinda like the idea, but the devil's in the details.

ChrisA
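For comparison, here is what the comprehension spellings of those automap examples look like when run today. All the names (f, g, stuff, pairs, nums) are placeholders I've picked for illustration:

```python
# Comprehension equivalents of the Pike automap examples above;
# f, g, stuff, pairs and nums are placeholder names for this demo.

f = str.upper
stuff = ["abc", "de"]

# Pike: f(stuff[*])
print([f(x) for x in stuff])         # ['ABC', 'DE']

pairs = [(1, "a"), (2, "b")]
# Pike: stuff[*][1]
print([x[1] for x in pairs])         # ['a', 'b']

# Pike's chained (f(stuff[*])[*] * 3)[*] + 1, with a numeric f:
g = lambda x: x + 10
nums = [1, 2, 3]
print([g(x) * 3 + 1 for x in nums])  # [34, 37, 40]
```

The chained case shows the trade-off: one comprehension stays readable where the automap form needs three separate [*] markers.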