On Thu, Feb 7, 2019 at 4:27 PM David Mertz <mertz@gnosis.cx> wrote:

> Actually, if I wanted an operator, I think that @ is more intuitive than extra dots.  Vectorization isn't matrix multiplication, but they are sort of in the same ballpark, so the iconography is not ruined.

Well, vectorization is kinda the *opposite* of matrix multiplication -- matrix multiplication treats the matrix as a whole, rather than applying multiplication to each element. And it is certainly the opposite in the numpy case.

Which gives me an idea -- we could make an object that applied operators (and methods??) to each element individually, and use the @ operator when you wanted the method to act on the whole object instead.

Note: I haven't thought about the details at all -- may not be practical to use an operator for that.
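
To make the idea a bit more concrete anyway, here's roughly the kind of thing I mean (the name and semantics are entirely made up -- just a sketch, not a proposal):

    class ElementWise:
        """Wrap a sequence; ordinary operators apply to each element."""
        def __init__(self, seq):
            self._seq = list(seq)

        def __mul__(self, other):
            # element-by-element
            return ElementWise(item * other for item in self._seq)

        def __matmul__(self, func):
            # @ hands the *whole* underlying sequence to func
            return func(self._seq)

        def __iter__(self):
            return iter(self._seq)

    v = ElementWise([1, 2, 3])
    list(v * 2)   # [2, 4, 6] -- elementwise
    v @ sum       # 6 -- acts on the object as a whole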

>    (Vec(seq) * 2).name.upper()

> Or:

>    vec_seq = Vector(seq)
>    (vec_seq * 2).name.upper()
>    # ... bunch more stuff
>    seq = vec_seq.unwrap()

what type would .unwrap() return?

One of the strengths of the "operator" approach is that it could apply to any (appropriately mutable) sequence and keep that sequence type. I'm not sure how much that actually matters, as I'm expecting this is a 99% list case anyway.

And why would .unwrap() be required at all -- as opposed to, say:

seq = list(vec_seq)
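
If the wrapper is iterable -- as in the hypothetical ElementWise sketch above -- that round trip is all it takes; list() does the "unwrapping" for free:

    vec_seq = ElementWise(seq)   # made-up wrapper from the sketch above
    # ... bunch more elementwise stuff ...
    seq = list(vec_seq)          # back to a plain list -- no unwrap() needed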

> I'm not saying the double dots are terrible, but they don't read *better* than wrapping (and optionally unwrapping) to me.

nor to me.

> Well... your maps are kinda deliberately ugly.

That's actually pretty key -- in fact, if you wanted to apply a handful of operations to each item in a sequence, you would probably use a single expression (if possible) in a lambda in a map, or in a comprehension, rather than chaining maps.

Even if it were more complex, you could write a function and then apply it with a map or comprehension.
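
For instance (raw_names here is just a made-up placeholder):

    # chaining maps:
    cleaned = map(str.strip, map(str.lower, raw_names))

    # vs. one expression in a single comprehension:
    cleaned = [name.lower().strip() for name in raw_names]

    # or, if it gets more complex, factor out a function:
    def clean(name):
        return name.lower().strip()

    cleaned = [clean(name) for name in raw_names]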

In the numpy case, compare:

c = sqrt(a**2 + b**2)

to

c = [sqrt(x**2 + y**2) for x, y in zip(a, b)]

so still a single comprehension. But:

1) given the familiarity of math expressions -- the first really does read a LOT better
2) the first version can be better optimized (by numpy)

So the questions become:

* For other than math with numbers (which we have numpy for), are there use cases where we'd really get that much extra clarity?

* Could we better optimize, say, a sequence of strings enough to make it all worth it?

-CHB


--
Christopher Barker, PhD

Python Language Consulting
  - Teaching
  - Scientific Software Development
  - Desktop GUI and Web Development
  - wxPython, numpy, scipy, Cython