On Thu, Oct 17, 2019 at 11:08 PM Anders Hovmöller

So how can you get a type error when doing

a*b

is the real question. And the answer is now obvious: any time the programmer thinks a and b are numbers but they are not.

If you start with the assumption that multiplication implies numbers, and then you see multiplication and it's not numbers, then yes, you may have problems. But why start with that assumption? Why, when you look at multiplication, should you therefore think that a and b are numbers? Instead, start with the assumption that MANY things can be added, multiplied, etc. Then it's not a logical type error to multiply strings, add dictionaries (if this proposal goes through), subtract timestamps, etc. It's just part of coding. And if you REALLY want early detection of logical type errors, use a type checker.

ChrisA
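As a minimal sketch of the point above, here are a few of the mentioned operations that are perfectly well-typed in standard Python despite not involving "numbers times numbers" (nothing here is assumed beyond the stdlib):

```python
from datetime import datetime, timedelta

# Multiplying a sequence by an integer repeats it -- no numbers needed
# on the left-hand side.
print("ab" * 3)       # repeats the string
print([0, 1] * 2)     # repeats the list

# Subtracting one timestamp from another yields a timedelta,
# not a number -- another well-defined, non-numeric operator use.
delta = datetime(2019, 10, 18) - datetime(2019, 10, 17)
print(delta == timedelta(days=1))
```

A static type checker such as mypy will happily accept all of the above, because these operators are defined on these types; it would flag something like `"ab" * "cd"` instead, which is where the early detection actually helps.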