
On Tue, Aug 24, 2021 at 4:31 PM Steven D'Aprano <steve@pearwood.info> wrote:
... midnight is not the annihilating element (zero).
Unless you're Cinderella, of course.
We conventionally represent clock times as numbers, but they're more akin to ordinal data. They have an order, but you can't do arithmetic on them.
Which puts them in a similar category to degrees Celsius - you can compare them, but there's no fundamental zero point. (It makes sense to say "what would the temperature be if it were 3° higher", but not "what is double this temperature" - not in Celsius.)
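(Python's stdlib happens to model exactly that distinction, if you want a concrete sketch: datetime.time values are ordered, but arithmetic on bare clock times is refused outright.)

    from datetime import time

    print(time(9, 30) < time(17, 0))   # ordering works fine: True

    # ... but there's no arithmetic on bare clock times:
    try:
        time(17, 0) - time(9, 30)
    except TypeError as err:
        print(err)   # unsupported operand type(s) for -: ...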
The fact is that what defines falsiness is use-case dependent. Numbers are the best example: zero can often be a perfectly meaningful number.
Fun fact: not only did the ancient Greek mathematicians not count zero as a number, but the Pythagoreans didn't count 1 as a number either.
https://www.britannica.com/topic/number-symbolism/Pythagoreanism
To be fair, that's more a question of "what is a number". If you have "a thing", is that "a number of things", or are they separate concepts? We have settled on the idea that "a thing" is a special case of "a number of things" mathematically, but try going to someone and saying "I have a number of wives" and seeing whether they think that 1 is a number. (And if zero is a number, then I also have a number of wives. Oh, and I have a number of pet cobras, too, so don't trespass on my property.)
Zero is a perfectly meaningful number, but it is special: it is the identity element for addition and subtraction, and the annihilating element for multiplication, and it has no multiplicative inverse in the Reals. Even in number systems which allow division by zero, you still end up with weird shit like 1/0 = 2/0 = 3/0 ...
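(A rough Python sketch of those properties. IEEE 754 floats are about the nearest thing to hand to a system that tolerates division by zero, so they stand in for the last point:)

    x = 42
    assert x + 0 == x       # identity for addition
    assert x - 0 == x       # ... and subtraction
    assert x * 0 == 0       # annihilator for multiplication

    # No multiplicative inverse: Python itself refuses, and 1 / 0
    # raises ZeroDivisionError. But IEEE 754 infinity shows the
    # collapse described above:
    inf = float("inf")
    assert 1 * inf == 2 * inf == 3 * inf   # every "n/0" is the same +inf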
Numbers in general are useful concepts that help us with real-world problems, but it's really hard to pin them down. You can easily explain what "three apples" looks like, and you can show what "ten apples" looks like, and from that, you can intuit that the difference between them is the addition or removal of "seven apples". Generalizing that gives you a concept of numbers. But what is the operation that takes you from "three apples" to "three apples"? Well, obviously, it was removing zero elephants, how are you so dimwitted as to have not figured that out?!? In that sense, zero is very special, as it represents NOT doing anything, the LACK of a transition.

(Negative numbers are a lot weirder. I have a cardboard box with three cats in it, and five cats climb out and run all over the room. Which clearly means that, if two cats go back into the box, it will be empty.)

Division by zero has to be interpreted in a particular way. Calculus lets us look at nonsensical concepts like "instantaneous rate of change", which can be interpreted as the rise over the run where the run has zero length. In that sense, "dividing by zero" is really "find the limit of dividing smaller and smaller rises by their correspondingly smaller and smaller runs", and is just as meaningful as any other form of limit-based calculation (e.g. that 0.999999... is equal to 1).

Mathematicians define "equal" and "divide" and "zero" etc. in ways that are meaningful, useful, and not always intuitive. In programming, we get to do the same thing.

So what IS zero? What does it mean? *IT DEPENDS*. Sometimes it's a basis point (like midnight, or ice water). Sometimes it's a scalar (like "zero meters"). Sometimes it indicates an absence. And that's why naively and blindly using programming concepts will inevitably trip you up.

Oh, if only Python gave us a way to define our own data types with our own meanings, and then instruct the language in how to interpret them as "true" or "false".... that would solve all these problems.....

ChrisA
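P.S. For anyone who missed the sarcasm: that mechanism is spelled __bool__. A minimal sketch, with a completely made-up Temperature class (the class and its freezing-point rule are my invention, purely to show the hook):

    class Temperature:
        """A Celsius reading where 'falsy' means 'at or below freezing'."""

        def __init__(self, celsius):
            self.celsius = celsius

        def __bool__(self):
            # We, not the language, decide what False means here.
            return self.celsius > 0

    print(bool(Temperature(25)))   # True
    print(bool(Temperature(0)))    # False - zero is OUR chosen basis point
    print(bool(Temperature(-5)))   # False

Define __bool__ (or __len__ for container-like types) and if/while will believe whatever you tell them.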