# Devanagari int literals [was Re: Should non-security 2.7 bugs be fixed?]

Steven D'Aprano steve+comp.lang.python at pearwood.info
Tue Jul 21 12:39:44 CEST 2015

```
On Tuesday 21 July 2015 19:10, Marko Rauhamaa wrote:

> This is getting deep.

When things get too deep, stop digging.

> It is an embarrassing metamathematical fact that
> numbers cannot be defined. At least, mathematicians gave up trying a
> century ago.

That's not the case. It's not so much that they stopped trying (implying
failure), but that they succeeded, for some definition of success (see
below).

The contemporary standard approach comes from Zermelo-Fraenkel set theory:
define 0 as the empty set, and the successor of n as the union of n and the
set containing n:

0 = {} (the empty set)
n + 1 = n ∪ {n}

https://en.wikipedia.org/wiki/Set-theoretic_definition_of_natural_numbers

> Our ancestors defined the fingers (or digits) as "the set of numbers."
> Modern mathematicians have managed to enhance the definition
> quantitatively but not qualitatively.

So what?

This is not a problem for the use of numbers in science, engineering or
mathematics (including computer science, which may be considered a branch of
all three). There may still be a few holdouts who hope that Gödel is wrong
and that Russell's dream of defining all of mathematics in terms of logic can
be resurrected, but everyone else has moved on and doesn't consider it to be
"an embarrassment" any more than it is an embarrassment that all of
philosophy collapses in a heap when faced with solipsism.

We have no reason to expect that the natural numbers are anything less than
"absolutely fundamental and irreducible" (as the Wikipedia article above
puts it). It's remarkable that we can reduce essentially all of mathematics
to a single primitive notion: the set.

--
Steve

```
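The successor construction Steven quotes (the von Neumann ordinals) can be sketched directly in Python. This is a toy illustration, not part of the original post: it uses `frozenset` so that each number, itself a set, is hashable and can therefore be an element of its successor. The names `zero`, `successor`, and `nat` are my own.

```python
def zero():
    """0 is the empty set."""
    return frozenset()

def successor(n):
    """n + 1 = n ∪ {n}: the union of n and the set containing n."""
    return n | frozenset({n})

def nat(k):
    """Build the von Neumann set representing the natural number k."""
    n = zero()
    for _ in range(k):
        n = successor(n)
    return n

# Each number is the set of all smaller numbers, so nat(k) has
# exactly k elements, and nat(j) is a member of nat(k) whenever j < k.
```

Since each number contains exactly the smaller numbers, both membership (`nat(j) in nat(k)`) and subset inclusion (`nat(j) < nat(k)`) recover the usual ordering, which is what makes the construction useful.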