# Arithmetic sequences in Python

Christoph Zwerschke cito at online.de
Mon Jan 23 02:30:50 CET 2006

Alex Martelli wrote:
> Yep, using {} for both sets and dicts wouldn't be a good idea.  I
> suspect most core Python developers think of dicts as more fundamental
> than sets, so... (I may disagree, but I just don't care enough about
> such syntax sugar to consider even starting a debate about it on
> python-dev, particularly knowing it would fail anyway).

I'm still not convinced. At least I'd prefer {a,b,c} over any of the
other proposed solutions (http://wiki.python.org/moin/Python3%2e0Suggestions)
such as <a,b,c> or |a,b,c|.
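For comparison, without any literal syntax a set has to go through the constructor, which is what makes the notation feel heavyweight:

```python
# Without a dedicated literal, a set must be built from an iterable
# via the constructor -- note the doubled brackets:
s = set(['a', 'b', 'c'])

assert 'a' in s
assert len(s) == 3
```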

You can argue that the notation for sets is allowed to be clumsy
because they aren't used as much as lists or dicts, but you can also
argue the other way around: sets aren't used much because the notation
is clumsy (and because they didn't exist from the beginning).

For instance, if sets had a simple notation, they could be used in more
cases, e.g. replace integer masks (see again
http://wiki.python.org/moin/Python3%2e0Suggestions):

pat = re.compile("some pattern", re.I|re.S|re.X)
would become
pat = re.compile("some pattern", {re.I, re.S, re.X})
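The set form above is only the proposed syntax, of course. As a sketch of the equivalence: today the flags are OR-ed into a single integer mask, and a set of flags could be folded into exactly that mask with `reduce`:

```python
import re
from functools import reduce
from operator import or_

# Today's idiom: combine flags with bitwise OR.
pat = re.compile("some pattern", re.I | re.S | re.X)

# A set of flags carries the same information; folding it with
# reduce() produces the same integer mask.
flags = {re.I, re.S, re.X}
mask = reduce(or_, flags, 0)
pat2 = re.compile("some pattern", mask)
```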

> I don't agree that <typename>(<arguments>) is a clumsy notation, in
> general; rather, I consider "clumsy" much of the syntax sugar that is

If you really could write list(a,b,c) instead of list((a,b,c)), I
would somewhat agree.
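For reference, this is how it stands: list() takes a single iterable argument, so the elements must be wrapped in their own parentheses:

```python
# list() accepts exactly one iterable, hence the doubled parentheses:
assert list((1, 2, 3)) == [1, 2, 3]

# Passing the elements directly is a TypeError today:
caught = False
try:
    list(1, 2, 3)
except TypeError:
    caught = True
assert caught
```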

> For example, making a shallow copy of a list L
> with L[:] is what strikes me as clumsy -- list(L) is SO much better.

Certainly.
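Both spellings do the same thing, which is easy to check: each produces a new shallow copy, so the outer list is fresh but nested objects are shared.

```python
L = [1, 2, [3]]
a = L[:]       # copy via slicing
b = list(L)    # copy via the constructor

# Equal in content, but distinct objects:
assert a == b == L
assert a is not L and b is not L

# Both copies are shallow -- the nested list is shared:
assert a[2] is L[2] and b[2] is L[2]
```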

> And I vastly prefer dict(a=1,b=2) over the clumsy {'a':1, 'b':2}.

Ok, but only as long as you have decent keys...
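"Decent keys" here means keys that are valid Python identifiers: the keyword form can only spell those, while the literal (or a list of pairs) handles arbitrary keys:

```python
# The keyword form works only for identifier-like string keys:
assert dict(a=1, b=2) == {'a': 1, 'b': 2}

# Keys with spaces, or non-string keys, need the literal or pair form:
d1 = {'not an identifier': 1, 2: 'two'}
d2 = dict([('not an identifier', 1), (2, 'two')])
assert d1 == d2
```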

> I suspect I'm unusual in that being deeply familiar with some notation
> and perfectly used to it does NOT necessarily make me LIKE that
> notation, nor does it make me any less disposed to critical reappraisal
> of it -- the brains of most people do appear to equate habit with
> appreciation.  In the light of my continuous and unceasing critical
> reappraisal of Python's syntax choices, I am quite convinced that many
> of them are really brilliant -- with the "display forms" of some
> built-in types being one area where I find an unusually high density of
> non-brilliance, AKA clumsiness.  But, that's just me.

Ordinary people are lazy. If we have learned something and become
accustomed to it, we don't want to relearn; it is inconvenient. And
there is this attitude: "If I had so much trouble learning a clumsy
notation, why should future generations have it easier?"

And with programming languages, you also have the backward
compatibility problem. Having to change all your old programs makes people even more