Augmented Assignment (was: Re: PEP scepticism)

Alex Martelli aleaxit at yahoo.com
Mon Jul 2 16:25:33 EDT 2001


"Paul Prescod" <paulp at ActiveState.com> wrote in message
news:mailman.994095972.11214.python-list at python.org...
    ...
> > Depending on how you define "real world", I have some use of that
> > with my still-unreleased "in-house" (literally:-) "gmpy 1.0", where I
> > have mutable numbers.
>
> That's pretty weird but I have no objection to consenting adults doing
> weird things. I would suggest you consider lightweight proxies instead
> of mutable numbers so that you get optimization for x = x + y "for
> free".

Care to expand on this...?  Say that x is (a lightweight proxy holding)
a Python long of a few hundred digits which is counting the number
of occurrences of some event-classification in a space of events that
is pretty huge because of combinatorial-explosion.  y is another count
I have to add to x, typically a few hundred thousand or a few million
at most.  How would x and y being lightweight proxies save me any
computational effort?  Won't Python still have to allocate another chunk
of a few hundred digits to store the new value of x, since the old one
is immutable?  With GMP (exposing its innards as mutable numbers)
I save the allocation (except when x becomes "longer", which happens
rarely enough that I don't worry about it) by doing in-place addition
(GMP exposes functions with 3 arguments, one destination and two
sources, and allows the destination to have the same identity as either
source OR to be another new mpz-object).  Are you suggesting I should
implement '+' lazily (by chunking or whatever)?  I fail to see what it
would buy me -- that 'x' is going to be incremented many times.  I
am quite a fan of lightweight proxies but here I can't see any speed
benefit they could bring me.  So what am I missing...?
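(To illustrate the in-place behavior Alex is describing: the sketch below is a hypothetical stand-in, not the real gmpy API. A mutable number implements `__iadd__` to update its value in place and return `self`, so `x += y` reuses the same object rather than allocating a new multi-hundred-digit long on every increment, much as GMP's three-argument functions let destination and source share identity.)

```python
# Hypothetical sketch (NOT the real gmpy interface): a mutable counter
# whose __iadd__ updates its value in place, so "x += y" keeps the same
# object identity instead of allocating a new long each time.
class MutableCount:
    def __init__(self, value=0):
        self.value = value

    def __iadd__(self, other):
        # Update in place and return self: the identity of x is preserved.
        self.value += int(other)
        return self

x = MutableCount(10 ** 200)       # a several-hundred-digit count
before = id(x)
for y in (312345, 1999999, 750000):
    x += y                        # no new MutableCount object is built
assert id(x) == before            # same object throughout
```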


> But let me try another tack: what if we just changed the behavior of
> lists. Rather than saying "mutable objects do this, immutable objects do
> that", I would like to say "built-in, ordinary, typical objects do this,
> rare, advanced, specialized objects do that."
>
> "This" is the simplest interpretation: x += y is equivalent to x = x + y.
>
> "That" is the more sophisticated polymorphic interpretation where the
> type chooses its own behaviour.
>
> Would this be an acceptable compromise? Everyone wins.
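(For reference, this small sketch shows what Python actually does today: for mutable built-ins such as lists, `x += y` invokes `__iadd__` and mutates the object in place, while for immutable types it falls back to `x = x + y`, rebinding the name to a new object. Under Paul's "simplest interpretation", lists would behave like the int case below.)

```python
# Current Python semantics of augmented assignment.
mylist = [1, 2]
alias = mylist          # a second reference to the same list object
mylist += [3, 4]        # in-place: extends the shared object
assert alias is mylist and alias == [1, 2, 3, 4]

n = 5
m = n
n += 1                  # rebinding: m still refers to the old int
assert m == 5 and n == 6
```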

I don't understand whether you're arguing in the abstract, or concretely
proposing to break any Python program, written in the last year or so,
which decided to code "mylist += somemore" as opposed to coding
"mylist.extend(somemore)" while some other reference to mylist may
be outstanding.  If the latter, I don't see how such code breakage may
be "a compromise".  In the abstract, if you were designing a new language
from scratch and wanted to have both mutable and immutable datatypes
in it, I guess you could draw the line for a switchover in +='s behavior at
different thresholds; mutable vs immutable is very easy to draw, 'ordinary'
vs 'advanced' may not be, but maybe in such a hypothetical language
you might have other ways to easily flag ordinary vs advanced data
types so people cannot get confused.  It need not be Python, but it
might be an interesting thought-experiment, I guess.
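(A minimal illustration of the breakage Alex is pointing at, with an invented helper name: code written since augmented assignment arrived may use `+=` inside a function precisely because it mutates the caller's list, just like `.extend()` would. Redefining `target += items` as `target = target + items` would rebind only the local name, and the caller's list would silently stop being updated.)

```python
# A pattern that relies on today's in-place semantics of "+=" on lists.
def accumulate(target, items):
    # Hypothetical helper: today this mutates the caller's list,
    # exactly as target.extend(items) would.
    target += items

log = ['start']
accumulate(log, ['step1', 'step2'])
assert log == ['start', 'step1', 'step2']   # caller sees the update
```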


Alex
