why cannot assign to function call
mdw at distorted.org.uk
Sat Jan 10 00:40:07 CET 2009
rurpy at yahoo.com <rurpy at yahoo.com> wrote:
> If one accepts that there are a "lot" of people who post in here that
> clearly are surprised by Python's assignment semantics,
But one should not accept that. One might accept that there are many
who post who claim that they are surprised by Python's assignment
semantics. Such people are wrong, however, since what they are
surprised by is Python's data model, and one reason that they are
surprised by Python's data model is because it's not actually explained
very well.
> and further appear to expect assignment to have copy-like semantics,
This expectation is justified and, indeed, satisfied. Python does, most
definitely, copy on assignment. What it copies is references, however.
This might be clearer if the data model were explained better.
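A minimal sketch of the point being made: assignment does copy, and what
it copies is the reference, so two names can end up denoting one object.

```python
a = [1, 2, 3]
b = a          # copies the reference, not the list
b.append(4)    # mutating through either name affects the one object
print(a)       # [1, 2, 3, 4]
print(a is b)  # True: one list, two names

c = list(a)    # an explicit copy creates a distinct object
c.append(5)
print(a)       # [1, 2, 3, 4] -- unchanged
print(a is c)  # False
```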
As an aside, I don't notice anywhere near as much confusion in Lisp and
Scheme groups, which might be surprising since Lisp and Scheme have
precisely the same data model, argument passing convention, and
assignment semantics, as Python has. There are many possible
explanations for this:
* The Lisp and Scheme communities are smaller. This is certainly
true. But it wouldn't explain what appears to be a disproportionate
level of confusion on the topic among Python beginners.
* Individuals in the Lisp and Scheme communities are cleverer and/or
more widely experienced. One might make an argument that this is
true and a result of the relative community sizes -- basically a
result of self-selection. But instead I'll reject this as an
explanation. It's arrogant and unproven.
* The Lisp and Scheme communities make a concerted effort to explain
their data model clearly and precisely. They accept that it's
actually quite complicated and, rather than pretend that it isn't,
explain the complexity and the benefits it brings that make the
complexity worthwhile. I think this is the likely one.
> then where is that expectation coming from?
> How would anyone develop that expectation if (from a different post in
> this thread), "[Python's] idea of assignment is the same as anyone
> else's"?
Because they've fundamentally misunderstood the data model. The very
fact that their confusion is ascribed to the wrong thing is strongly
indicative of this.
> If you maintain that reference-like assignment is very common and
> something every programmer is accustomed to, then where are they
> getting the copy-like assignment expectations from?
But it's not just assignment that deals with references. It's argument
passing and storage of compound data as well. (See PLR 3.1.)
They expect that assignment copies stuff, because that's what assignment
does. Everywhere that I can think of -- except C++, which leaves
assignment semantics in the hands of the programmer. What they're confused
about is what, precisely, it is that gets copied. And that, really, is
a result of an inadequate understanding of the data model.
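To sketch how the same reference-copying shows up in all three places --
assignment, argument passing, and storage in compound data:

```python
def mutate(seq):
    # 'seq' receives a copy of the caller's reference, so mutation
    # through it is visible to the caller...
    seq.append(99)

def rebind(seq):
    # ...but rebinding the local name only overwrites the local copy
    # of the reference; the caller's list is untouched.
    seq = [0]

xs = [1, 2]
mutate(xs)
print(xs)          # [1, 2, 99]
rebind(xs)
print(xs)          # still [1, 2, 99]

pair = (xs, xs)    # compound data stores references too
xs.append(3)
print(pair[0])     # [1, 2, 99, 3] -- the tuple holds the same list
```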
> I agree that most of the time, when one is using large (memory)
> composite "objects", and one needs to pass, or access them by
> different names, one will often use references to do so in order to
> avoid expensive copies or to get desired "shared" behavior. But (with
> the exception of C arrays [*1]), doing so requires some special syntax
> in all the languages I mentioned (AFAIK).
Ummm... you mentioned C, C++, `Python, Java, REALbasic, .NET'.
Well, C we've dealt with. C++ is weird. Python we all know, and is the
main subject of the argument. REALbasic I don't know at all, but BASICs
traditionally represent data fairly directly (rather than via
references) so will largely be like C. .NET isn't a language at all:
rather, it's a virtual machine, runtime system, class library and family
of languages each of which may have idiosyncratic semantics.
Which leaves Java. Java divides the world into `primitive' and
`reference' types (JLS 4.1). The former are represented directly; the
latter have a pointer to the true data as immediate representation. But
observe that Java's primitive types are integer types (including its
misnamed `char'), floating-point types, and booleans. The Python
equivalents of all of these are immutable -- and there is no
observable difference (unless exposed by an operator like `is' or the
`id' built-in) between a directly represented object and an /immutable/
object represented by reference. (I've no doubt that the original
language designers were aware of this equivalence, so it's a mystery to
me precisely why they specified the language as they did rather than
leaving the messy business of boxing and unboxing primitive values to
the compiler, which, given Java's mandatory type annotations, should
have found the job trivial.)
> So it still seems to me that this is a likely explanation to why there
> is frequent misunderstanding of Python's assignments, and why
> responding to such misunderstandings with, "Python's assignments are
> the same as other languages'", is at best not helpful.
That's why I'm not just saying that assignment is the same. I'm also
saying that the data model is most definitely not the same as C.
> I have often wished that C handled arrays the same way it does
> structs. I am sure that the pointer-array pseudo-equivalence seemed
> like a very clever idea at the time but I wonder if Dennis Ritchie ever
> had second thoughts about it.
Probably. But I think the idea was actually inherited from BCPL, via B.
In BCPL, memory is divided into words. Depending on which operator you
use, you can treat a particular word as an integer, a floating-point
number, or a pointer. An array in BCPL is represented as a pointer to
its first element -- always -- and you index it by an expression of the
form p!i (a notation inherited by BBC BASIC and Haskell of all things).
The array->pointer decay is a compromise position between the BCPL
notion of array-as-pointer and the desire to allocate such things with
automatic storage duration and have sizeof and so on work properly.