Pass by reference ?
Robert W. Cunningham
rcunning at acm.org
Wed Apr 5 20:21:36 CEST 2000
Jacek Generowicz wrote:
> Michael Hudson wrote:
> > The code you posted only confuses if you have a flawed understanding of
> > assignment in Python
> That is exactly the point. My original confusion arose precisely because I had
> made an incorrect assumption about assignment in Python (an assumption which, I
> suspect, many newcomers to the language will make until they learn the truth).
> Correcting this error is far more constructive than discussing what name should be
> given to the semantics of argument passing.
IMHO, they are, ultimately, one and the same. It is useful to be able to discuss
Python assignment and passing semantics and syntax with those who know nothing about,
or have no interest in, Python itself, or are so new to the language that the only
common frame of reference is either academic (formal, theoretical) or via analogies
to other languages (informal, practical). The abstractions are important, and thus,
so is the nomenclature (if the abstractions are to be understood and shared). The
notion of "side effects" in general, as related to passing and assignment, is a
fairly high-level issue with very low-level implications.
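To make the "side effects" point concrete, here's a minimal sketch (the variable names are mine, not from the thread): Python assignment binds a name to an object, it never copies the object, so mutation through one name is visible through every other name bound to the same object, while rebinding a name affects only that name.

```python
# Assignment binds a name to an object; it does not copy the object.
a = [1, 2, 3]
b = a          # b is now bound to the SAME list object as a
b.append(4)    # mutating through b is a "side effect" visible through a
print(a)       # [1, 2, 3, 4]
print(a is b)  # True: one object, two names

a = [9]        # rebinding a points it at a NEW object; b is unaffected
print(b)       # [1, 2, 3, 4]
```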
So clearing up the terminology gets us all on the same page, so to speak, and thus
allows us to discuss Python specifics in the context of theory AND in comparison to
other languages.
In particular, living under "an assumption which, I suspect, many newcomers to the
language will make until they learn the truth" is EXACTLY the attitude I wish to
avoid. If it is true, it should not require learning the entire language to
UNDERSTAND it. It does, however, require learning the language to USE it. It seems
obvious to me that allowing understanding to PRECEDE use is not a bad way to go,
since it allows you to build on existing knowledge (all that theory from college),
rather than be forced to start from zero.
The CS models of "Pass By Reference" and "Pass By Value" are fairly well understood,
but they do not seem to map simply and directly to Python. Rather than say "Give up
on theory, just learn Python", I'd rather EXTEND (or clarify) the theory to account
for Python! Then, I'm certain, I will learn Python better and faster.
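The mismatch can be shown in a few lines (a sketch of my own, with made-up function names): in-place mutation of an argument behaves the way pass-by-reference predicts, while rebinding the parameter name behaves the way pass-by-value (of a reference) predicts, so neither classical model covers Python on its own.

```python
def mutate(lst):
    # The parameter is bound to the caller's list object, so
    # in-place mutation is visible to the caller (looks like PBR).
    lst.append(99)

def rebind(lst):
    # Rebinding the local name has no effect on the caller's
    # binding (looks like PBV of a reference).
    lst = [0]

items = [1, 2]
mutate(items)
print(items)   # [1, 2, 99]
rebind(items)
print(items)   # still [1, 2, 99]
```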
It is the ability to make generalizations that distinguishes those who learn quickly
("I get the general idea, now lets focus on the details") from those who can't ("Just
give me enough examples and a good interpreter and debugger, and I'll get it
eventually"). The abstractions matter, especially when they can be mapped to
accepted theory. If for no other reason, theory is important simply because it
allows us to form generalizations that permit previously learned knowledge to be
employed (remapped) in the effort to learn a new language.
There is not the "World Of All Other Computing Languages" and "Python" off by
itself. It is a continuum that has a large and powerful body of theory and practice
behind it. It makes sense to show how Python is part of that continuum before just
saying something self-limiting, like "Python is Different - Get used to it". Or that
newcomers should wait "until they learn the truth".
Those sentiments may apply to Perl <g>, but certainly not to Python!