anything like C++ references?

Stephen Horne intentionally at blank.co.uk
Tue Jul 15 16:56:49 EDT 2003


On 15 Jul 2003 10:19:18 GMT, kamikaze at kuoi.asui.uidaho.edu (Mark
'Kamikaze' Hughes) wrote:

>> Really. My computer science lessons were taught in a way that
>> respected the source of most of the theory in mathematics -
>> algorithms, Church's lambda calculus and that kind of stuff.
>> What did your lessons teach?
>
>  How computer programming actually works, how to make interpreted and
>compiled languages, how to use them to solve problems, how to analyze
>algorithms, etc....  The usual.

You say that as if it contradicts the idea of respecting the source of
the theory. It doesn't.

>  Among those were classes in languages
>other than C, which is very broadening, and I highly recommend that to
>you.

Let's see.

Probably about a dozen BASICs, five assembler languages (6502, Z80,
68000 series, 8086 series, 80C196KC microcontroller), Forth, Lisp,
Prolog, Miranda, Haskell, Icon, Ada, Pascal, Modula-2, COBOL,
Smalltalk. Not all of them very seriously, though, and in many cases
not recently.

I forget the name of that language designed for transputers, but I
used that a bit at college too - and depending on your inclination you
may want to include SQL. And yes, I have used C, C++ and Java as well.

That's just off the top of my head, of course.

But there's no way you can call me C-centric. That has been a repeated
accusation, but it is WRONG.

Python has been an everyday language for me LONGER than C++ has.

>> Respect the idea of variables binding to values and suddenly the need
>> for pointers becomes more obvious. You cannot abuse mutable objects to
>> fake pointer functionality (another everyday fact of Python
>> programming) if the binding of variables to values (rather than just
>> objects) is respected.
>
>  If you're trying to say that the trick I showed you of modifying a
>list to reproduce the effect of reference arguments is not possible in,
>say, C, you're wrong.  Trivially proven wrong, even, since Python is
>implemented in C.  In fact, the only languages where that won't work are
>those which don't allow passing complex arguments at all; some
>pure-functional toy languages, perhaps.

Distorting what I said, again. I have already repeatedly said that the
point about C is what happens by default - not what happens when you
explicitly request otherwise.
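
For anyone coming to this late, the list trick being referred to is
roughly this (the function and variable names are just illustrative):

  def increment(box):
      # 'box' is a one-element list used as a fake output parameter.
      # Mutating it in place is visible to the caller, because caller
      # and callee are bound to the same list object.
      box[0] = box[0] + 1

  counter = [0]
  increment(counter)
  # counter is now [1] - the caller sees the change

It works by mutating an object that both bindings happen to share,
which is exactly the kind of default behaviour I'm talking about.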

Let's take an example from C++ to illustrate the point.

  #include <string>

  int main()
  {
      std::string x = "123";
      std::string y = x;

      //  At this point, in most implementations, x and y are bound to
      //  the same underlying object due to a lazy copy optimisation.
      //  However, the programmer didn't request this optimisation, so
      //  this implementation detail is expected to be transparent to
      //  the user. Therefore...

      y[2] = '4';

      //  Before doing this operation, a copy was made - the so-called
      //  copy-on-write.
      //  The value bound to x is still "123"
      //  The value bound to y is now "124"
  }

C++ is using references here just as Python does, but despite being a
crappy low-level language it still manages to implement that without
causing remote side-effects through mutability *unless* you explicitly
request them.
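
For contrast, here is roughly the same thing in Python - using a list,
since Python strings are immutable and a direct string version isn't
possible. Just a quick sketch of the default behaviour I keep
referring to:

  x = [1, 2, 3]
  y = x          # y is bound to the same list object as x

  y[2] = 4       # in-place mutation - no copy is made anywhere

  # x is now [1, 2, 4] as well: the change made through y is visible
  # through x, because both names still refer to the one object.

  # Rebinding y, on the other hand, leaves x alone:
  y = [1, 2, 5]
  # x is still [1, 2, 4]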

>  If you're seriously demanding that all languages adhere to a religious
>observance of object and variable purity that exists in no serious
>programming language, you might want to stop and consider *why* nobody
>"respects" that.

Inertia? Or maybe no-one has really used many languages other than
Perl and Python, or maybe Lisp on occasion.

Oh dear - am I jumping to unfair conclusions about people's levels of
experience? I wonder where I got that idea from.

>If you want Python to change to obey
>your ideals, well, you're unlikely to effect that change;

I said long ago that this was a philosophical point. That doesn't mean
I'm going to sit down and take it when everyone says I'm wrong -
except when I am, of course. To date, I am the only person
participating in this thread to admit any error or to apologise for
anything.

What does that tell you?

I said an age ago that I wasn't expecting a change. I stated that
while Python is not perfect, it is damn close - and that it is my
first-choice language whenever I have a choice. And I also said that
if the ideas I've been discussing were implemented, Python would be so
different as to no longer be Python.

I don't need to believe that any language is absolutely perfect, you
see. I can discuss a possible flaw in even my favourite language
without feeling threatened or insecure. I just don't like it when
people contradict me, unless I recognise that contradiction as valid.
Sometimes I'm boneheaded about that - but when I realise my mistake I
admit it and apologise.

Many of the arguments used against me in this thread have directly
contradicted each other. In such cases, both people cannot be right.

So show me *one* admission of error by anyone other than me.

>Perhaps you should write your own language, if no others are
>acceptable to you.

Unlikely to be serious, but I've been thinking about it for a while. A
decent copy-on-write system in a scripting language may just be a
handy differentiator.
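
Very roughly - and this is a toy sketch rather than a design, with the
class name and the clone() method invented purely for illustration - I
mean a wrapper that shares a value cheaply and only copies it the
first time a still-shared value gets mutated:

  import copy

  class COWList:
      # Toy copy-on-write wrapper around a list - illustrative only.
      # clone() is cheap; the real copy only happens on the first
      # mutation of data that is still shared.

      def __init__(self, data=None, shared=None):
          if data is None:
              data = []
          if shared is None:
              # share count, kept in a one-element list so that every
              # clone of the same data sees the same counter
              shared = [1]
          self._data = data
          self._shared = shared

      def clone(self):
          self._shared[0] = self._shared[0] + 1
          return COWList(self._data, self._shared)

      def _unshare(self):
          if self._shared[0] > 1:
              self._shared[0] = self._shared[0] - 1
              self._data = copy.deepcopy(self._data)
              self._shared = [1]

      def __getitem__(self, index):
          return self._data[index]

      def __setitem__(self, index, value):
          self._unshare()          # copy before the first write
          self._data[index] = value

  x = COWList([1, 2, 3])
  y = x.clone()    # no copying yet - x and y share the one list
  y[2] = 4         # the copy happens here, and only for y
  # x[2] is still 3, y[2] is now 4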

FYI - I have written programming languages before. One is currently in
use in a commercial product, though I still regard it as a toy. I have
also written several parser generators of my own: one in Python using
(mostly) recursive descent, one using LR(1), and - because Python is a
tad slow for LR(1) on a decent-sized grammar (depending on how patient
you are, of course) - an LR(1) generator written in C++.

Creating a new language is not beyond my ability, but this is beside
the point. I never said Python is crap or anything like that. That was
read into my words. I was discussing a weakness. There is no such
thing as a perfect language. Creating a new language as anything more
than an experimental exercise would be a massive waste of time and
effort - and I'd have a lot of man-hours of catching up to do.




