tough-to-explain Python

Hendrik van Rooyen mail at
Sat Jul 11 14:01:25 CEST 2009

"Steven D'Aprano" <steve at> wrote:

>On Fri, 10 Jul 2009 12:54:21 +0200, Hendrik van Rooyen wrote:
>> "Steven D'Aprano" <steve at> wrote:
>>>On Wed, 08 Jul 2009 22:05:57 -0700, Simon Forman wrote:
>>>>> persistent idea "out there" that programming is a very accessible
>>>>> skill, like cooking or gardening, anyone can do it, and even profit
>>>>> from it, monetarily or otherwise, etc., and to some extent I am
>>>> Programming is not like any other human activity.
>>>In practice? In principle? Programming in principle is not the same as
>>>it is performed in practice.
>>>But in either case, programming requires both the logical reasoning of
>>>mathematics and the creativity of the arts. Funnily enough,
>> I do not buy this arty creativity stuff. - or are you talking about
>> making a website look pretty?
>I must admit, it never crossed my mind that anyone here would claim that 
>there was no creativity involved in programming, that it was all a 
>mindless, algorithmic process capable of being done by a simple 
>mechanical device.

"Programming" is the step of going from the "design" to something
that tells the machine how to implement the design.

The creativity could, arguably, be in the "Design".
Not in the translation to python, or assembler.
No way.  That is just coding.

>This is certainly the accusation made against *bad* programmers -- that 
>they can't actually solve new, unique problems, but just apply recipes 
>they learned without any insight or intelligence. The sort of people who 
>program so poorly that "a trained monkey could do what they do".
>Do you really think that applies to good programmers too? If so, then a 
>good code generator should be able to replace any programmer. Is that 
>what you believe?

Should eventually be possible, with sufficient restrictions to start off.
UML wants to go this route...

But may my eyes be stopped and my bones be heaped with dust
ere I see the day...

>>>mathematicians will tell you that mathematics requires the same, and so
>>>will the best artists. I think mathematicians, engineers, artists, even
>>>great chefs, will pour scorn on your claim that programming is not like
>>>any other human activity.
>> So a chef is now an authority on programming?
>Did I say that?

No.  I just read it like that to irritate you.

>Chefs are authorities on OTHER HUMAN ACTIVITIES.
>> Programming is actually kind of different - almost everything else is
>> just done, at the time that you do it.
>> Programming is creating stuff that is completely useless until it is fed
>> into something that uses it, to do something else, in conjunction with
>> the thing it is fed into, at a later time.
>Somebody should teach Hendrik that human beings have been creating TOOLS 
>for hundreds of thousands of years. People have been creating tools to 
>build tools for thousands of years. Software is just more of the same.

I disagree - I actually own some machine tools, so I am a little
bit acquainted with what they can do, and it is not at all like computing
at any level I can think of - they are merely extensions of the hand, making
the transformation of materials more accurate and faster.

The line only becomes blurred when a processor is added, and a STORED
PROGRAM is brought into the equation.

>Even *soup stock* fits the same profile as what Hendrik claims is almost 
>unique to programming. On its own, soup stock is totally useless. But you 
>make it, now, so you can you feed it into something else later on.
>Or instant coffee.
>No, Hendrik, if that's the best you can do, it's not very good. It is 
>rather sad, but also hilarious, that the most different thing you have 
>noticed about software is that it's just like instant coffee.

You have a wonderful ability to grab hold of part of a definition
and to ignore the rest, just like I can misread what you write.

Coffee and soup stay coffee and soup on rehydration.  Mixing it
in with something else is not at all the same - it does not "DO" anything
else in conjunction with the thing it is fed into - how is that like 
programming, and executing a program?

I am sorry if you are confusing the drinking of coffee,
which is an ancillary activity to programming, with the
actual programming itself.

>> This is a highly significant difference, IMHO.
>>>> He talks about how "when all is said and done, the only thing
>>>> computers can do for us is to manipulate symbols and produce results
>>>> of such manipulations" and he emphasises the "uninterpreted" nature of
>>>> mechanical symbol manipulation, i.e. that the machine is doing it
>>>> mindlessly.
>>>"Manipulate symbols" is so abstract as to be pointless. By that
>>>reasoning, I can build a "computer" consisting of a box open at the top.
>>>I represent a symbol by an object (say, a helium-filled balloon, or a
>>>stone), instead of a pattern of bits. I manipulate the symbol by holding
>>>the object over the box and letting go. If it flies up into the sky,
>>>that represents the symbol "Love is War", if it falls into the box, it
>>>represents the symbol "Strength is Blue", and if it just floats there,
>>>it represents "Cheddar Cheese". This is a deterministic, analog computer
>>>which manipulates symbols. Great.
>>>And utterly, utterly useless. So what is my computer lacking that real
>>>computers have? When you have answered that question, you'll see why
>>>Dijkstra's claim is under-specified.
>> So if computers do not manipulate symbols, what is it that they do?
>Did I say they don't manipulate symbols?

No, you wanted someone to tell you how to make your
box machine useful, and that was too difficult, so I went this way.

>> They
>> sure cannot think,
>> or drink,
>> or reason,
>They can't reason? Then what are they doing when they manipulate symbols?

They simply manipulate symbols.
They really do.
There is no reasoning involved.

Any reasoning that is done was done in the mind of the human who designed
the system; the machine simply manipulates the implementation of
the abstract symbols, following the implementation of the abstract rules that
were laid down.  No reasoning at run time at all.

Lots of decision making though - and it is this ability to do this now, and
something else later, that tricks the casual observer into thinking that there
is somebody at home - a reasoning ghost in the machine.
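The point can be made concrete with a sketch (the rule names and actions here are my own illustration, not anything from the thread): a run-time "decision" is just a lookup in a table the designer laid down in advance.

```python
# A "decision" at run time is nothing but a table lookup laid down in
# advance by the designer.  The machine follows the table mindlessly;
# any reasoning happened earlier, in the designer's head.

RULES = {
    ("hot", "dry"): "water the garden",
    ("hot", "wet"): "do nothing",
    ("cold", "dry"): "do nothing",
    ("cold", "wet"): "close the windows",
}

def decide(temperature, soil):
    """Pick an action by pure symbol lookup - no reasoning at run time."""
    return RULES[(temperature, soil)]

print(decide("hot", "dry"))   # the "ghost in the machine" is just this table
```

The program does "this now, and something else later", yet nothing in it reasons; it only maps symbols to symbols.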

>Yet again, it didn't even cross my mind that somebody would make this 
>claim. My entire point is that it's not enough to just "manipulate 
>symbols", you have to manipulate symbols the correct way, following laws 
>of logic, so that the computer can *mindlessly* reason.


There is no requirement for dragging in things like "the laws of logic"
or any arbitrarily chosen subset.  You can build a machine to do
essentially "anything" - and people like Turing and his ilk have
spent a lot of skull sweat to try to define what is "general purpose",
to enable you to do "anything".
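What that "general purpose" machinery amounts to can be shown in a few lines: a rule table mindlessly applied to a tape of symbols. The particular table below, a unary incrementer, is my own toy example:

```python
# A minimal Turing-style machine: a table of (state, symbol) rules
# applied mindlessly to a tape.  This table appends one '1' to a
# unary number - a trivial example, but the mechanism is the general one.

def run(tape, rules, state="scan", blank="_"):
    tape = list(tape)
    pos = 0
    while state != "halt":
        if pos >= len(tape):
            tape.append(blank)          # extend the tape on demand
        symbol = tape[pos]
        write, move, state = rules[(state, symbol)]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(tape).rstrip(blank)

# (state, symbol) -> (write, move, next_state)
INCREMENT = {
    ("scan", "1"): ("1", "R", "scan"),   # skip over existing 1s
    ("scan", "_"): ("1", "R", "halt"),   # write one more 1, then stop
}

print(run("111", INCREMENT))  # 111 -> 1111
```

Swap in a different rule table and the same mindless loop does something entirely different; that interchangeability is the "anything".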

>> or almost any verb you can think of.
>If you're going to take that argument, then I'll remind you that there 
>are no symbols inside a computer. There are only bits. And in fact, there 
>aren't even any bits -- there are only analog voltages, and analog 
>magnetic fields.

Duh - sorry to hear that - I have only been doing digital electronics
now for more years than I care to remember - but maybe there
is a difference in how you define "analogue" - maybe something like:
"if I can measure it with a multimeter, it is an analogue signal, because
a multimeter is capable of measuring analogue signals" - a bit fallacious.

More seriously,  I again differ on the symbol bit - I am, right now, 
working on a little processor that mindlessly does I/O, by manipulating 
bits from one representation to another.  In this case, the symbols
and the bits are mostly identical - but please do not try to tell
me that there are no symbols inside the little processor's 
memory - I know they are there, because I have arranged for 
them to be put there, via an RS-232 feed.

And my stored program actually arranges for the incoming symbols
to do things like activating output relays, and changes on input lines
actually cause other symbols to be transmitted out of the RS-232
port.  The whole thing is simply rotten with symbology - in fact all
it does, is manipulating the implementation of the abstract symbols 
in my mind.
It does so mindlessly.
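Stripped of the hardware, the little processor's job is two lookup tables: incoming symbols map to relay actions, and input-line changes map back to outgoing symbols. The symbol names and the mapping below are my own illustration (the serial port is stubbed out), not Hendrik's actual protocol:

```python
# The I/O processor in miniature: received symbols (bytes) select relay
# actions, and input-line changes select symbols to transmit back.
# Pure symbol manipulation, end to end - nothing here "reasons".

RELAY_COMMANDS = {
    b"A": ("relay1", "on"),
    b"a": ("relay1", "off"),
    b"B": ("relay2", "on"),
    b"b": ("relay2", "off"),
}

INPUT_SYMBOLS = {
    ("input1", "closed"): b"X",
    ("input1", "open"): b"x",
}

def handle_incoming(byte):
    """Translate one received symbol into a relay action (None if unknown)."""
    return RELAY_COMMANDS.get(byte)

def handle_input_change(line, new_state):
    """Translate an input-line change into a symbol to transmit."""
    return INPUT_SYMBOLS.get((line, new_state))

print(handle_incoming(b"A"))                  # -> ('relay1', 'on')
print(handle_input_change("input1", "open"))  # -> b'x'
```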
This specific thing is so simple, that it hardly uses the
"logical ability" of the processor.  And please remember that
for a function with two input bits and a single output bit, there are
only sixteen possible functions, not all of which are useful, for
some definition of useful.  But they exist, and you can use them, if 
you want. And you do not even have to be consistent, if you do not
want to be, or if it is convenient not to be.
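That count is easy to enumerate: two input bits give four input rows, and each assignment of outputs to those rows is a distinct function, so there are 2**4 = 16 of them:

```python
from itertools import product

# Every function from two input bits to one output bit is a truth table:
# four input rows, each mapped to 0 or 1, giving 2**4 = 16 functions
# (AND, OR, XOR, NAND, the two constants, and the rest).

inputs = list(product([0, 1], repeat=2))            # (0,0) (0,1) (1,0) (1,1)
functions = list(product([0, 1], repeat=len(inputs)))

print(len(functions))  # 16

# e.g. AND is the table whose only 1 is on input (1, 1)
AND = dict(zip(inputs, (0, 0, 0, 1)))
print(AND[(1, 1)], AND[(1, 0)])  # 1 0
```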

I think it is this horrendous freedom that Dijkstra was talking about
when he was referring to the "power beyond your wildest dreams",
or some such.

>> "Manipulating symbols" is actually an elegant definition. Try coming up
>> with a better one and you will see.
>As I said above, it's not enough to just manipulate symbols. Here's a set 
>of rules to manipulate symbols:
>X => 0
>or in English, "Any symbol becomes the zero symbol".
>That's symbol manipulation. Utterly useless. This is why it's not enough 
>to just manipulate symbols, you have to manipulate them THE RIGHT WAY.

There is no RIGHT WAY.
There are only ways that are useful, or convenient.

In my little processor, I can cause any result
to occur, and the rules can be as silly as your
example - if it is useful for you, under some
circumstances, for all symbols to become the
symbol for zero, then go for it - the point is that 
there is nothing prohibiting it from being adopted as a
rule for a machine to implement.  
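Rendered as code (my own trivial rendering), Steven's rule is a one-liner, and a machine will follow it as happily as any other:

```python
# The rule "X => 0": every symbol in, the zero symbol out.
# Silly, but perfectly implementable - usefulness is a separate question.

def rewrite_to_zero(symbol):
    return "0"

print([rewrite_to_zero(s) for s in ["love", "war", "cheddar"]])
# -> ['0', '0', '0']
```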

/dev/null anyone?

- Hendrik
