Python is readable

Chris Angelico rosuav at gmail.com
Mon Mar 19 05:33:51 CET 2012


On Mon, Mar 19, 2012 at 12:23 PM, Steven D'Aprano
<steve+comp.lang.python at pearwood.info> wrote:
> On Mon, 19 Mar 2012 09:02:06 +1100, Chris Angelico wrote:
>
>> On Mon, Mar 19, 2012 at 8:30 AM, John Ladasky <ladasky at my-deja.com>
>> wrote:
>>> What I would say is that, when PROGRAMMERS look at Python code for the
>>> first time, they will understand what it does more readily than they
>>> would understand other unfamiliar programming languages.  That has
>>> value.
>>
>> This is something that's never truly defined.
>
> I'm sorry, I don't understand what part of John's sentence you mean by
> "this". "Programmers"? "Value"? "Understand"?

I should have rewritten that into the next paragraph. Anyhow. Further
explanation below.

>> Everyone talks of how this
>> language or that language is readable, but if you mean that you can look
>> at a line of code and know what *that line* does then Python suffers
>> badly and assembly language wins out;
>
> This is at least the second time you've alleged that assembly language is
> more readable than Python. I think you're a raving nutter, no offence
> Chris :-)

None taken; guilty as charged. And unashamedly so. With that dealt
with, though: my calling assembly "readable" is a reductio ad
absurdum - by the given definition of readability, assembly is
readable, ergo the definition sucks.
(That's a term all logicians use, you know. Proper and formal jargon.)

> Assignment (name binding) is close to the absolute simplest thing you can
> do in a programming language. In Python, the syntax is intuitive to
> anyone who has gone through high school, or possibly even primary school,
> and been introduced to the equals sign.
>
> x = 1234
> y = "Hello"

Not quite. In mathematics, "x = 1234" is either a declaration of fact
or a statement that can be true or false, and "x = x + 1" is absurd
and/or simply false. That's why Pascal has its := operator, which is
supposed to be read as "becomes" rather than "equals". IMHO this is
simply proof of one of the differences between programming and
mathematics.
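(A quick sketch to illustrate the point: in Python, "=" is name
binding, not a mathematical equation, so the "absurd" statement is
perfectly legal:)

```python
# "=" binds a name to a value; it asserts nothing.
x = 5

# In mathematics this would be false; in Python it just rebinds x.
x = x + 1

print(x)  # 6
```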

> I don't know about anyone else, but I wouldn't have guessed that the way
> to get x=1234 was with "x dw 1234".

Except that it's not quite the same thing. That 8086 assembly
statement is more like the C declaration:

    int x = 1234;

The nearest equivalent of assignment is:

    mov x, 1234

although that differs slightly from assembler to assembler (NASM, if I
recall correctly, calls for "mov [x],1234"). You're right, though,
that it's nothing like as clear.

> What's more, with Python, one example hints on how to do other examples.
> Having seen x=1234, most people would guess that x=1.234 would also work.

Yes, which brings up the old favorite arguments about whether
computers work with integers, floats, rationals, Real Numbers, or
Numbers. But, that aside...

> I'm pretty sure that "x dw 1.234" will do something surprising, although
> I don't know what, but it certainly won't be to assign the float 1.234 to
> a variable x.

Perhaps not usefully, but "x dd 1.234" ought to work. You just can't
fit a float into a dw. (NASM does support "half precision", but most
operations want a doubleword for a float.) Assembly language requires
variable sizes to be declared.

Of course, what it *really* does is declare a doubleword-sized patch
of memory, initializes it to the IEEE representation of the nearest
possible float to the real number 1.234, and assigns the label 'x' to
point to the lowest memory address used by that doubleword... but
mostly you don't need to care about that.
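(You can watch that "nearest possible float" behaviour from Python
itself with the struct module - a sketch, using the 'f' format, which
is the same IEEE 754 single-precision layout a dd would hold:)

```python
import struct

# Pack 1.234 as an IEEE 754 single-precision float - 4 bytes,
# the same size as a doubleword.
raw = struct.pack('<f', 1.234)
print(len(raw))  # 4

# Round-tripping shows the stored value is only the *nearest*
# representable float, not 1.234 exactly.
(stored,) = struct.unpack('<f', raw)
print(stored == 1.234)  # False
```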

> So there we have another measure of "readability" -- discoverability.
> Having seen one example, how easy is it to predict slightly modified
> examples?
>
> In assembly's case, not very predictable. What's the difference between
> these two lines?
>
> a db 127
> b db 327
>
> For that matter, what will the second line actually do?

I'm not 100% sure, but since assembly is much stricter than most HLLs
about data sizes, it's a concern that you don't have with Python's
long ints. But the same thing can come up in an HLL - struct.pack("i")
can't handle an arbitrarily large int, because, like an assembler,
it's concerned about exact byte sizes.
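(In fact struct's "b" format is exactly a signed byte, so it hits the
same wall as "b db 327" - a sketch:)

```python
import struct

# "b" is one signed byte, range -128..127, just like db.
print(struct.pack('<b', 127))  # b'\x7f' - fits

# 327 doesn't fit in a byte; unlike an assembler, which might
# silently truncate, struct refuses outright.
try:
    struct.pack('<b', 327)
except struct.error as e:
    print('struct.error:', e)
```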

> Assembly has a very steep learning curve. Python has a shallow curve.

Of course. But computer programming generally has a fairly stiff
learning curve; you have to get your head around all sorts of concepts
that simply do not exist anywhere else.

> Here's another indication of readability. There have been occasional
> suggestions on Wikipedia that they standardize on Python as "executable
> pseudo-code" for code samples. Now replace "Python" with assembly
> language. Unless you're Donald Knuth, the idea is ridiculous.

I am not, and it is. But I could make you a set of NASM macros that
allow you to write assembly code that looks like pseudo-code. Does
that make assembly code readable? Maybe. Does it make it worth using
as a HLL? No.

> If all you have is a hammer, the instructions you get are easy to
> understand because there's not much you can do with it. How complicated
> can "hit the nail with the hammer" get? But the right measure is not the
> simplicity of the tasks you *can* do, but the comprehensibility of the
> tasks you *want* to do.

Right, which is why NO language can be described as "readable" or
"easy to work with" unless you define a problem domain. SQL is an
excellent language... as long as you want to query a relational
database.

> The measure of the readability of a language should not be obfuscated or
> badly written code, but good, idiomatic code written by an experienced
> practitioner in the art. Note that there is a deliberate asymmetry there:
> when judging "readability" (comprehensibility), we take idiomatic code
> written by somebody who knows the language well, and give it to a novice
> to interpret it.

Sure. You can write bad code in any language, and that doesn't mean
anything. But still, if you're writing for a novice, you write quite
different code from what you'd write normally. Language tutorials are
written by experts for novices; language tutorials do not look like
the language's own standard library (assuming it has one written in
itself).

>> Really, the metric MUST be Python programmers. Intuitiveness is of
>> value, but readability among experienced programmers is far more useful.
>
> But by that metric, Brainf*** is readable, since an experienced expert in
> the art of writing BF code will be able to recognise BF idioms and
> interpret them correctly.

Not really. The BF code to do one simple high-level operation could
easily span several pages. You can't "recognize" those. Now, BF with
macros/subroutines might be more plausible - if you could define a new
opcode that represents some huge slab of BF code and use that - but in
its native form, I don't think anyone could recognize what BF is
doing.

> No, I'm sorry, I disagree that the standard of readability should be the
> experienced programmer. By that standard, "readability" is a no-op. All
> languages are, more or less, equally readable, to those who can read them
> well. I don't think it is useful to judge the readability of Forth on the
> ability of Charles Moore to read it. How does that help me decide whether
> to use Forth for my next project?
>
>> If I write a whole lot of code today, and next year I'm dead and someone
>> else has replaced me, I frankly don't mind if he has to learn the
>> language before he can grok my code. I _do_ mind if, even after he's
>> learned the language, he can't figure out what my code's doing;
>
> Then he hasn't learned the language *sufficiently well*. Either that, or
> your code is obfuscated, unidiomatic crap. Perhaps you're trying to write
> BF in Python :)

And that's where the nub of the question is. How well is sufficiently
well? Clearly you do not require your code to be comprehensible to a
non-programmer, or you would not write code at all. And if the reader
hasn't learned even the basic keywords of the language, it's equally
impossible to expect them to comprehend your code. If you're writing
production code, I see no reason to avoid language features like
Python's list comps, Pike's %{ %} sprintf codes, or C's pointer
arithmetic just because they can confuse people. Learn the language,
THEN start hacking on the code.
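(A list comp is a good test case: opaque to someone who has never
seen one, but idiomatic and instantly readable once learned. A
minimal illustration:)

```python
# Squares of the even numbers 0..9, written idiomatically...
squares = [n * n for n in range(10) if n % 2 == 0]

# ...and the same thing spelled out for a reader new to the idiom.
spelled_out = []
for n in range(10):
    if n % 2 == 0:
        spelled_out.append(n * n)

print(squares)                  # [0, 4, 16, 36, 64]
print(squares == spelled_out)   # True
```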

ChrisA

