[Python-Dev] The new and improved PEP 572, same great taste with 75% less complexity!

Larry Hastings larry at hastings.org
Thu Apr 26 20:26:55 EDT 2018



On 04/26/2018 12:12 PM, Tim Peters wrote:
> [Larry Hastings <larry at hastings.org>]
>> I hate to be pedantic--there's enough of that going on in this thread--but I
>> can't agree with the word "simplified" above.  I agree that the code using
>> binding expressions is shorter.  But considering that the two code
>> examples implement the exact same algorithm, to the point where their
>> bytecode would look nearly* identical, ISTM that the two code examples are
>> of identical complexity.
> In the absence of defining an objectively computable complexity
> measure,  I expect you're doomed to arguing taste.

As are you!  I haven't seen any arguments that binding expressions allow 
us to express programs that were inexpressible in Python before.  I'm 
not even sure that binding expressions fall under the heading of 
"syntactic sugar", given their negligible semantics (and, imo, 
negligible benefit).  What else is left, on /both/ sides of the debate, 
if not a debate over aesthetics?
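(For anyone reading this without the rest of the thread: the two spellings
being compared are, roughly, the reduction dispatch from copy.py with and
without binding expressions.  The sketch below is a reconstruction, not
verbatim copy.py--"Error" and "dispatch_table" stand in for copy.Error and
copy.dispatch_table, and the second version needs Python 3.8+:)

```python
dispatch_table = {}  # stands in for copy.dispatch_table


class Error(Exception):
    # stands in for copy.Error
    pass


def reduce_without_binding_expressions(x):
    # The existing spelling: each fallback adds an indentation level.
    cls = type(x)
    reductor = dispatch_table.get(cls)
    if reductor:
        rv = reductor(x)
    else:
        reductor = getattr(x, "__reduce_ex__", None)
        if reductor:
            rv = reductor(4)
        else:
            reductor = getattr(x, "__reduce__", None)
            if reductor:
                rv = reductor()
            else:
                raise Error("un(shallow)copyable object of type %s" % cls)
    return rv


def reduce_with_binding_expressions(x):
    # The PEP 572 spelling: the same algorithm, flattened into one chain.
    cls = type(x)
    if reductor := dispatch_table.get(cls):
        rv = reductor(x)
    elif reductor := getattr(x, "__reduce_ex__", None):
        rv = reductor(4)
    elif reductor := getattr(x, "__reduce__", None):
        rv = reductor()
    else:
        raise Error("un(shallow)copyable object of type %s" % cls)
    return rv
```

Both functions compute the same thing; the whole argument is over which
spelling reads better.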


>    For example, argue
> that both spellings have the same formal "cyclomatic complexity"
> measure (which they do).  By other formal measures (e.g., total number
> of identifier instances), the latter spelling is "objectively
> simpler".  By yet others (e.g., total number of non-whitespace
> characters divided by total number of lines), the former spelling is
> "objectively simpler".

What is this "objective simplicity" measurement you cite?  I understand 
that the code example cited had fewer identifiers, so when measuring 
"number of identifiers used" in isolation, the code example using 
binding expressions had fewer of them.  But this is so narrow as to be 
almost meaningless.
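For what it's worth, that one narrow measurement--identifier instances--is
at least mechanically countable.  A quick sketch using the ast module (the
snippets are illustrative one-liners in the shape of the copy.py code, not
the real thing):

```python
import ast


def identifier_count(source):
    # Each ast.Name node is one instance of an identifier being loaded
    # or stored; attribute names and keywords don't count.
    return sum(isinstance(node, ast.Name)
               for node in ast.walk(ast.parse(source)))


without_be = (
    "reductor = dispatch_table.get(cls)\n"
    "if reductor:\n"
    "    rv = reductor(x)\n"
)
with_be = (
    "if reductor := dispatch_table.get(cls):\n"
    "    rv = reductor(x)\n"
)
```

By this count the binding-expression spelling does come out one identifier
shorter--which is the entirety of what that particular measurement shows.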

Perhaps I'm misunderstanding you, but I read this as saying that there's 
a larger, well-established concept called "objective simplicity", of 
which this measurement is a part.  Can you tell me more about it?  
Google was no help here.


> But that all kinda misses the point to me:  the latter spelling is
> "obviously simpler" in a way that _actually matters_, for the same
> reason, e.g., a case statement with N cases is "obviously simpler"
> than the semantically equivalent spelling using N nested if/else
> if/else if/else if/else ... blocks.

As I already mentioned, the with-binding-expressions code expresses the 
same code, the same concept, and likely results in the same bytecode, as 
the without-binding-expressions code.  In contrast, a switch statement 
/is/ simpler than a series of nested if statements.  It's a different 
code construct, it has different (and comparatively restricted) 
semantics, and it results in simpler (and faster) code.  Whereas the 
with-binding-expressions code is equivalent to the 
without-binding-expressions code, semantically, bytecode-ly, etc.  So 
comparing the with-binding-expressions version of the code to the 
simplification afforded by a switch statement isn't an apples-to-apples 
comparison.
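Python has no switch statement, but the nearest thing to a "different,
comparatively restricted construct" in this sense is dict dispatch, and the
contrast with nested if/else illustrates the distinction (purely
illustrative names, not code from the thread):

```python
def classify_nested(n):
    # N nested if/else blocks: each case adds an indentation level,
    # and the semantics are fully general (any test could go here).
    if n == 0:
        return "zero"
    else:
        if n == 1:
            return "one"
        else:
            if n == 2:
                return "two"
            else:
                return "many"


def classify_dispatch(n):
    # Dict dispatch: restricted semantics (exact-match keys only),
    # a genuinely different construct with a genuinely different shape.
    return {0: "zero", 1: "one", 2: "two"}.get(n, "many")
```

The second version isn't just a respelling of the first; it trades away
generality for structure, which is what makes the comparison lopsided.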

In other words: you're really only arguing taste here.  You find it 
"obviously simpler", but this an aesthetic call on your part and not an 
objective measurement.  Me, my tastes are different--I find it 
"needlessly complicated" and prefer the without-binding-expressions version.


> If it weren't for that you hate being pedantic, I'd add that you're
> overlooking the piles of leading whitespace characters also saved in
> the latter ;-)  The number of those saved grows quadratically in the
> number of uselessly indented blocks shifted left.

Surely my dislike of being pedantic is irrelevant to your being 
pedantic.  It seems it's not something you mind doing! ;-)

The nightmare scenario of quadratically right-shifted code is of course 
solvable in other ways.  Off the top of my head, one could rewrite the 
code example as follows, without any new syntax:

     def r(value):
         # assumes an enclosing function whose local "reductor"
         # the nonlocal statement can bind to
         nonlocal reductor
         reductor = value
         return value
     if r(dispatch_table.get(cls)):
         rv = reductor(x)
     elif r(getattr(x, "__reduce_ex__", None)):
         rv = reductor(4)
     elif r(getattr(x, "__reduce__", None)):
         rv = reductor()
     else:
         raise Error("un(shallow)copyable object of type %s" % cls)


It makes this particular code example longer--but it avoids the indenting 
you seem to dislike.  And given that "r()" is shorter than "reductor := ", 
the more elifs you add, the more overall /characters/ you'll save!  With 
another three or four elifs this version would probably use fewer 
characters overall.
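(As written, the fragment above needs an enclosing function for "nonlocal 
reductor" to refer to; here's a self-contained version of the same trick, 
with "Error" and "dispatch_table" standing in for copy.Error and 
copy.dispatch_table, for anyone who wants to actually run it:)

```python
dispatch_table = {}  # stands in for copy.dispatch_table


class Error(Exception):
    # stands in for copy.Error
    pass


def shallow_reduce(x):
    cls = type(x)
    reductor = None  # gives the nonlocal below something to bind to

    def r(value):
        nonlocal reductor
        reductor = value
        return value

    if r(dispatch_table.get(cls)):
        rv = reductor(x)
    elif r(getattr(x, "__reduce_ex__", None)):
        rv = reductor(4)
    elif r(getattr(x, "__reduce__", None)):
        rv = reductor()
    else:
        raise Error("un(shallow)copyable object of type %s" % cls)
    return rv
```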

I'm not seriously arguing that people rewrite their code in this way; I 
think it's less clear this way.  I suppose my point is, "we already have 
so many ways To Do It, I'm not in favor of adding syntax to the language 
to add one more way".


>> and the density of complexity per line has shot up.
> Average non-whitespace character count per line has certainly shot up,
> but I don't actually know what you mean by "density of complexity"
> there.

What I meant was, stuff-accomplished-per-line.  I guess we could measure 
that objectively as bytecodes / linecount.  But I meant it more as a 
subjective measurement, considering factors like "how many unrelated 
operations occur on this line?" and "how many operations on this line 
have side-effects?".  I suggest that the with-binding-expressions 
version has more side effects and more unrelated operations per line, 
because several lines now fold assignment (side effect) into lines that 
were previously flow control.
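That "bytecodes / linecount" measurement is easy enough to compute, for 
anyone who wants to try it on the two spellings.  A rough sketch--note 
that dis.get_instructions doesn't descend into nested code objects, which 
is fine for flat snippets like these:

```python
import dis


def bytecode_density(source):
    # Crude objective proxy for "stuff accomplished per line":
    # bytecode instructions divided by source-code lines.
    code = compile(source, "<snippet>", "exec")
    n_instructions = sum(1 for _ in dis.get_instructions(code))
    n_lines = len(source.strip().splitlines())
    return n_instructions / n_lines
```

Folding two statements onto one line leaves the instruction count alone 
but halves the line count, so the density roughly doubles--which is the 
"shot up" being described.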


//arry/
