Hello,
It seems to me that much of the time when I use a break statement, it
comes directly after an if. Typically:
while True:
    do something
    if condition:
        break
    do something else
I don't much like the if .. break being spread over two lines.
Of course I could write
while True:
    do something
    if condition: break
    do something else
It doesn't read so well either, IMHO. I think this would look
better:
while True:
    do something
    break if condition
    do something else
Now I've had a quick look at a py3k tree (a bit old, rev. 64270):
$ grep -r --include "*.py" break py3k/ | wc -l
1418
1418 uses of the break statement in the py3k python code.
$ grep -r --include "*.py" -B 1 break py3k/ | grep if | wc -l
680
Of which 680 are immediately preceded by an if statement
$ grep -r --include "*.py" "if .*: break" py3k/ | wc -l
107
Of which 107 have the if on the same line.
This means that:
* 48% of uses of "break" are directly after an "if"
* this has been written on one single line about 16% of the time.
(I know my greps will include a few false positives, but I don't think
they are significant :)
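For anyone who wants to reproduce or refine the count, here is a rough
Python sketch of the same measurement (the matching is still approximate,
and 'py3k' is just the checkout path used above):

import os
import re

total = after_if = one_liner = 0
for root, dirs, files in os.walk('py3k'):
    for name in files:
        if not name.endswith('.py'):
            continue
        lines = open(os.path.join(root, name)).readlines()
        for i, line in enumerate(lines):
            if not re.search(r'\bbreak\b', line):
                continue
            total += 1
            if re.match(r'\s*if .*:\s*break\b', line):
                # "if cond: break" on a single line
                one_liner += 1
                after_if += 1
            elif i and re.match(r'\s*if .*:\s*$', lines[i - 1]):
                # break on the line right after a bare "if cond:"
                after_if += 1

print(total, after_if, one_liner)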
--
Arnaud
On 21 Sep 2008, at 20:24, Josiah Carlson wrote:
[...]
> Not all statements; all *control flow* statements. Return, yield,
> continue, break, assert, ..., can all change program flow. To say
> that break and continue should be special cased is silly, as "Special
> cases aren't special enough to break the rules". As such, why
> continue and break but not return or yield? Further, the syntax is
> *so very similar* to conditional expressions <X> if <C> vs. <X> if <C>
> else <Y>, the lack of an else clause could be confusing to those who
> have seen and used conditional expressions before, never mind the
> meaning of them.
Well, my view was that break and continue are the only two statements
that relate specifically to loops.
[...]
>> I don't agree with that: the absence of do .. while liberates the
>> loop
>> construct in python from its constraints and the first form above
>> becomes
>> more common.
>
> But you were just arguing that the *lack* of do/while makes the
> embedded if necessary. Now you are saying that it *liberates* us from
> the control-flow induced by do/while. ;) There's an argument that
> says rather than treat the symptom (breaks in the body), treat the
> disease (lack of a do/while). But since the lack of a do/while isn't
> a disease, by your own words, then the symptom is not a bug, it's a
> feature ;)
>
There is a missing link in your reading of my argument: I have
noticed that, since I do not have a do .. while construct at my
disposal in Python, I no longer try to shape my loops into that
structure. I almost *never* write:
while True:
    ...
    if condition: break
But most of the time it seems that the natural structure for a loop
comes out as
while True:
    ...
    if condition: break
    ...
In fact, I would be happy with getting rid of the while loop and
replacing it with a simple 'loop' construct, where:
while condition:
    ...

would be written as:

loop:
    if not condition: break
    ...
However I see this as too radical to propose :)
>>> Which *has* an idiom in Python.
>>>
>>> first = 1
>>> while first or condition:
>>>     first = 0
>>>     ...
>>>
>>
>> I would not use this, not because it is slower, but because it is
>> uglier.
>
> That's funny, because whenever I use it in a production codebase,
> coworkers always comment how they like it because it pushes the
> condition for the loop at the loop header rather than embedding it in
> the (sometimes long) body. In particular, I've seen the lack of a
> do-while in Python result in an if clause at the bottom of a 150 line
> while, which had been confusing as hell for anyone who got to touch
> that code. Moving it up to the top made it clear and resulted in a
> removal of half a dozen lines of comments at the top explaining why
> the loop was constructed like this. After the above translation,
> those comments became "this emulates a do-while clause".
>
This makes a lot of sense if you use a do .. while concept in your
loops. Now I feel bad that I used the word "uglier" ...
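Spelled out, the translation described in the quote above looks roughly
like this (the chunked-reader loop is just a stand-in example):

chunks = ['abc', 'de', '']

# Bottom-tested version: the exit test is buried at the end of the body.
it = iter(chunks)
while True:
    chunk = next(it, '')
    print('processing', repr(chunk))
    if not chunk:
        break

# The 'first' flag idiom: the same loop with the condition in the header.
it = iter(chunks)
first = True
chunk = None
while first or chunk:
    first = False
    chunk = next(it, '')
    print('processing', repr(chunk))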
>
>> Anyway, there hasn't been a flurry of positive responses so far so
>> I don't
>> think this is going to go much further than this reply...
>
> Syntax changes are tough to get agreement on in Python.
>
Thank you for your comments, they all make a lot of sense.
--
Arnaud
Slightly OT, but I think 'break' and 'continue' should be replaced
with 'raise Break' and 'raise Continue' in Python 4000, just as we
'raise StopIteration' in generators today. This would be handy,
because you could use it in functions called by a list comprehension:
def check(x):
    if x > 10:
        raise Break
    else:
        return x

[check(x) for x in range(20)]  # produces [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
Right now, you can do something like this with list(generator expression)
by raising StopIteration, but it is considered a hack, and it doesn't
work with list comprehensions.
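For reference, the hack being alluded to looks roughly like this (it
relies on the generator behaviour of the time; PEP 479, default in
Python 3.7+, later turned it into a RuntimeError):

def cut_off(x):
    if x > 10:
        # StopIteration leaks out of the generator expression and
        # silently ends the iteration (pre-PEP 479 behaviour).
        raise StopIteration
    return x

print(list(cut_off(x) for x in range(20)))  # pre-PEP 479: [0, 1, ..., 10]
# With a plain list comprehension, [cut_off(x) for x in range(20)],
# the exception simply propagates and you get no result at all.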
Okay, for anyone who's still willing to bear with me, I have a
completely different approach that might be more palatable (and perhaps
even appealing) to some people.
I was reading the documentation for Io ( http://www.iolanguage.com ),
particularly about how Io implements control structures: in short, it
doesn't. What it is able to do is let the user define control structures
in Io itself. For example, "if" and "for" are merely functions:
for ( i, 1, 10, i println )
if ( b == 0, c + 1, d )
Now, the reason this works in Io (and not in Python) is that while
you *can* write a Python function that mimics the logic of, say, an
if statement, there's no way to get lazy evaluation of the argument
values. So for instance:
def IF ( condition, iftrue, iffalse ):
    if condition: return iftrue
    return iffalse

result = IF ( hasattr ( spam, 'eggs' ), spam.eggs, 'not found' )
The expression spam.eggs is evaluated regardless of the truthiness of
the condition (and in fact raises an exception when spam has no eggs
attribute), so this construct is basically worthless.
So... now, here's my shiny new idea: lazy evaluation of function
arguments (by some explicit syntactical indicator). Arguments would not
be evaluated until they are encountered *at runtime* in the function
body. They could be compiled, but not evaluated.
You can accomplish this to some degree by using lambda, but I'd much
prefer something indicated in the function signature rather than in the
calling code. That is, rather than:
def foo ( arg ):
    return arg ( )

foo ( lambda: x )
I'd prefer to write
def foo ( lambda: arg ):
    return arg  # evaluation happens here

foo ( x )
("lambda" not necessarily being the actual name of the token in the
second case).
I think this would have appeal beyond my particular desire (lazy
evaluation is a useful feature in and of itself) but would also solve my
immediate issue (I could implement functional versions of any control
structure I like, and even implement new ones).
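For contrast, here is what the lambda-based workaround for the earlier IF
example looks like in today's Python (IF is just the illustrative name
from above, not a real builtin):

def IF ( condition, iftrue, iffalse ):
    # both branches are passed as callables, so only one is evaluated
    if condition: return iftrue()
    return iffalse()

class Spam ( object ):
    pass

spam = Spam()
result = IF ( hasattr ( spam, 'eggs' ), lambda: spam.eggs, lambda: 'not found' )
print(result)  # prints 'not found'; spam.eggs is never evaluated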
Thoughts?
Regards,
Cliff
Is there a good reason why I can't do something like:
import mypackage
import mymodule
x = mypackage(arg, kwarg)
y = mymodule + 3
and other operations, etc., using the __special__ methods.
Also, wouldn't it be cool if modules/packages could have metaclass-
like handlers…
I'm sure there's a good reason why this is just crazy talk
implementation-wise, but I'd like to hear what it is. Python has
merged types and classes, so why not merge in packages and modules, too?
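(For what it's worth, part of this can already be faked by replacing the
module's entry in sys.modules with a class instance; a rough sketch,
with mymodule as a purely hypothetical name:)

# mymodule.py -- after "import mymodule", the name is bound to this
# instance, so calling it and adding to it both work.
import sys

class CallableModule(object):
    def __call__(self, arg, kwarg=None):
        return (arg, kwarg)
    def __add__(self, other):
        return 40 + other

sys.modules[__name__] = CallableModule()

# elsewhere:
# import mymodule
# mymodule(1, kwarg=2)  -> (1, 2)
# mymodule + 3          -> 43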
-- Carl Johnson
Greetings,
Something that has started to annoy me in the last couple of years is
the fact that most Python control statements cannot be used as
expressions. I feel this is a pretty deep limitation, and personally I
don't think it's well justified.
As I understand it, the reason for the distinction mostly has to do with
the premise "flat is better than nested", which I can understand, but
which I don't think carries enough weight anymore.
Specifically, I'd like to see things like "if" statements, "for" loops,
etc, become expressions. This would not preclude them from being used
as if they were statements (an expression can stand alone on a line of
Python code), but would add a lot of expressiveness to the language as
well as make Python more viable for creating DSLs.
Additionally, removing statements from Python would also allow the
language to be simplified. No need for a ternary "if" operator with
different semantics than a normal "if" statement, "for" loops would be
brought closer to generators in functionality, and the perceived
limitations of lambda would disappear, amongst other things. We'd gain
a lot of features found in languages like Lisp and Ruby as a side effect
(e.g. anonymous code blocks).
Overall it seems this design decision is specifically geared toward
forcing programmers into an imperative style in order to enforce program
readability. In Python 1.5, this made a bit of sense, but as Python has
"matured" (or in my view, gotten over-complicated) this makes much less
sense. Many parts of Python's extensive syntax are explicit workarounds
to this design decision. So on the one hand we have the perceived
danger that programmers will write nested code and on the other we have
an ever-expanding syntax. I'd take the former any day.
I've not delved into the internals of the Python interpreter to check,
but I suspect that converting most statements to expressions would not
be very difficult (changing the grammar definition and generated
bytecode a small amount in most cases).
Any thoughts on this? I'm sure it's been brought up before, but I
haven't found any definitive discussion of why this rather arbitrary
design decision continues to hold in the face of a general migration
away from imperative languages (especially when it seems it could be
changed without many backwards-compatibility issues).
Regards,
Cliff
Here is a quick and dirty draft of an XML generator that uses the with statement:
http://twoday.tuwien.ac.at/pub/files/XmlMarkup (ZIP, 3 KB)
It is inspired by Ruby's XmlMarkup class.
Brief usage:
>>> from __future__ import with_statement
>>> from XmlMarkup import *
>>> import sys
>>>
>>> with XmlMarkup(sys.stdout) as xm:
>>>     with xm.root:
>>>         xm.text('foo')
>>>         with xm.prefixMapping('x', 'http://example.com/x'):
>>>             with xm.tag.ns('http://example.com/x'):
>>>                 xm.comment('comment')
>>>         with xm['text']:
>>>             xm.text('bar')
>>>         with xm.tag(foo='bar', egg='spam'):
>>>             pass
<?xml version="1.0" encoding="utf-8"?>
<root>foo<x:tag xmlns:x="http://example.com/x"><!--comment--></x:tag><text>bar</text><tag foo="bar" egg="spam"></tag></root>
I'm not 100% sure about some parts of the 'syntax', though.
E.g. maybe change this:
>>> with xm.tag(x='y').ns('...'):
>>>     with xm.tag:
>>>         ...
into this:
>>> with xm.tag('...', x='y'):
>>>     with xm.tag():
>>>         ...
This syntax is more concise than those unwieldy and error-prone
beginElement/endElement calls (endElement needs the name as an argument, too!).
This way you never forget to close a tag again. :)
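For anyone curious how the nesting works, here is a minimal sketch of the
general idea using contextlib -- not the actual XmlMarkup code:

from __future__ import with_statement
import sys
from contextlib import contextmanager

@contextmanager
def element(out, name, **attrs):
    # write the start tag on entry, the matching end tag on exit
    out.write('<%s%s>' % (name, ''.join(' %s="%s"' % kv for kv in attrs.items())))
    yield
    out.write('</%s>' % name)

with element(sys.stdout, 'root'):
    with element(sys.stdout, 'tag', foo='bar'):
        sys.stdout.write('text')
# writes: <root><tag foo="bar">text</tag></root>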
It even provides a simple way to embed arbitrary data:
>>> with xm.text as text:
>>>     # text is a pseudo file object
>>>     with XmlMarkup(text) as xm2:
>>>         # things you generate with xm2 will be escaped
And I added a way to generate DTDs:
>>> with xm.dtd('foo') as dtd:
>>>     dtd.element('a', oneof('x', 'y', PCDATA))
>>>     dtd.element('x', ('egg', 'bacon', many('spam')))
>>>     dtd.attlist('a', att1=(CDATA, DEFAULT, 'value'), att2=(ID, REQUIRED))
>>>     dtd.entity('ent', 'value')
What do you think? :)
-panzi
On Fri, 2008-09-12 at 11:00 -0400, Mike Meyer wrote:
> On Thu, 11 Sep 2008 18:01:55 -0700
> Cliff Wells <cliff(a)develix.com> wrote:
>
> > On Thu, 2008-09-11 at 15:14 -0700, Bruce Leban wrote:
> > > I agree that making every statement also an expression has merits, as
> > > does FP. However, you're handwaving about how this fits into python.
> >
> > Sure. I intentionally handwave for two reasons:
> >
> > 1) it's a bit of a waste of time to detail something like this when it's
> > universally rejected
> > 2) I didn't want to bicker over details. I wanted to forward the
> > concept.
>
> Ah, I get it. I thought you knew something about the Python user
> community. Clearly, that's not the case.
I'll concede I more-or-less stopped reading c.l.py at least 3 or 4 years
ago ;-)
> If you look through the
> proposed changes to python, you'll find this kind of thing (changes
> that introduce a single powerful feature that let you manipulate
> statements in expressions) surfacing at regular intervals over at
> least the past 15 years. The idea is almost never rejected
> outright. In fact, if you look, that's what's happening here - people
> aren't rejecting your position outright, but trying to get enough
> information to figure out how it's going to affect the entire
> language.
Yes, I accept the responsibility for conflating what I see as a problem
with a possible solution to that problem (the two were inextricably
linked in my mind).
> That's another place where the Python community differs from other
> language communities. We don't believe in features for the sake of
> features (whether you get them by adding or removing things); we
> believe the features in the language should interact well with each
> other. So while some feature may scratch a particular itch - say to do
> FP programming in Python - the fact that it turns ugly when used in
> non-FP code, or requires ignoring fundamental language layout rules,
> are not things that tie one hand behind your back, but crucial
> problems that must be solved if the feature is to be added. So even
> though the idea behind a feature may be generally acceptable - and
> that seems to be the case here - it won't be adopted until those
> details get worked out.
Agreed, and I had intended to move to that stage once I felt there was a
consensus that there is actually something that needs solving. What I
failed to make clear in my position is that I'm not trying to add a
feature for its own sake, but rather that the current process for
limiting the growth of the Python language is doomed to failure.
I'm going to try to rephrase the fundamental problem I see here and
hopefully I won't fail so dismally this time:
1) I assert that small is better than large when it comes to limiting
complexity.
2) The primary method of limiting Python's core feature growth is for
the BDFL to dig in his heels and reject ideas until they are adequately
shown to be broadly needed and implementable.
3) This method has failed in the past*. In fact, I assert that this
method is guaranteed to fail unless all ideas that add syntactical
structures are rejected outright, regardless of their utility. It
cannot limit the growth of Python's core, it can only limit the rate at
which it grows.
4) As Python moves into new domains, as new programming languages come
into vogue, and as the art of programming itself advances (albeit
glacially), users will perceive needs in Python and will clamor for
extensions. Some of these will eventually be accepted.
5) Given the added desire to maintain backwards-compatibility, old
features will not be shed as fast as new ones are added (unless that
becomes part of the process, which doesn't seem practicable to me).
6) I believe that a large class of these features could be rendered moot
or redundant if the language embraced a more sweeping and fundamental
change. While this won't absolutely prevent the language's growth, it
provides a built-in deterrent.
* See the ternary if-operator for an example - steadily rejected by GvR
for many years and then finally accepted, probably out of exhaustion.
There's an old quote by Larry Wall: "Perl is ugly because people wanted
it that way". Perl took the same approach to limiting features that
Python does, with the notable difference that Larry didn't reject ideas
as consistently as Guido (or apparently at all). Nevertheless, we're
now simply discussing the *rate* at which the language becomes large and
inconsistent rather than whether it will or not.
If we agree that the issues I outline above are valid, then I think we
can start bickering over possible solutions and how those solutions
would affect Python on the whole.
I apologize for my previous response to you. It was clearly my own
failing to properly explain my position that led to our exchange.
Regards,
Cliff
> On Sat, 2008-09-13 at 23:58, Cliff Wells wrote:
> I understand that any such change would need to be adequately defined.
> But I don't consider the discussion to have progressed to that point.
> If people do not even understand that expression-oriented languages
> provide any advantage (in fact, many of them apparently consider it a
> disadvantage), then there's little point in discussing what particular
> syntax this apparently useless change would take on.
Are you sure that calculating and returning a value for each statement
that is interpreted as an expression is really an advantage? I find few
cases that show such an advantage in real-world code (one of them is the
ternary operator, which was introduced in Python 2.5 to fill that "hole").
Just to be clear, I think that returning a value after executing a for,
a while, or even an if statement/expression EVERY TIME would be of no
practical use except in very rare cases.
Consider that you have to spend resources (CPU and memory) calculating
values that most of the time will be thrown away because they are never
used.
I know that we already have functions that work the same way: they always
return something, defaulting to None when no return statement is used.
That's because Python has only a "function" type (unlike Pascal,
which distinguishes between functions and procedures); I think
Guido opted to reduce the complexity of the language by offering just one
kind of subroutine invocation.
Do we really need to slow down the language (at the cost of less
readability too, because I'm strongly convinced that using statements as
expressions will reduce it) for such limited "added value"?
My 2 cents.
Cesare
On Fri, 2008-09-12 at 14:03 -0400, Mike Meyer wrote:
> On Fri, 12 Sep 2008 08:55:51 -0700
> Cliff Wells <cliff(a)develix.com> wrote:
> > 2) The primary method of limiting Python's core feature growth is for
> > the BDFL to dig in his heels and reject ideas until they are adequately
> > shown to be broadly needed and implementable.
>
> I don't think you have accurately described things, but it's
> irrelevant because changing this doesn't change the nature of the
> problem.
It may be grossly oversimplified, but I think ultimately the process
ends this way.
> > 3) This method has failed in the past*. In fact, I assert that this
> > method is guaranteed to fail unless all ideas that add syntactical
> > structures are rejected outright, regardless of their utility. It
> > cannot limit the growth of Python's core, it can only limit the rate at
> > which it grows.
>
> I think you're using "fail" in two different contexts here; one for an
> individual case - in that something was added - and one in the global
> case - in that the language will continue to grow without bounds.
I'm claiming that the global case is an inevitable conclusion of
repeatedly catering to the individual case.
> We'll leave aside the first version - whether some particular feature
> was a failure or not is really a question of personal taste, and
> changeable - and look at the latter.
>
> You're right, they can only limit the growth rate. I contend that that
> isn't a failure - at least not until the language starts losing users
> because the feature count has gotten too large. Based on the success
> of languages where features are treated as valuable just by being
> features, we're still a very long way from that point.
I won't disagree that Python is still a long way from that point. I
also want to absolutely distance myself from any claim that Python will
one day become obsolete or even lose a single user because of this,
first of all because it's flamebait and secondly because I don't believe
it to be true.
What I will claim is that growing the language in such a way is not
doing its users a favor (even if they might think so for their
particular extension). I don't think any of us disagree with that.
Of course it's mildly ironic that I'm being told the same thing about my
proposal, but what I am trying to convey is my belief that this one
modification helps eliminate the need for dozens of others.
> > 5) Given the added desire to maintain backwards-compatibility, old
> > features will not be shed as fast as new ones are added (unless that
> > becomes part of the process, which doesn't seem practicable to me).
>
> This has already become part of the process. Python 3.0 can break
> backwards compatibility, and so is shedding features - or at least
> moving them out of the core. For instance, the builtin functions that
> are now subsumed into list comprehensions are gone. It was originally
> scheduled for release late this month, but it looks like the schedule
> has slipped a couple of weeks.
Yes, but we want to keep watershed instances like these to an absolute
minimum. I think removing the redundant (and less flexible) variations
that listcomp now subsumes is a good thing. What I'm saying is "let's
do it again". Except what would be subsumed isn't just a handful of
functions, but rather a whole class of existing and future language
extensions.
> > 6) I believe that a large class of these features could be rendered moot
> > or redundant if the language embraced a more sweeping and fundamental
> > change. While this won't absolutely prevent the language's growth, it
> > provides a built-in deterrent.
>
> This may well be true. But the absolutely critical issue is that it
> not break the fundamental nature of the language. I.e. - if you have to
> add delimiters to replace indentation, you ain't going anywhere.
Absolutely, which is why I've never proposed it. It would fundamentally
alter the visual appearance of the language and ultimately isn't
necessary anyway.
> > There's an old quote by Larry Wall: "Perl is ugly because people wanted
> > it that way". Perl took the same approach to limiting features that
> > Python does, with the notable difference that Larry didn't reject ideas
> > as consistently as Guido (or apparently at all). Nevertheless, we're
> > now simply discussing the *rate* at which the language becomes large and
> > inconsistent rather than whether it will or not.
>
> LW used to sign books with the note "There's more than one way to do
> it". As far as I can tell, the Perl community never met a feature it
> didn't like, and they're all in Perl5 - and Perl6 is going to make CL
> look small.
>
> But you just slipped in a new word that you haven't used before -
> "inconsistent". Becoming large does not necessarily make things
> inconsistent. Most of the features added to Python don't make it
> inconsistent. Actually, the nastiest problems with some of them are
> *because* they are consistent with the rest of the language(*).
True, I was thinking more of Perl when that word popped out of my
keyboard.
However, I think that there is a fundamental *logical* inconsistency in
Python. The language consistently adheres to this inconsistency =) but
it's there nonetheless. This was my motivation for attempting to
demonstrate that the two forms of "if" currently in Python (statement
and operator) could be logically combined into one if the restrictions
on the former were lifted (see below).
> Maintaining consistency is far more important than limiting size, and
> is why most proposals are rejected. I don't feel that any of the
> recent changes have introduced inconsistencies in the language.
Ah, but I do:

    if cond: block
vs.
    expr if cond
Here we have two forms of the *very same* logical construction. A
programmer must select one or the other based on *context*. This is my
definition of inconsistent (although its roots lie in a deeper
inconsistency, namely the distinction between expressions and
statements).
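Concretely, here is the same decision spelled both ways (a trivial
example):

x = 5

# statement form: usable only where a statement is allowed
if x > 0:
    sign = 1
else:
    sign = -1

# expression form: same logic, different syntax, mandatory else
sign = 1 if x > 0 else -1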
Regards,
Cliff