Anonymous blocks (again):
Hello,

I've banged myself against the wall, and there simply isn't a pythonic way to make a context manager iterate over the yielded-to block of code.

I've had to resort to this in Grako[*]:

    def block():
        self.rule()
        self.ast['rules'] = self.last_node
    closure(block)

But what I think would be more pythonic is:

    closure(
        def:
            self.rule()
            self.ast['rules'] = self.last_node
    )

Or, better yet (though I know I can't have it):

    with positive_closure():
        self.rule()
        self.ast['rules'] = self.last_node

The thing is that a "closure" needs to call the given block repeatedly, while remaining in control of the context of each invocation. The examples given above would signal the closure to be over with an exception.

The convoluted handling of exceptions in a context manager's __exit__ makes it impossible (for me) to construct a context manager that can call the yielded-to block several times.

Anyway, the anonymous def syntax, with or without parameters, is a given, and a solution for many qualms about the Python way of things.

[*] https://bitbucket.org/apalala/grako

-- Juancarlo *Añez*
On Sun, May 12, 2013 at 2:10 PM, Juancarlo Añez <apalala@gmail.com> wrote:
Hello,
I've banged myself against the wall, and there simply isn't a pythonic way to make a context manager iterate over the yielded-to block of code.
I've had to resort to this in Grako[*]:
    def block():
        self.rule()
        self.ast['rules'] = self.last_node
    closure(block)
In current Python, decorator abuse can be a reasonable option:

    @closure
    def block():
        self.rule()
        self.ast['rules'] = self.last_node

Or, if PEP 403 were accepted and implemented:

    @in closure(f):
    def f():
        self.rule()
        self.ast['rules'] = self.last_node

(The latter would have the advantage of working with arbitrary expressions)

Anyway, if this is a topic that interests you, I strongly recommend reading both PEP 403 and PEP 3150 in full.

Cheers, Nick.

-- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
On Sun, May 12, 2013 at 3:15 PM, Nick Coghlan <ncoghlan@gmail.com> wrote:
Or, if PEP 403 were accepted and implemented:
    @in closure(f):
    def f():
        self.rule()
        self.ast['rules'] = self.last_node
Oops, typo in that example (the extra colon on the @in expression):

    @in closure(f)
    def f():
        self.rule()
        self.ast['rules'] = self.last_node

Cheers, Nick.

-- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
On Sun, May 12, 2013 at 12:45 AM, Nick Coghlan <ncoghlan@gmail.com> wrote:
In current Python, decorator abuse can be a reasonable option:
    @closure
    def block():
        self.rule()
        self.ast['rules'] = self.last_node
But for that to work, you'd still have to call:

    block()

And that would make it:

    @closure
    def block():
        self.rule()
        self.ast['rules'] = self.last_node
    block()

Which I think makes little sense to a human reader, at least not in the pythonic way, and less so when compared to my (map/reduce-like, functional):

    closure(block)

Your proposal was the approach I previously used in Grako, and I deprecated it in favor of the currently standing state of things in Python, which is: *If you want an executable block of code you can iterate upon, then define a (non-anonymous) function, and pass it to the iterator.*

Cheers.

-- Juancarlo *Añez*
On Sun, May 12, 2013 at 2:11 AM, Juancarlo Añez <apalala@gmail.com> wrote:
On Sun, May 12, 2013 at 12:45 AM, Nick Coghlan <ncoghlan@gmail.com> wrote:
In current Python, decorator abuse can be a reasonable option:
    @closure
    def block():
        self.rule()
        self.ast['rules'] = self.last_node
But for that to work, you'd still have to call:
block()
And that would make it:
    @closure
    def block():
        self.rule()
        self.ast['rules'] = self.last_node
    block()
No, not with how closure was defined in the original post. (Otherwise, the code would be closure(block)()). Consider:
    >>> def closure(f):
    ...     f(); f(); f()
    ...
    >>> @closure
    ... def block():
    ...     print "Hi"
    ...
    Hi
    Hi
    Hi
-- Devin
On Sun, May 12, 2013 at 3:48 AM, Devin Jeanpierre <jeanpierreda@gmail.com> wrote:
the code would be closure(block)()).
Consider:
    >>> def closure(f):
    ...     f(); f(); f()
    ...
    >>> @closure
    ... def block():
    ...     print "Hi"
    ...
    Hi
    Hi
    Hi
Mmm. Interesting, but unpythonic. A decorator that executes the target right away?

I also tried:

    with closure:
        while True:
            block

But the context-manager design is shortsighted, and it will exit on the first exception it sees, no matter what.

I've tried everything, so I'm pretty sure that there's no clean solution in 2.7/3.3.

Cheers,

-- Juancarlo *Añez*
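[Editor's note: the __exit__ limitation being discussed can be demonstrated directly. A @contextmanager generator that tries to hand control to its body a second time is rejected by the protocol; this is a minimal sketch.]

```python
from contextlib import contextmanager

@contextmanager
def repeat_twice():
    yield  # first (and only legal) hand-off to the with-body
    yield  # a second yield violates the single-yield protocol

try:
    with repeat_twice():
        pass
except RuntimeError as e:
    # contextlib's __exit__ expects the generator to stop after one yield,
    # and raises RuntimeError when it doesn't.
    print(type(e).__name__)  # → RuntimeError
```

This is exactly why no context manager, however written, can re-run its with-block: the protocol gives the manager only one shot at the body.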
On Sun, May 12, 2013 at 9:56 PM, Juancarlo Añez <apalala@gmail.com> wrote:
But the context-manager design is shortsighted, and it will exit on the first exception it sees, no matter what.
That's not shortsighted, it was a deliberate design decision to *prevent* with statements from being used as a replacement for explicit while and for loops. See PEP 343.
I've tried everything, so I'm pretty sure that there's no clean solution in 2.7/3.3.
Correct. This is why PEP 403 exists. Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
On Sun, May 12, 2013 at 10:09 AM, Nick Coghlan <ncoghlan@gmail.com> wrote:
I've tried everything, so I'm pretty sure that there's no clean solution in 2.7/3.3.
Correct. This is why PEP 403 exists.
PEP 403 sucks! It's a very ill attempt at replacing the need for anonymous blocks, which could be done with syntax very like the current one. ATIAIHTSAT!

This thread is closed, AFAIC. I am in peace with the must-be-functions of Python blocks. Smarter people than me will figure things out.

Cheers,

-- Juancarlo *Añez*
On 5/12/2013 1:22 PM, Juancarlo Añez wrote:
On Sun, May 12, 2013 at 10:09 AM, Nick Coghlan <ncoghlan@gmail.com> wrote:
I've tried everything, so I'm pretty sure that there's no clean solution in 2.7/3.3.
Correct. This is why PEP 403 exists.
PEP 403 sucks!
There's no need for that tone. Everyone here is being respectful, you can be also. --Ned.
It's a very ill attempt at replacing the need for anonymous blocks, which could be done with syntax very like the current one.
ATIAIHTSAT!
This thread is closed, AFAIC. I am in peace with the must-be-functions of Python blocks. Smarter people than me will figure things out.
Cheers,
-- Juancarlo *Añez*
_______________________________________________ Python-ideas mailing list Python-ideas@python.org http://mail.python.org/mailman/listinfo/python-ideas
On 13 May 2013 03:23, "Juancarlo Añez" <apalala@gmail.com> wrote:
On Sun, May 12, 2013 at 10:09 AM, Nick Coghlan <ncoghlan@gmail.com> wrote:
I've tried everything, so I'm pretty sure that there's no clean solution in 2.7/3.3.
Correct. This is why PEP 403 exists.
PEP 403 sucks!
It's a very ill attempt at replacing the need for anonymous blocks, which could be done with syntax very like the current one.
Anonymous blocks in Ruby depend on the convention that the block is always the last positional argument. Python has no such convention, thus any "block like" solution will require a mechanism that allows the user to tell the interpreter where the trailing callable should be referenced in the preceding simple statement.

Earlier versions of PEP 403 used a magic symbol for this, but that ended up being ugly and non-obvious. Thus, I changed it to the current explicit forward reference. For throwaway callbacks, using a short meaningless name like "f" should be sufficiently brief, and in many cases a more meaningful name may be used in order to make the code more self-documenting.

Now, do you have any constructive feedback on the PEP that still accounts for Python's lack of a standard location for passing callables to functions, or is this reaction simply a matter of "I don't want to have to type 'f' twice because I don't have to do that in other languages"?

Regards, Nick.
ATIAIHTSAT!
This thread is closed, AFAIC. I am in peace with the must-be-functions of
Python blocks. Smarter people than me will figure things out.
Cheers,
-- Juancarlo Añez
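[Editor's note: the @in form PEP 403 proposes can be approximated in current Python with an ordinary decorator factory. The helper names here (`in_`, `three_times`) are hypothetical, not part of any library.]

```python
def in_(consumer):
    # Decorator factory: immediately feed the decorated function to
    # *consumer*, rebinding the name to whatever the consumer returns.
    def deco(f):
        return consumer(f)
    return deco

def three_times(f):
    # A toy "closure" driver that calls the block three times.
    return [f(), f(), f()]

@in_(three_times)
def result():
    return "hi"

print(result)  # → ['hi', 'hi', 'hi']
```

The difference from PEP 403 proper is that here the name `result` ends up bound to the consumer's return value rather than remaining a throwaway forward reference, but it captures the "statement first, block after" ordering being debated.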
On 13 May 2013, at 00:40, Nick Coghlan <ncoghlan@gmail.com> wrote:
Now, do you have any constructive feedback on the PEP that still accounts for Python's lack of a standard location for passing callables to functions, or is this reaction simply a matter of "I don't want to have to type 'f' twice because I don't have to do that in other languages"?
Moving past the outright negative feedback, and having only just seen the PEP, the proposed syntax did strike me as awkward and unintuitive. Maybe there is some explanation for why decorator-like syntax was used - if so, please do link me so I can read up.

What struck me though is that the proposed syntax limits the ability to have multiple "anonymous blocks" within a single statement. Instead, I was thinking some syntax like the following might be nicer:

    in x = do_something(in_arg, success_hdlr, error_hdlr):
        def success_hdlr(result):
            ...  # Do something with result
        def error_hdlr(error):
            ...  # Do something with error

That is, instead of a decorator-like syntax, make the "in" keyword reusable to introduce a new block, whose "argument" is a statement that can forward reference some names, which are then defined within the block. This allows multiple temporary names to be defined (each in a separate statement within the block).

Some further thought is required on whether only def (and maybe class) statements should be allowed within the "in" block. Although I guess there's technically nothing wrong with:

    in x = y + z:
        y = 12
        z = 30

Other than it's a very verbose way of doing something simple. ;-) But maybe there are more useful examples?

Cheers, Martin
Regards, Nick.
On Sun, May 12, 2013 at 5:30 PM, Martin Morrison <mm@ensoft.co.uk> wrote:
Moving past the outright negative feedback, and having only just seen the PEP, the proposed syntax did strike me as awkward and unintuitive. Maybe there is some explanation for why decorator-like syntax was used - if so, please do link me so I can read up.
I just read PEP 403 myself also. I confess I likewise have trouble getting past the unnatural feeling (to me, at least at first brush) of the decorator syntax. The rejected alternative of the 'given' keyword seems less unnatural. I wonder though why not just use the ML style here. E.g. Spell this:
    in x = do_something(in_arg, success_hdlr, error_hdlr):
        def success_hdlr(result):
            ...  # Do something with result
        def error_hdlr(error):
            ...  # Do something with error
As:

    in x = do_something(in_arg, success_hdlr, error_hdlr, const) let:
        def success_hdlr(result):
            ...  # Do something with result
        def error_hdlr(error):
            ...  # Do something with error
        const = 42

Well, I've reversed the order of in/let from ML, but the keywords are the same. But as Martin points out, I can't see any reason to preclude defining multiple one-off blocks... not even if the name definitions aren't 'def' or 'class' (hence my addition of defining 'const=42' in my slightly expanded version).

Yes it's still a pseudo-block, and we do have to do something with scoping. But it reads better to me than 'given', and also better than the bare 'in' block introduction without the explicit 'let'.

Yours, David...

-- Keeping medicines from the bloodstreams of the sick; food from the bellies of the hungry; books from the hands of the uneducated; technology from the underdeveloped; and putting advocates of freedom in prisons. Intellectual property is to the 21st century what the slave trade was to the 16th.
Martin Morrison writes:
That is instead of a decorator-like syntax, make the "in" keyword reusable to introduce a new block, whose "argument" is a statement that can forward reference some names, which are then defined within the block. This allows multiple temporary names to be defined (each in a separate statement within the block).
This idea and its presumed defects are described (using the "given" syntax of PEP 3150) in the section "Using a nested suite" in PEP 403.
Some further thought is required on whether only def (and maybe class) statements should be allowed within the "in" block. Although I guess there's technically nothing wrong with:
in x = y + z: y = 12 z = 30
Other than it's a very verbose way of doing something simple. ;-)
Violates TOOWTDI according to PEP 403. David Mertz and Juancarlo Añez riff on the theme:
[Why not spell it something like]:
    in x = do_something(in_arg, success_hdlr, error_hdlr, const) let:
        def success_hdlr(result):
            ...  # Do something with result
        def error_hdlr(error):
            ...  # Do something with error
        const = 42
(Note the "let" at the end of the "in" clause.) Python doesn't use redundant keywords for a single construct. "let" is redundant with the following "def"s. On top of that, "let" being a new keyword will kill this syntax, I think.
On Sun, May 12, 2013 at 10:23 PM, Stephen J. Turnbull <stephen@xemacs.org> wrote:
Python doesn't use redundant keywords for a single construct. "let" is redundant with the following "def"s. On top of that, "let" being a new keyword will kill this syntax, I think.
I don't want new syntax (I think I don't). What I want is to be able to invoke a block of code repeatedly, within a context, and in a pythonic way.

    within closure():
        do_this()
        and_do_that()

If it can't be pythonic (clear to the reader), I'm not interested. It's good enough as it is:

    def block():
        do_this()
        do_that()
    closure(block)

Cheers,

-- Juancarlo *Añez*
On Sun, May 12, 2013 at 11:28 PM, Juancarlo Añez <apalala@gmail.com> wrote:
What I want is to be able to invoke a block of code repeatedly, within a context, and in a pythonic way.
    within closure():
        do_this()
        and_do_that()
Hey! I must say that I'm speaking from a niche perspective. I'm trying to make automatically generated parsers readable by their creators. I do think that anonymous blocks would be good in broad ways, but my interest in them at this time is quite narrow.

Cheers,

-- Juancarlo *Añez*
On 13/05/13 13:58, Juancarlo Añez wrote:
I don't want new syntax (I think I don't).
What I want is to be able to invoke a block of code repeatedly, within a context, and in a pythonic way.
Surely that would be:

    with context():
        while condition:  # or a for loop
            block of code goes here

If you want something different to this, then I think you do want new syntax. Otherwise, what do you gain beyond what can already be done now? Or am I missing something?

-- Steven
On Mon, May 13, 2013 at 2:47 PM, Steven D'Aprano <steve@pearwood.info> wrote:
On 13/05/13 13:58, Juancarlo Añez wrote:
I don't want new syntax (I think I don't).
What I want is to be able to invoke a block of code repeatedly, within a context, and in a pythonic way.
Surely that would be:
    with context():
        while condition:  # or a for loop
            block of code goes here
If you want something different to this, then I think you do want new syntax. Otherwise, what do you gain beyond what can already be done now?
Or am I missing something?
Ruby uses anonymous callbacks for things where Python instead uses dedicated syntax:

    Python                                        -> Ruby
    decorated function definitions                -> callbacks
    for loops + iterator protocol                 -> callbacks
    with statements + context management protocol -> callbacks
    callbacks                                     -> callbacks (but with much nicer syntax)

Blocks are a *really* nice way of doing callbacks, so nice that Ruby just doesn't have some of the concepts Python does - it uses callbacks instead.

While I have no real interest in Ruby's use of embedded callbacks inside expressions (or various other pieces of control flow magic that Ruby blocks support), I *do* think their ability to easily supply a full nested callback to a single statement is valuable, and gets to the heart of people's interest in multi-line lambdas in Python.

PEP 403 is mostly about adapting that feature to an ecosystem which doesn't have the "the callback is the last positional parameter" convention that the block syntax established for Ruby. Using a forward reference to a class instead lets you have multiple forward references through attribute access.

Cheers, Nick.

-- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
On Mon, May 13, 2013 at 03:17:15PM +1000, Nick Coghlan wrote:
On Mon, May 13, 2013 at 2:47 PM, Steven D'Aprano <steve@pearwood.info> wrote:
On 13/05/13 13:58, Juancarlo Añez wrote:
I don't want new syntax (I think I don't).
What I want is to be able to invoke a block of code repeatedly, within a context, and in a pythonic way.
Surely that would be:
    with context():
        while condition:  # or a for loop
            block of code goes here
If you want something different to this, then I think you do want new syntax. Otherwise, what do you gain beyond what can already be done now?
Or am I missing something?
Ruby uses anonymous callbacks for things where Python instead uses dedicated syntax:
Python -> Ruby
    decorated function definitions                -> callbacks
    for loops + iterator protocol                 -> callbacks
    with statements + context management protocol -> callbacks
    callbacks                                     -> callbacks (but with much nicer syntax)
Blocks are a *really* nice way of doing callbacks, so nice that Ruby just doesn't have some of the concepts Python does - it uses callbacks instead.
I'm obviously still missing something, because I'm aware of Ruby's blocks, but I don't quite see how they apply to Juancarlo's *specific* use-case, as described above.

Unless Juancarlo's use-case is more general than I understood, it seems to me that we don't need blocks, anonymous or otherwise, to "invoke a block of code repeatedly, within a context", in a Pythonic way.

Perhaps a concrete (even if toy or made-up) example might help me understand. The only thing I can think of is, if I had a bunch of similar loops inside the same context, where only the body of the loop was different, I might want to factor it out something like this:

    the_block = {define a block of code, somehow}

    def do_stuff(block):
        with context:
            while condition:
                {execute the block of code}

    do_stuff(the_block)
    do_stuff(another_block)

but I think that requires new syntax, and Juancarlo specifically says he doesn't want new syntax.

-- Steven
Perhaps a concrete (even if toy or made-up) example might help me understand.
Not sure if this example fits Juancarlo's criterion:

Here is a place where I really craved blocks and resorted to using a context manager + decorators:

https://github.com/gitbot/gitbot/blob/master/gitbot/lib/s3.py#L140-L169

The use case is essentially: recursively loop through a folder and push to Amazon S3, evaluating rules for each file / folder. Here is the implementation:

https://github.com/lakshmivyas/fswrap/blob/master/fswrap.py#L317-L439

Just removing the need for the decorators would make this pattern completely acceptable *for me*.

Thanks,
Lakshmi

Steven D'Aprano wrote:
On Mon, May 13, 2013 at 03:17:15PM +1000, Nick Coghlan wrote:
On Mon, May 13, 2013 at 2:47 PM, Steven D'Aprano<steve@pearwood.info> wrote:
On 13/05/13 13:58, Juancarlo Añez wrote:
I don't want new syntax (I think I don't).
What I want is to be able to invoke a block of code repeatedly, within a context, and in a pythonic way. Surely that would be:
    with context():
        while condition:  # or a for loop
            block of code goes here
If you want something different to this, then I think you do want new syntax. Otherwise, what do you gain beyond what can already be done now?
Or am I missing something? Ruby uses anonymous callbacks for things where Python instead uses dedicated syntax:
Python -> Ruby
    decorated function definitions                -> callbacks
    for loops + iterator protocol                 -> callbacks
    with statements + context management protocol -> callbacks
    callbacks                                     -> callbacks (but with much nicer syntax)
Blocks are a *really* nice way of doing callbacks, so nice that Ruby just doesn't have some of the concepts Python does - it uses callbacks instead.
I'm obviously still missing something, because I'm aware of Ruby's blocks, but I don't quite see how they apply to Juancarlo's *specific* use-case, as described above.
Unless Juancarlo's use-case is more general than I understood, it seems to me that we don't need blocks, anonymous or otherwise, to "invoke a block of code repeatedly, within a context", in a Pythonic way.
Perhaps a concrete (even if toy or made-up) example might help me understand. The only thing I can think of is, if I had a bunch of similar loops inside the same context, where only the body of the loop was different, I might want to factor it out something like this:
the_block = {define a block of code, somehow}
    def do_stuff(block):
        with context:
            while condition:
                {execute the block of code}
    do_stuff(the_block)
    do_stuff(another_block)
but I think that requires new syntax, and Juancarlo specifically says he doesn't want new syntax.
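[Editor's note: the context manager + decorator pattern from the linked code can be sketched in miniature. `Walker` and its methods here are illustrative stand-ins, not the fswrap API: the context manager collects callbacks via decorators and drives the loop itself in __exit__.]

```python
class Walker:
    def __init__(self, items):
        self.items = items
        self.visitor = None

    def item_visitor(self, f):
        # Decorator: register the callback to run once per item.
        self.visitor = f
        return f

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        # The iteration happens here, after the with-body has had its
        # chance to register callbacks.
        if exc_type is None and self.visitor is not None:
            for item in self.items:
                self.visitor(item)
        return False  # never swallow exceptions

seen = []
with Walker([1, 2, 3]) as walker:
    @walker.item_visitor
    def visit(item):
        seen.append(item * 10)

print(seen)  # → [10, 20, 30]
```

Note that the with-body runs exactly once, as the protocol requires; the repetition is deferred to __exit__, which is what makes this pattern legal where a body-re-running context manager is not.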
On Mon, May 13, 2013 at 4:52 AM, Lakshmi Vyas <lakshmi.vyas@gmail.com> wrote:
Here is a place where I really craved for blocks and resorted to using a context manager + decorators:
https://github.com/gitbot/gitbot/blob/master/gitbot/lib/s3.py#L140-L169
That is a VERY interesting pattern:

    with source.walker as walker:
        def ignore(name):
            return match_pattern(ignore_patterns, name)

        @walker.folder_visitor
        def visit_folder(folder):
            ...

Make the context be the source of the decorators, and do the iteration on __exit__. This could work for me, but you must admit it is very much twisting context managers' arms to the extreme.

    with self.closure() as c:
        @c
        def _():
            match_this()
            match_that()

I'd like the above to be something like (warning: new keyword ahead):

    within self.closure():
        match_this()
        match_that()

A clean, anonymous block.

-- Juancarlo *Añez*
On Mon, May 13, 2013 at 12:17 AM, Steven D'Aprano <steve@pearwood.info> wrote:
    with context():
        while condition:  # or a for loop
            block of code goes here
If you want something different to this, then I think you do want new syntax. Otherwise, what do you gain beyond what can already be done now?
Or am I missing something?
It's not obvious from my example, but the idea is that the invoker be able to provide context for _each_ iteration. Cheers, -- Juancarlo *Añez*
On Mon, May 13, 2013 at 10:39 AM, Juancarlo Añez <apalala@gmail.com> wrote:
If you want something different to this, then I think you do want new
syntax. Otherwise, what do you gain beyond what can already be done now?
Or am I missing something?
It's not obvious from my example, but the idea is that the invoker be able to provide context for _each_ iteration.
I can explain better.

While parsing a closure, the parser knows it should stop iterating because the embedded expression fails to parse midway. At that point, the last iteration must be rolled back (rewind the input stream and discard syntax tree nodes). To roll back just the last iteration, the iterator needs to control the context for each iteration, because it can't predict which will be the last, the one to fail.

The above works perfectly, defining a function for the embedded expression using a synthetic name, like "c123", and passing it to closure():

    def c123():
        match_this()
        match_that()
    closure(c123)

My quest is because the above seems quite unpythonic. Lakshmi suggested this pattern, which I think is an improvement:

    with closure() as c:
        @c.exp
        def expre():
            match_this()
            match_that()

What I think would be great is to have the action (closure) precede an anonymous block.

    closure(
        def():
            match_this()
            match_that()
    )

Cheers,

-- Juancarlo *Añez*
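[Editor's note: a toy version of the per-iteration rollback described above, assuming a parser that tracks an input position and collected nodes. Names are illustrative, not Grako's.]

```python
class FailedParse(Exception):
    """Raised when the embedded expression fails to match."""

class Parser:
    def __init__(self, text):
        self.text = text
        self.pos = 0
        self.nodes = []

    def match(self, s):
        # Consume the literal s at the current position, or fail.
        if self.text.startswith(s, self.pos):
            self.pos += len(s)
            self.nodes.append(s)
        else:
            raise FailedParse(s)

    def closure(self, block):
        # Run the block repeatedly. Each iteration gets its own rollback
        # point, so the failing (last) iteration leaves no trace in the
        # input position or the collected nodes.
        while True:
            pos, n_nodes = self.pos, len(self.nodes)
            try:
                block()
            except FailedParse:
                self.pos = pos
                del self.nodes[n_nodes:]
                return

p = Parser('ababX')

def expre():
    p.match('a')
    p.match('b')

p.closure(expre)
print(p.nodes, p.pos)  # → ['a', 'b', 'a', 'b'] 4
```

The third iteration fails on 'X'; the rollback restores `pos` and truncates `nodes` to the snapshot, which is exactly the "control the context for each iteration" behavior the driver needs and a with-statement cannot provide.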
On 5/13/2013 11:40 AM, Juancarlo Añez wrote:
The above works perfectly defining a function for the embedded expression using a synthetic name, like "c123", and passing it to closure():
    def c123():
        match_this()
        match_that()
closure(c123)
My quest is because the above seems quite unpythonic.
I disagree that the above is unpythonic. In Python, functions are objects like everything else. Define them, pass them to functions, like anything else. 'Unpythonic' is treating functions as special, other than that they are called (have a call method). Terry
On Mon, May 13, 2013 at 11:58 AM, Terry Jan Reedy <tjreedy@udel.edu> wrote:
I disagree that the above is unpythonic. In Python, functions are objects like everything else. Define them, pass them to functions, like anything else. 'Unpythonic' is treating functions as special, other than that they are called (have a call method).
I beg to disagree. Functions are objects in Python, but they get particular treatment.

You can do:

    def f(): pass
    x = f

But you can't do:

    x = def(): pass

Cheers,

-- Juancarlo *Añez*
On 5/13/2013 1:23 PM, Juancarlo Añez wrote:
On Mon, May 13, 2013 at 11:58 AM, Terry Jan Reedy <tjreedy@udel.edu> wrote:
I disagree that the above is unpythonic. In Python, functions are objects like everything else. Define them, pass them to functions, like anything else. 'Unpythonic' is treating functions as special, other than that they are called (have a call method).
I beg to disagree. Functions are objects in python, but they get particular treatment.
So do classes, modules, ...,
You can do:
    def f(): pass
    x = f
But you can't do:
x = def(): pass
So what? Really. There is nothing particular about that. Neither can you do

    x = class (): pass
    x = with open(..):
    x = import itertools
    x = if True: y = 3
    x = <pick any statement other than expression/binding statements>

The fact that Python is a mixed syntax language with both expressions and statements is fundamental to its design.

-- Terry Jan Reedy
On Mon, May 13, 2013 at 1:23 PM, Terry Jan Reedy <tjreedy@udel.edu> wrote:
    x = class (): pass
    x = with open(..):
    x = import itertools
    x = if True: y = 3
    x = <pick any statement other than expression/binding statements>
I can explain. Expressions like "", [], (), {} are called "constructors" in programming language theory, because they "construct" objects and give them an initial state. In Python, functions, methods, and classes are objects too, and their constructors are "class" and "def". But there's an asymmetry in the language in that those two constructors don't produce an assignable value.

if, while, with, etc. are statements, not constructors. Other languages make those return values too, but that's not the Python way of things.

Cheers,

-- Juancarlo *Añez*
On Mon, May 13, 2013 at 11:33 AM, Juancarlo Añez <apalala@gmail.com> wrote:
In Python, functions, methods, and classes are objects too, and their constructors are "class" and "def". But there's an asymmetry in the language in that those two constructors don't produce an assignable value.
Not all functions produce a useful value.

    [].append(1)
    [].sort()
    print(1)

Yes, these do return a value since functions always return a value, but they don't return a useful value. Likewise, class and def have side effects, so if they were functions they would probably return None, and you would have the same issue that you couldn't usefully use them inside another statement, just like this:

    x = print(y)

Def is not a constructor. It is an assignment statement.

    def f(x): return x+1
    f = lambda x: x+1

are equivalent. Python does not allow either one of these assignment statements to be embedded in another statement. It does allow lambda functions to be embedded in other statements. The issue here is essentially that the def syntax allows more complex function syntax than lambda. And the complaint is that you have to declare a function "out of order" and choose a name for it. This has the same problem:

    # can't do this
    case a.b[i]:
        when 0: pass
        when 1: pass
        when 2 ... 4: pass
        else: pass

    # can't do this either
    if (_value = a.b[i]) == 0: pass
    elif _value == 1: pass
    elif _value in 2 ... 4: pass
    else: pass

    # have to do this
    _value = a.b[i]
    if _value == 0: pass
    elif _value == 1: pass
    elif _value >= 2 and _value <= 4: pass
    else: pass

This is a much more common scenario than wanting anonymous blocks. I'm not proposing to change this. I'm just pointing out that if you're complaining about not being able to assign a value inside a statement, there are more common cases to look at.

--- Bruce
Latest blog post: Alice's Puzzle Page http://www.vroospeak.com
Learn how hackers think: http://j.mp/gruyere-security
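[Editor's note: the "can't do this either" variant became possible later. Python 3.8 added assignment expressions (the walrus operator), which cover exactly the if/elif half of Bruce's example. The data here is made up for illustration.]

```python
# Python 3.8+: bind the value inside the test itself.
a = {'b': [0, 1, 5]}
i = 2

if (value := a['b'][i]) == 0:
    kind = 'zero'
elif value == 1:
    kind = 'one'
elif 2 <= value <= 4:
    kind = 'small'
else:
    kind = 'big'

print(kind)  # → big
```

The case/when half of the example similarly arrived later as structural pattern matching (match/case, Python 3.10).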
On Mon, May 13, 2013 at 2:40 PM, Bruce Leban <bruce@leapyear.org> wrote:
This is a much more common scenario than wanting anonymous blocks. I'm not proposing to change this. I'm just pointing out that if you're complaining about not being able to assign a value inside a statement, there are more common cases to look at.
I'm not complaining. I'm just pointing out that there may be more readable ways to express things in Python than out-of-order. I know that making def and class be expressions would cause enormous problems. Because context managers are limited to their intended purpose, I proposed new syntax that would provide for anonymous blocks in certain places.

Instead of having to do:

    def synthetic1():
        self.match('a')

    def synthetic2():
        self.match('b')

    self.closure(synthetic1)
    self.closure(synthetic2)

Which is as close to cryptic as can be, it could be:

    within self.closure():
        self.match('a')

    within self.closure():
        self.match('b')

Which is, at least, in the right order. Adding the new keyword would allow reusing the parsing for def:

    within self.closure():
        def():
            self.match('a')

    within self.closure():
        def():
            self.match('b')

So anonymous blocks can also take parameters, without the need to make def an assignable expression.

Cheers,

-- Juancarlo *Añez*
That many Python functions don't produce useful values makes it difficult or impossible to do "chained" or "fluent" operations a la JavaScript or Perl. I love that Python avoids "expression soup" and long run-on invocations, but the lack of appreciable support for "fluency" or even a smidgeon of functional style seems to regularly "verticalize" Python code, with several lines required to do common things, such as fully construct / initialize / set up an object.

Apologies if this seems tangential. To me it seems related to some of the examples in this thread where Python pushes configuration statements/calls oddly after the related code block.

On Monday, May 13, 2013 3:10:14 PM UTC-4, Bruce Leban wrote:
Not all functions produce a useful value.
    [].append(1)
    [].sort()
    print(1)
On May 13, 2013, at 13:19, Jonathan Eunice <jonathan.eunice@gmail.com> wrote:
That many Python functions don't produce useful values makes it difficult or impossible to do "chained" or "fluent" operations a la JavaScript or Perl.
Fluency and a clean expression/statement divide are almost directly contrary goals. Similarly, reducing "vertical" code and making all structure explicit are almost directly contrary goals. So:
I love that Python avoids "expression soup" and long run-on invocations, but the lack of appreciable support for "fluency" or even a smidgeon of functional style seems to regularly "verticalize" Python code
If you made Python fluent, you would allow, and maybe even encourage, JS-style "expression soup". It's a tradeoff, and I think Python made the right choice here. I've got a lot of logic that I migrate back and forth between Python and JS, and it's definitely true that a 3-line JS function often becomes a 5-line Python function and vice versa. But the Python function is nevertheless usually faster to read, so I don't think this is a problem.
, with several lines required do common things, such as fully construct / initialize / setup an object.
Often the answer to that is that the right API for a Python class isn't the same as the right API for a similar JS prototype. For example, because pythonic style makes much more use of exceptions than typical JS style, you don't need multistage initialization nearly as often.
In the real world, I'm not sure that there ever is a truly clean statement / expression divide.

    server = DataServe(...)
    server.start()

vies with

    server = DataServe(...).start()

or

    server = DataServe(..., start=True)

And while there are a lot of places where Python tries to make something a statement rather than an expression (e.g. print, import, raise, or assert), one doesn't have to go very far to find variations on these that are functional. py3 itself changed its game on print.

I myself do not find modest uses of fluency any less clear or explicit, and I believe it can improve the clarity of some logically-combined activities. But I'd certainly agree that the aggressive functional chaining you can find in highly functional languages, or in JS, which I'll caricature as:

    d.find(...).filter(...).more(...).last().render()

can be pretty off-putting and opaque. Throw a few functional for-loops in there, as JS often does, and it's downright ugly.

I don't want to take the blocks and closures discussion off-track; I'm not advocating hyper-fluency; and I freely admit that a language's omissions and constraints can be easily as important as the features it provides. I'm all about optimizing the macro result, so if having a few more vertical lines is the constraint that makes the total code more comprehensible, c'est la vie. But it does seem, to me at least, that there's a connection with some of the issues people bang into trying to specify block context and the fact that while Python constructors return an object that can be directly used, very few of the update methods do.

On Monday, May 13, 2013 5:07:03 PM UTC-4, Andrew Barnert wrote:
On May 13, 2013, at 13:19, Jonathan Eunice <jonatha...@gmail.com> wrote:
That many Python functions don't produce useful values makes it difficult or impossible to do "chained" or "fluent" operations a la JavaScript or Perl.
Fluency and a clean expression/statement divide are almost directly contrary goals.
Similarly, reducing "vertical" code and making all structure explicit are almost directly contrary goals.
So:
I love that Python avoids "expression soup" and long run-on invocations, but the lack of appreciable support for "fluency" or even a smidgeon of functional style seems to regularly "verticalize" Python code
If you made Python fluent, you would allow, and maybe even encourage, JS-style "expression soup". It's a tradeoff, and I think Python made the right choice here.
I've got a lot of logic that I migrate back and forth between Python and JS, and it's definitely true that a 3-line JS function often becomes a 5-line Python function and vice versa. But the Python function is nevertheless usually faster to read, so I don't think this is a problem.
, with several lines required to do common things, such as fully construct / initialize / set up an object.
Often the answer to that is that the right API for a Python class isn't the same as the right API for a similar JS prototype. For example, because pythonic style makes much more use of exceptions than typical JS style, you don't need multistage initialization nearly as often.
_______________________________________________
Python-ideas mailing list
Python...@python.org
http://mail.python.org/mailman/listinfo/python-ideas
On 14/05/13 03:23, Juancarlo Añez wrote:
On Mon, May 13, 2013 at 11:58 AM, Terry Jan Reedy <tjreedy@udel.edu> wrote:
I disagree that the above is unpythonic. In Python, functions are objects like everything else. Define them, pass them to functions, like anything else. 'Unpythonic' is treating functions as special, other than that they are called (have a call method).
I beg to disagree. Functions are objects in python, but they get particular treatment.
You can do:
def f(): pass
x = f
But you can't do:
x = def(): pass
That has nothing to do with *function objects*, and everything to do with the *keyword* def being a statement. If you avoid "def", you can do this:

x = lambda: None

or even this:

from types import FunctionType
x = FunctionType(code, globals, name, argdefs, closure)

Creating functions in this way is not exactly convenient, but it is possible.

There is a sense in which functions (and classes, and modules) are "different", not because Python chooses to treat them differently, but because they are inherently different. They are complex, almost free-form compound objects, and there is no "nice" syntax for creating them inside an expression. The closest Python comes to this is lambda for functions, and that is limited to a single expression.

But regardless of the difficulty of creating a function object, once you have one, it is a first class object. Anything you can do with any other object, you can do with a function object. I'm with Terry on this: the code snippet you gave, where a function object is passed to another function, is a standard Python idiom and perfectly pythonic.

It's not uncommon to have to create data before you use it, even when you could create it in-place where you use it. E.g. we might choose to write:

data = [lots of items here]
x = some_value()
result = func(x, data)

instead of:

result = func(some_value(), [lots of items here])

to make the function call more readable, or to keep to some maximum line length, or in order to re-use some of the values. So even when we *can* embed values in a call, often we choose not to. The fact that you don't have a choice when it comes to functions is not a major problem. Even if you could write:

result = func(def (): lots of code goes here)

you probably shouldn't.

-- Steven
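To make the types.FunctionType remark above concrete, here is a small runnable sketch. The names `template` and `add_one` are invented for illustration; the trick is to borrow a code object from a throwaway lambda and wrap it in a new, independently named function object, with no "def" statement for the new name:

```python
from types import FunctionType

# Borrow a code object from a throwaway lambda, then build a brand-new
# function object around it.  FunctionType(code, globals, name) is
# enough here because the lambda has no defaults and no closure.
template = lambda x: x + 1
add_one = FunctionType(template.__code__, globals(), "add_one")

print(add_one(41))        # the new object is an ordinary, callable function
print(add_one.__name__)   # and it carries the name we gave it
```

As the email says, this is possible but not exactly convenient, which is the point: the function is a first-class object either way.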
I do not think expression soup is particularly bad; it's just Javascript's implementation (as usual) that is pretty borked. In Scala, expression chaining lets you do some pretty nice things:

memory.take(freePointer)
      .grouped(10)
      .map(_.fold("")(_+"\t"+_))
      .reduce(_+"\n"+_)

This code converts a flat array of ints into pretty 10-ints-per-row heap dumps, and is probably nicer than most things you would write using for loops and variables.

In general, method chaining is the same as having an implicit "this" object that you are operating on without needing to specify it (since it gets returned by each method). Apart from saving syntax (fewer tokens to write), this also reduces the number of possible ways you can do things. I mean, if you write this:

memory.take(freePointer)
memory.grouped(10)
memory.map(_.map(_+""))
memory.map(_.reduce(_+"\t"+_))
memory.reduce(_+"\n"+_)

It's about the same; but how many times have you seen code like this:

memory.take(freePointer) // some comment
memory.grouped(10)
unrelatedthing.dostuff()
memory.map(_.map(_+""))
unrelatedthing.domorestuff()
/**
 * SOME BIG COMMENT
 * i am cow
 * hear me moo
 * i weigh twice as much as you
 * and i look good on the barbecue
 */
do_stuff_with_cows()
memory.map(_.reduce(_+"\t"+_))
memory.reduce(_+"\n"+_)

Which makes it a huge pain to figure out what is going on with memory? Method chaining *prevents* this sort of thing from happening in the first place, which is really nice. Even if I try to avoid this, I haven't seen any code base where this hasn't happened in various places, causing endless headaches.

On Mon, May 13, 2013 at 9:36 PM, Steven D'Aprano <steve@pearwood.info> wrote:
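For comparison, here is a rough, non-chained Python rendering of that Scala pipeline. The values of `memory` and `free_pointer` are invented sample data, not from the original post; the steps (take the live portion, group into rows of 10, join with tabs and newlines) are the same:

```python
# Take the live portion of the "heap", group it into rows of 10, and join
# each row with tabs and the rows with newlines -- the same steps as the
# Scala chain, written as slicing plus comprehensions.
memory = list(range(25))
free_pointer = 23

live = memory[:free_pointer]
rows = [live[i:i + 10] for i in range(0, len(live), 10)]
dump = "\n".join("\t".join(str(n) for n in row) for row in rows)
print(dump)
```

Written this way, Python needs named intermediates where Scala threads an implicit receiver, which is exactly the trade-off under discussion.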
On 14/05/13 03:23, Juancarlo Añez wrote:
On Mon, May 13, 2013 at 11:58 AM, Terry Jan Reedy <tjreedy@udel.edu> wrote:
I disagree that the above is unpythonic. In Python, functions are objects
like everything else. Define them, pass them to functions, like anything else. 'Unpythonic' is treating functions as special, other than that they are called (have a call method).
I beg to disagree. Functions are objects in python, but they get particular treatment.
You can do:
def f(): pass
x = f
But you can't do:
x = def(): pass
That has nothing to do with *function objects*, and everything to do with the *keyword* def being a statement. If you avoid "def", you can do this:
x = lambda: None
or even this:
from types import FunctionType
x = FunctionType(code, globals, name, argdefs, closure)
Creating functions in this way is not exactly convenient, but it is possible.
There is a sense in which functions (and classes, and modules) are "different", not because Python chooses to treat them differently, but because they are inherently different. They are complex, almost free-form compound objects, and there is no "nice" syntax for creating them inside an expression. The closest Python comes to is lambda for functions, and that is limited to a single expression.
But regardless of the difficulty of creating a function object, once you have one, it is a first class object. Anything you can do with any other object, you can do with a function object. I'm with Terry on this: the code snippet you gave, where a function object is passed to another function, is a standard Python idiom and perfectly pythonic.
It's not uncommon to have to create data before you use it, even when you could create it in-place where you use it. E.g. we might choose to write:
data = [lots of items here]
x = some_value()
result = func(x, data)
instead of:
result = func(some_value(), [lots of items here])
to make the function call more readable, or to keep to some maximum line length, or in order to re-use some of the values. So even when we *can* embed values in a call, often we choose not to. The fact that you don't have a choice when it comes to functions is not a major problem. Even if you could write:
result = func(def (): lots of code goes here)
you probably shouldn't.
-- Steven
Haoyi Li writes:
Method chaining *prevents* [intermingling unrelated code] from happening in the first place, which is really nice. Even if I try to avoid this, I haven't seen any code base where this hasn't happened in various places, causing endless headaches.
There's nothing intrinsically wrong with your preference, but as far as I can see it's un-Pythonic. Specifically, as you mention, method chaining is equivalent to carrying along an implicit "this" argument. But in Pythonic code, explicit is better than implicit. For that reason, methods with side effects generally have a useless return value (case in point: .sort()). In other words, making method chaining difficult is a deliberate design decision. You can disagree with that decision, but you *do* have a choice of languages, so Python doesn't *need* to accommodate your preference.

Also (and this is more or less specious, but I don't have enough brain cells today to decide whether it's less or more), many people have difficulty with the functional tools approach you present in lieu of iteration statements. Python does provide those tools, but they're considered power tools to be used only when the expense in maintainability by mere mortals is considered carefully, and decided to be the lesser cost.
I'm not arguing against "python is just so", I'm arguing against "function chaining is inherently ugly". If people say "we shouldn't do this in python, just because" or "we shouldn't do this in python because it goes against the convention", I'm fine with that. If people say "we shouldn't do this in python, because method chaining is inherently ugly", I will disagree.

No need to chase me away from the python community and ask me to use a different language =D

On Mon, May 13, 2013 at 10:54 PM, Stephen J. Turnbull <stephen@xemacs.org> wrote:
Haoyi Li writes:
Method chaining *prevents* [intermingling unrelated code] from happening in the first place, which is really nice. Even if I try to avoid this, I haven't seen any code base where this hasn't happened in various places, causing endless headaches.
There's nothing intrinsically wrong with your preference, but as far as I can see it's un-Pythonic. Specifically, as you mention, method chaining is equivalent to carrying along an implicit "this" argument. But in Pythonic code, explicit is better than implicit. For that reason, methods with side effects generally have a useless return value (case in point: .sort()). In other words, making method chaining difficult is a deliberate design decision. You can disagree with that decision, but you *do* have a choice of languages, so Python doesn't *need* to accommodate your preference.
Also (and this is more or less specious, but I don't have enough brain cells today to decide whether it's less or more), many people have difficulty with the functional tools approach you present in lieu of iteration statements. Python does provide those tools, but they're considered power tools to be used only when the expense in maintainability by mere mortals is considered carefully, and decided to be the lesser cost.
On 13 May 2013 22:55, Haoyi Li <haoyi.sg@gmail.com> wrote:
I do not think expression soup is particularly bad, it's just Javascript's implementation (as usual) that is pretty borked. In Scala, expression chaining lets you do some pretty nice things:
memory.take(freePointer)
.grouped(10)
.map(_.fold("")(_+"\t"+_))
.reduce(_+"\n"+_)
I hope you know that if you enjoy this style, Python _is_ for you, and I consider it part of the "multiparadigm" language. You just have to design your methods to always return "self" - or make a class decorator to do so. And in case you are using other people's classes, an adaptor for methods that would return "None" is easy to achieve. I made this example a couple months ago to get it working:

class Chain:
    def __init__(self, obj, root=None):
        self.__obj = obj

    def __getattr__(self, attr):
        val = getattr(self.__obj, attr)
        if callable(val):
            self.__callable = val
            return self
        return val

    def __call__(self, *args, **kw):
        val = self.__callable(*args, **kw)
        if val is None:
            return self
        return val

Which allows, for example:
>>> a = []
>>> Chain(a).append(5).append(6).append(-1).sort().append(3)
<__main__.Chain object at 0x12b6f50>
>>> a
[-1, 5, 6, 3]
And would work in your example as well, should you have a class with the desired methods. js -><-
On 5/14/2013 8:45 AM, Joao S. O. Bueno wrote:
On 13 May 2013 22:55, Haoyi Li <haoyi.sg@gmail.com> wrote:
I do not think expression soup is particularly bad, it's just Javascript's implementation (as usual) that is pretty borked. In Scala, expression chaining lets you do some pretty nice things:
memory.take(freePointer)
.grouped(10)
.map(_.fold("")(_+"\t"+_))
.reduce(_+"\n"+_)
I hope you know that if you enjoy this style, Python _is_ for you, and I consider it part of the "multiparadigm" language.
You just have to design your methods to always return "self" - or make a class decorator to do so.
And in case you are using other people's classes, an adaptor for methods that would return "None" is easy to achieve.
I made this example a couple months ago to get it working:
class Chain:
    def __init__(self, obj, root=None):
        self.__obj = obj

    def __getattr__(self, attr):
        val = getattr(self.__obj, attr)
        if callable(val):
            self.__callable = val
            return self
        return val

    def __call__(self, *args, **kw):
        val = self.__callable(*args, **kw)
        if val is None:
            return self
        return val
None val should not always be converted to self. Consider [None].pop(), where None is the real return, not a placeholder for no return.
Which allows, for example:
>>> a = []
>>> Chain(a).append(5).append(6).append(-1).sort().append(3)
Which either raises or does the wrong thing for list.pop
<__main__.Chain object at 0x12b6f50>
>>> a
[-1, 5, 6, 3]
And would work in your example as well, should you have a class with the desired methods.
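Terry's list.pop() objection can be demonstrated directly. The Chain class below is reproduced from Joao's sketch; the demo shows that popping a genuine None off a list yields the wrapper object back instead of the real return value:

```python
class Chain:
    def __init__(self, obj, root=None):
        self.__obj = obj

    def __getattr__(self, attr):
        val = getattr(self.__obj, attr)
        if callable(val):
            self.__callable = val
            return self
        return val

    def __call__(self, *args, **kw):
        val = self.__callable(*args, **kw)
        if val is None:
            return self  # assumes None always means "mutator, no result"
        return val

# Plain list.pop() really does return the stored None...
print([None].pop())          # prints: None
# ...but through Chain, that None is swallowed and the wrapper comes back.
print(Chain([None]).pop())   # prints a Chain instance, not None
```

The wrapper cannot tell "returned nothing" apart from "returned None", which is exactly the ambiguity being discussed.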
On May 14, 2013, at 8:22, Terry Jan Reedy <tjreedy@udel.edu> wrote:
On 5/14/2013 8:45 AM, Joao S. O. Bueno wrote:
On 13 May 2013 22:55, Haoyi Li <haoyi.sg@gmail.com> wrote:
I do not think expression soup is particularly bad, it's just Javascript's implementation (as usual) that is pretty borked. In Scala, expression chaining lets you do some pretty nice things:
memory.take(freePointer)
.grouped(10)
.map(_.fold("")(_+"\t"+_))
.reduce(_+"\n"+_)
I hope you know that if you enjoy this style, Python _is_ for you, and I consider it part of the "multiparadigm" language.
You just have to design your methods to always return "self" - or make a class decorator to do so.
And in case you are using other people's classes, an adaptor for methods that would return "None" is easy to achieve.
I made this example a couple months ago to get it working:
class Chain:
    def __init__(self, obj, root=None):
        self.__obj = obj

    def __getattr__(self, attr):
        val = getattr(self.__obj, attr)
        if callable(val):
            self.__callable = val
            return self
        return val

    def __call__(self, *args, **kw):
        val = self.__callable(*args, **kw)
        if val is None:
            return self
        return val
None val should not always be converted to self. Consider [None].pop(), where None is the real return, not a placeholder for no return.
I think this gets to a key issue. Chain is relying on the fact that all methods return something useful, or None. But None is something useful.

This is a limitation that a language like Haskell doesn't have. You could have a "list of A" type whose methods all return "maybe A", meaning they return "Just something useful" or Nothing. In that case, Nothing is not something useful (but Just Nothing is).

The benefit that comes along with this limitation is duck typing. We don't need a monad, type constructors, type deconstruction matching, and function-lifting functions because duck typing gets us 80% of the benefit for no effort. The downside is that we don't get that extra 20%. You don't have to build a lifting function when the need is implicit, but you _can't_ build a lifting function when you explicitly want it.

I think that's a good tradeoff, but it is still a tradeoff.

And Python's consistency is part of what makes it a good tradeoff. JavaScript has even more flexibility in this area, so in theory it should be even more powerful. But in practice, it's not. And that's because it doesn't have consistent rules that define the boundaries--instead of "mutators don't return", it's "some mutators return this, others return the argument, others don't return". So, where Python has problems with the other-20% side of its tradeoff, JavaScript has the same problems with the whole 100%, so it gets a lot less benefit out of the dynamic typing tradeoff.
Which allows, for example:
>>> a = []
>>> Chain(a).append(5).append(6).append(-1).sort().append(3)
Which either raises or does the wrong thing for list.pop
<__main__.Chain object at 0x12b6f50>
>>> a
[-1, 5, 6, 3]
And would work in your example as well, should you have a class with the desired methods.
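The Haskell comparison above can be sketched in Python too. The `Just`/`Nothing` names below are invented for this illustration, mimicking Haskell's Maybe; the point is that "no result" and "the result is None" stay distinguishable:

```python
class Just:
    """Wraps a real result -- even when that result is None."""
    def __init__(self, value):
        self.value = value

# Sentinel meaning "genuinely no result" (Haskell's Nothing).
Nothing = object()

def safe_pop(items):
    """Pop from a list, returning Just(value) or Nothing for an empty list."""
    return Just(items.pop()) if items else Nothing

assert safe_pop([]) is Nothing     # empty list: no result at all
result = safe_pop([None])
assert result is not Nothing       # a real result...
assert result.value is None        # ...whose value happens to be None
```

This is the machinery duck typing lets Python skip most of the time, at the cost of being unable to express the distinction when you actually need it.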
Andrew, I agree with 100% of what you said; you put it much more clearly than I could.

#soapbox
My point in stirring up all this is that we should not confuse "empirically bad" with "unpythonic". It is fine to reject something just because it's unpythonic and does not fit into the rest of python's ecosystem, and it's fine to reject something because it is empirically bad. However, it is a mistake to say something (e.g. method chaining) is empirically bad *because* it is unpythonic. I say this because I think it does happen, whether intentionally or subconsciously. It's easy to blindly chant mantras like "explicit is better than implicit", but that blinds us to the deep and insightful trade-offs in these decisions, one of which is, of course, consistency with the rest of the ecosystem ("pythonicity").
#end soapbox

Sorry for hijacking your thread! I think a better implementation for anonymous blocks (however it turns out) would be a wonderful thing; I've also abused decorators to do a lot of these things:

@sys.meta_path.append
@singleton
class ImportFinder(object):
    ...

I also think being able to pass multiple anonymous blocks into a function could greatly reduce unnecessary uses of inheritance; I'm sure everyone has encountered situations where you are inheriting from a class, not because you actually *want* to create objects, but simply because you want to override one or more of the methods that the class has, such that when you call imp.do_stuff(), it will use the overriding methods. In this way, inheritance is often used as a poor substitute for passing multiple blocks into some function; it's a poor substitute because it is far more powerful than necessary, adding tons of syntactic boilerplate and greatly increasing the number of things that can go wrong, when all you want is a pure function into which you can pass more than one block to customize its behavior.

On Tue, May 14, 2013 at 12:57 PM, Andrew Barnert <abarnert@yahoo.com> wrote:
On May 14, 2013, at 8:22, Terry Jan Reedy <tjreedy@udel.edu> wrote:
On 5/14/2013 8:45 AM, Joao S. O. Bueno wrote:
On 13 May 2013 22:55, Haoyi Li <haoyi.sg@gmail.com> wrote:
I do not think expression soup is particularly bad, it's just Javascript's implementation (as usual) that is pretty borked. In Scala, expression chaining lets you do some pretty nice things:
memory.take(freePointer)
.grouped(10)
.map(_.fold("")(_+"\t"+_))
.reduce(_+"\n"+_)
I hope you know that if you enjoy this style, Python _is_ for you, and I consider it part of the "multiparadigm" language.
You just have to design your methods to always return "self" - or make a class decorator to do so.
And in case you are using other people's classes, an adaptor for methods that would return "None" is easy to achieve.
I made this example a couple months ago to get it working:
class Chain:
    def __init__(self, obj, root=None):
        self.__obj = obj

    def __getattr__(self, attr):
        val = getattr(self.__obj, attr)
        if callable(val):
            self.__callable = val
            return self
        return val

    def __call__(self, *args, **kw):
        val = self.__callable(*args, **kw)
        if val is None:
            return self
        return val
None val should not always be converted to self. Consider [None].pop(), where None is the real return, not a placeholder for no return.
I think this gets to a key issue.
Chain is relying on the fact that all methods return something useful, or None. But None is something useful.
This is a limitation that a language like Haskell doesn't have. You could have a "list of A" type whose methods all return "maybe A", meaning they return "Just something useful" or Nothing. In that case, Nothing is not something useful (but Just Nothing is).
The benefit that comes along with this limitation is duck typing. We don't need a monad, type constructors, type deconstruction matching, and function-lifting functions because duck typing gets us 80% of the benefit for no effort. The downside is that we don't get that extra 20%. You don't have to build a lifting function when the need is implicit, but you _can't_ build a lifting function when you explicitly want it.
I think that's a good tradeoff, but it is still a tradeoff.
And Python's consistency is part of what makes it a good tradeoff. JavaScript has even more flexibility in this area, so in theory it should be even more powerful. But in practice, it's not. And that's because it doesn't have consistent rules that define the boundaries--instead of "mutators don't return", it's "some mutators return this, others return the argument, others don't return". So, where Python has problems with the other-20% side of its tradeoff, JavaScript has the same problems with the whole 100%, so it gets a lot less benefit out of the dynamic typing tradeoff.
Which allows, for example:
>>> a = []
>>> Chain(a).append(5).append(6).append(-1).sort().append(3)
Which either raises or does the wrong thing for list.pop
<__main__.Chain object at 0x12b6f50>
>>> a
[-1, 5, 6, 3]
And would work in your example as well, should you have a class with the desired methods.
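The "decorator abuse" pattern Haoyi mentions relies on a decorator line being evaluated purely for its side effect. A minimal self-contained sketch, with invented names `registry` and `handler` (the `@sys.meta_path.append` example works the same way):

```python
# Using a decorator purely for its side effect of registering the
# decorated object, rather than for wrapping it.
registry = []

@registry.append
def handler():
    return "handled"

# Caveat: list.append returns None, so the name "handler" is now bound
# to None -- the function object lives only inside the registry.
assert handler is None
assert registry[0]() == "handled"
```

The caveat in the comment is why this is "abuse": the decorated name usually ends up bound to something other than the function.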
On 5/14/2013 1:34 PM, Haoyi Li wrote:
Andrew, I agree with 100% of what you said; you put it much more clearly than I could.
Ditto.
#soapbox My point in stirring up all this is that we should not confuse "empirically bad" with "unpythonic". It is fine to reject something just because it's unpythonic, and does not fit into the rest of python's ecosystem, and it's fine to reject something because it is empirically bad.
However, it is a mistake to say something (e.g. method chaining) is empirically bad *because* it is unpythonic.
I would almost say that doing so is unpythonic ;-). Method chaining in itself *is* pythonic when done, for instance, with the subset of string methods that return a string. I am sure that one can find things like f.read().strip().lower() in the stdlib. This gets to Andrew's points.

As for your soapbox issue: People sometimes misuse Tim Peters' Zen of Python points. He wrote them to stimulate thought, not serve as a substitute for thought, and certainly not to be a pile of mudballs to be used to chase people away. I regard 'pythonic' versus 'unpythonic' much the same way. Misused, but with a proper use. I occasionally use it as a summary lead-in followed by an explanation ('I think this is unpythonic in that ...').

And by the way, as for your macro module: I like that Python can be and is used to do 'crazy' things, even if I think something is too crazy for the stdlib. (I also notice that standards change. Metaclasses were originally a crazy hack. If I remember correctly, they were formally supported as part of unifying types and classes. 10 years later, we are just adding the 1st stdlib module use thereof.)

-- Terry Jan Reedy
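Terry's point that some chaining is already idiomatic Python is easy to check with string methods, which return new strings rather than mutating in place (the sample text here is invented):

```python
# Each str method returns a fresh string, so the calls chain naturally --
# no wrapper or return-self convention needed.
line = "  Hello, WORLD  "
cleaned = line.strip().lower().replace("world", "python")
print(cleaned)   # prints: hello, python
print(line)      # the original string is unchanged
```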
On Tue, May 14, 2013 at 05:24:47PM -0400, Terry Jan Reedy wrote:
As for your soapbox issue: People sometimes misuse Tim Peters' Zen of Python points. He wrote them to stimulate thought, not serve as a substitute for thought, and certainly not to be a pile of mudballs to be used to chase people away.
+1000 -- Steven
Steven D'Aprano writes:
On Tue, May 14, 2013 at 05:24:47PM -0400, Terry Jan Reedy wrote:
As for your soapbox issue:
Which is, strictly speaking, off-topic, since he disclaims the intent to advocate a change to Python, let alone propose a concrete change.

His post on method chaining was thoughtful, but really, it's hardly fair to criticize anyone for confusing the a-Pythonicity of method chaining with some inherent flaw in that style. This list is for advocating changes to Python, and I don't see how my post could be construed in that context as claiming the method-chaining style is inherently flawed, vs. inappropriate for further support in Python.
People sometimes misuse Tim Peters' Zen of Python points. He wrote them to stimulate thought, not serve as a substitute for thought, and certainly not to be a pile of mudballs to be used to chase people away.
+1000
As the person who cited "Pythonicity" and several points of the Zen in this thread, I would appreciate instruction as to how I "misused" the terms or "substituted mudballs for thought", rather than simply being bashed for criticizing someone else's (admittedly thoughtful) post. And especially not being bashed at a multiplication factor of 1000.

@Haoyi: I'm sorry. I do apologize for the implication that anybody should "go away". I should have been more careful with pronouns. The "you" who likes method chaining was intended to be different from the "you" who "*do* have the choice of languages". But I could have, and should have, avoided writing "you" in the second case. You are clearly making useful contributions in code as well as in discussion, and I'd appreciate you staying around for a while. A long while.

The intended point, restated, was "If method chaining were not well-supported in other languages, it would be arguable that this might be a good innovation for Python even though it's not positively Pythonic. But it is well-supported elsewhere, so there is no *need* for it in Python." I can see how one might disagree with that, but I hope no one finds it offensive.
No offense taken =)

On Tue, May 14, 2013 at 11:00 PM, Stephen J. Turnbull <stephen@xemacs.org> wrote:
Steven D'Aprano writes:
On Tue, May 14, 2013 at 05:24:47PM -0400, Terry Jan Reedy wrote:
As for your soapbox issue:
Which is, strictly speaking, off-topic, since he disclaims the intent to advocate a change to Python, let alone propose a concrete change.
His post on method chaining was thoughtful, but really, it's hardly fair to criticize anyone for confusing a-Pythonicity of method chaining with some inherent flaw in that style. This list is for advocating changes to Python, and I don't see how my post could be construed in that context as claiming the method-chaining style is inherently flawed, vs. inappropriate for further support in Python.
People sometimes misuse Tim Peters' Zen of Python points. He wrote them to stimulate thought, not serve as a substitute for thought, and certainly not to be a pile of mudballs to be used to chase people away.
+1000
As the person who cited "Pythonicity" and several points of the Zen in this thread, I would appreciate instruction as to how I "misused" the terms or "substituted mudballs for thought", rather than simply being bashed for criticizing someone else's (admittedly thoughtful) post. And especially not being bashed at a multiplication factor of 1000.
@Haoyi: I'm sorry. I do apologize for the implication that anybody should "go away". I should have been more careful with pronouns. The "you" who likes method chaining was intended to be different from the "you" who "*do* have the choice of languages". But I could have, and should have, avoided writing "you" in the second case. You are clearly making useful contributions in code as well as in discussion, and I'd appreciate you staying around for a while. A long while.
The intended point, restated, was "If method chaining were not well-supported in other languages, it would be arguable that this might be a good innovation for Python even though it's not positively Pythonic. But it is well-supported elsewhere, so there is no *need* for it in Python." I can see how one might disagree with that, but I hope no one finds it offensive.
On Wed, May 15, 2013 at 12:00:19PM +0900, Stephen J. Turnbull wrote:
Steven D'Aprano writes:
On Tue, May 14, 2013 at 05:24:47PM -0400, Terry Jan Reedy wrote: [...]
People sometimes misuse Tim Peters' Zen of Python points. He wrote them to stimulate thought, not serve as a substitute for thought, and certainly not to be a pile of mudballs to be used to chase people away.
+1000
As the person who cited "Pythonicity" and several points of the Zen in this thread, I would appreciate instruction as to how I "misused" the terms or "substituted mudballs for thought", rather than simply being bashed for criticizing someone else's (admittedly thoughtful) post. And especially not being bashed at a multiplication factor of 1000.
I'm sorry, I did not intend my agreement to be read as a criticism of you. To be perfectly honest, I may not have even read your earlier emails. (These threads tend to be long, and my time is not unlimited.)

I was agreeing with Terry as a general point. I too see far too many people throwing out misapplied references to the Zen, or using it as a knee-jerk way to avoid thinking about a problem. (Especially "Only One Way", which isn't even in the Zen.) I'm not going to name names, because (1) I don't remember specific examples, and (2) even if I did, it wouldn't be productive to shame people for misapplying the Zen long after the fact. Hell, it's quite likely that I have been one of those people; I know that sometimes I react conservatively to some suggestions, perhaps *too* conservatively.

So I'm sorry that you read my agreement as a criticism of your comments; it was not intended that way. It was just me being enthusiastic to agree with Terry's reminder that we all should avoid using the Zen to avoid thought.

-- Steven
Stephen, the post by Haoyi that I responded to neither mentioned you nor quoted you. Nothing I wrote in response to that post was specifically aimed at you. 'Some people' meant some people, here, on python-list, occasionally pydev, blogs, stackoverflow, ... . Steven D'Aprano, who also frequents python-list, has seen many of the same posts that stimulated my comment, and obviously had a similar reaction.

If you are labelling my post spam, I disagree, but let's drop it.

Terry
I would be lying if I said I wasn't personally turned off a bit by Stephen Turnbull's post, and I think most of us were, so let's not beat around the bush saying "but *technically*... we didn't mention his *name*..." -- the context was clear enough. It was more-or-less in reference to what he said even if we didn't say so explicitly. I think it is clear to all of us that it was his post that sparked off my #soapboxing, and we can talk about the reasons why it can be interpreted as chasing people away (I don't think that's up for debate), whether he meant it that way or not. It's probably to do with the "python doesn't need you, go use another language" motif, but I won't do any further literary analysis on the sentences.

But he's apologized, so any personal offense that was taken has now been discarded =)

On Wed, May 15, 2013 at 4:36 AM, Terry Jan Reedy <tjreedy@udel.edu> wrote:
Stephen, the post by Haoyi that I responded to neither mentioned you nor quoted you. Nothing I wrote in response to that post was specifically aimed at you. 'Some people' meant some people, here, on python-list, occasionally pydev, blogs, stackoverflow, ... . Steven D'Aprano, who also frequents python-list, has seen many of the same posts that stimulated my comment, and obviously had a similar reaction.
If you are labelling my post spam, I disagree, but let's drop it.
Terry
_______________________________________________
Python-ideas mailing list
Python-ideas@python.org
http://mail.python.org/mailman/listinfo/python-ideas
On May 13, 2013, at 18:55, Haoyi Li <haoyi.sg@gmail.com> wrote:
I do not think expression soup is particularly bad, it's just Javascript's implementation (as usual) that is pretty borked. In Scala, expression chaining lets you do some pretty nice things:
memory.take(freePointer)
.grouped(10)
.map(_.fold("")(_+"\t"+_))
.reduce(_+"\n"+_)
In Python, all of these are non-mutating functions that return a new value. And you _can_ chain them together, exactly the same way you would in Lisp, ML, Haskell, or even JavaScript.

What JavaScript and other "fluent" languages add is a way to chain together _mutating_ methods. This hides a fundamental distinction between map and append, or sorted and sort. And that part is the problem. The mutability distinction is closely tied to the expression/assignment distinction. Notice that modern fluent languages also try to make _everything_ an expression, even for loops. And traditional (lispy) functional languages that don't have statements build the equivalent of the expression/statement distinction on top of mutability (e.g., set special forms).

You could point out that Scala is more readable than the Python equivalent, something like this:

    reduce(lambda ..., map(reduce(... grouped(10, ...

And yes, that's a mess. But that's a separate issue. It's not because map doesn't return anything, it's because map isn't a method of list.

If your point about extra code creeping into the middle is that chaining methods, as opposed to chaining function calls, discourages unnecessarily turning expressions into statements... That's an interesting point. But it's a separate argument from fluency, None-returning mutators, and statements that aren't expressions.
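[Editorial sketch, not code from the thread: the Scala pipeline above can be written in Python both ways Andrew describes. The inside-out free-function version is indeed a mess, while the idiomatic spelling with str.join reads fine; `memory`, `free_pointer`, and `grouped` are stand-ins for the Scala example's names.]

```python
from functools import reduce

# Stand-ins for the Scala example's data
memory = list(range(25))
free_pointer = 20

def grouped(seq, n):
    """Split seq into chunks of length n (a stand-in for Scala's .grouped)."""
    return [seq[i:i + n] for i in range(0, len(seq), n)]

# Free-function style: the pipeline reads inside-out, as Andrew notes
dump = reduce(
    lambda a, b: a + "\n" + b,
    map(lambda row: reduce(lambda a, b: str(a) + "\t" + str(b), row),
        grouped(memory[:free_pointer], 10)),
)

# The idiomatic Python spelling of the same pipeline: not method chaining,
# but str.join over a generator expression
dump2 = "\n".join("\t".join(map(str, row))
                  for row in grouped(memory[:free_pointer], 10))
```

Both produce the same tab-separated, newline-joined dump, which supports the point that the ugliness comes from map not being a method of list, not from anything about mutation.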
Juancarlo Añez wrote:
What I want is to be able to invoke a block of code repeatedly, within a context, and in a pythonic way.
within closure():
    do_this()
    and_do_that()
Whenever this kind of thing has been considered before, one of the stumbling blocks has always been how to handle things like this:

    while something:
        within closure():
            do_this()
            if moon_is_blue:
                break
        and_do_that()

This is one of the main reasons that the with-statement ended up being implemented the way it is, instead of by passing an implicit closure. -- Greg
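[Editorial note: Greg's stumbling block can be demonstrated in current Python. Once the suite is hoisted into a function body, as any implicit-closure translation would do, the `break` no longer refers to the enclosing loop and becomes a compile-time error. This is a minimal sketch, not code from the thread:]

```python
# What an implicit-closure translation of the example would compile to.
# The `break` now sits inside a function body, so from the compiler's
# point of view it is no longer inside any loop.
source = """
while something:
    def block():
        do_this()
        if moon_is_blue:
            break
    closure(block)
    and_do_that()
"""

message = ""
try:
    compile(source, "<example>", "exec")
    failed = False
except SyntaxError as exc:
    failed = True
    message = exc.msg
```

Compiling this raises SyntaxError ("'break' outside loop" in CPython), which is exactly why the with-statement protocol keeps the suite in the caller's frame.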
On 13 May 2013, at 03:53, "Stephen J. Turnbull" <stephen@xemacs.org> wrote:
Martin Morrison writes:
That is instead of a decorator-like syntax, make the "in" keyword reusable to introduce a new block, whose "argument" is a statement that can forward reference some names, which are then defined within the block. This allows multiple temporary names to be defined (each in a separate statement within the block).
This idea and its presumed defects are described (using the "given" syntax of PEP 3150) in the section "Using a nested suite" in PEP 403.
[Thanks for pointing this out - I seem to have missed the crux of that in my first reading]

The general argument against seems to be that it makes implementation difficult, and PEP 3150 has (what is to me) very awkward syntax involving leading dots, etc. to try to help the implementation. This feels wrong to me (sacrificing functionality/clarity for ease of implementation - at least during the first pass).

Also, would it not be possible to implement it something like this:

    in <stmt>:
        <suite>

becomes:

    _locals = dict()
    exec(<suite>, globals(), _locals)
    _composed = MagicDict(write_to=locals(), first_read_from=_locals)
    exec(<stmt>, globals(), _composed)
    del _locals, _composed

(To clarify: MagicDict is some way of composing two dicts so that write operations always go to a specific dict, but read operations first go to another dict. That is, names are first looked up in the temporary locals created from the suite, then in the 'real' locals, but all new name bindings go into the actual local scope.)

There may need to be some special handling for 'nonlocal' if required, but 'global' should Just Work. Also there is no need for any of the statement restrictions discussed in PEP 3150.
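[Editorial sketch: the MagicDict Martin describes can actually be built today, because exec() accepts any mapping object as its locals. The parameter names write_to and first_read_from come from his post; everything else, including using plain dicts to stand in for the real scopes, is an assumption for illustration:]

```python
from collections.abc import MutableMapping

class MagicDict(MutableMapping):
    """Reads try first_read_from, then write_to; writes always land
    in write_to (a sketch of the composition Martin describes)."""

    def __init__(self, write_to, first_read_from):
        self.write_to = write_to
        self.first_read_from = first_read_from

    def __getitem__(self, key):
        if key in self.first_read_from:
            return self.first_read_from[key]
        return self.write_to[key]

    def __setitem__(self, key, value):
        self.write_to[key] = value

    def __delitem__(self, key):
        del self.write_to[key]

    def __iter__(self):
        return iter(set(self.first_read_from) | set(self.write_to))

    def __len__(self):
        return len(set(self.first_read_from) | set(self.write_to))

# The two-exec dance from the post, with plain dicts standing in for scopes
real_locals = {}
temp_locals = {}
exec("def success(r): return r * 2", {}, temp_locals)   # the <suite>
composed = MagicDict(write_to=real_locals, first_read_from=temp_locals)
exec("x = success(21)", {}, composed)                   # the <stmt>
```

The <stmt> finds success through the temporary locals, but its binding of x goes into real_locals, so the helper names never leak into the surrounding scope.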
Some further thought is required on whether only def (and maybe class) statements should be allowed within the "in" block. Although I guess there's technically nothing wrong with:
in x = y + z:
    y = 12
    z = 30
Other than it's a very verbose way of doing something simple. ;-)
Violates TOOWTDI according to PEP 403.
Inherently since the attempt of these PEPs is to reorder some code, they violate TOOWTDI. However, the reordering provides substantial benefits to readability. In that sense, they are very much like decorators (and thus I can see why some attempt was made to align the syntaxes). In any case, TOOWTDI refers to only one *obvious* way to do it (at least obvious if you are Dutch ;-)).
David Mertz and Juancarlo Añez riff on the theme:
[Why not spell it something like]:
in x = do_something(in_arg, success_hdlr, error_hdlr, const) let:
    def success_hdlr(result):
        ...  # Do something with result
    def error_hdlr(error):
        ...  # Do something with error
    const = 42
(Note the "let" at the end of the "in" clause.)
Python doesn't use redundant keywords for a single construct. "let" is redundant with the following "def"s. On top of that, "let" being a new keyword will kill this syntax, I think.
Agreed. I also prefer a prefix (in my example "in") over the "given" or "let" postfix. It makes it obvious as I approach the construct that it is going to do something different, rather than requiring me to look at the end of the line to see the colon. cf. with, for, while, if - in fact all complex statements. Cheers, Martin
Just a reminder of why I started this discussion. I couldn't find a way to have:

    with context():
        do_this()
        do_that()

in which the embedded block would be executed multiple times, through yield. My fault! Lack of understanding of the context manager protocol. It is not possible to yield to the block multiple times, because an exception within the block will exit the context no matter what __exit__() does.

Adding new syntax is the kill-all way of things, but I think that it may be possible to achieve what I think is reasonable by adding to the context manager protocol. Perhaps I should just settle for something like:

    with context() as c:
        while True:
            with c.inner():
                do_this()
                do_that()

But that has the problem that the context manager protocol forces the try:/finally: to be spread between __enter__() and __exit__().

The best solution I've seen so far doesn't require any changes:

    with context() as c:
        @c
        def _():
            do_this()
            do_that()

Make the decorator remember the decorated function, and let __exit__() do the iteration. The only problem with that is that the stack trace for an exception would be weird.

That's why I thought of new syntax:

    within context() as c:
        do_this()
        do_that()

Python would def the anonymous block and pass it to a __do__(block) method in the context. That wouldn't allow passing parameters to the anonymous block, but the block can use the context (c) and the variables over which it has visibility.

All that said, I'm happy with how things work now, except for the need to come up with synthetic names for what should be anonymous functions.

Cheers,

-- Juancarlo *Añez*
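[Editorial sketch of the decorator-remembering trick Juancarlo describes. The Closure class and its times parameter are illustrative names, not code from Grako; the point is only that __call__ captures the block and __exit__ drives the iteration:]

```python
class Closure:
    """A context manager that collects a block via decorator abuse
    and runs it repeatedly when the with-statement exits."""

    def __init__(self, times=3):
        self.times = times
        self.block = None

    def __call__(self, func):
        # Used as @c inside the with block: remember the function
        self.block = func
        return func

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        # Iterate only on a clean exit; let exceptions propagate
        if exc_type is None and self.block is not None:
            for _ in range(self.times):
                self.block()
        return False

calls = []
with Closure(times=3) as c:
    @c
    def _():
        calls.append("run")
```

After the with block, the registered function has run three times. The stack-trace weirdness Juancarlo mentions is real: an exception raised inside the block surfaces from __exit__, not from the line where the with-statement appears.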
On Tue, May 14, 2013 at 7:35 AM, Juancarlo Añez <apalala@gmail.com> wrote:
Just a reminder of why I started this discussion.
I couldn't find a way to have:
with context():
    do_this()
    do_that()
in which the embedded block would be executed multiple times, through yield. My fault! Lack of understanding about the context manager protocol.
It is not possible to yield to the block multiple times because an exception within the block will exit the context no matter what __exit__() does.
Have you considered an iterator that produces context managers rather than the other way around? Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
Nick, On Tue, May 14, 2013 at 2:06 AM, Nick Coghlan <ncoghlan@gmail.com> wrote:
Have you considered an iterator that produces context managers rather than the other way around?
What an interesting idea!

    for c in self.closure():
        with c do:
            match_this()
            match_that()

I don't know yet if it's doable, but it certainly looks good. I'll try. Thanks! Cheers, -- Juancarlo *Añez*
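[Editorial sketch of Nick's iterator-of-context-managers idea in current, unmodified Python. All names here (closure, StopClosure) are assumptions for illustration: the generator hands out a fresh context manager per pass, and each one absorbs the exception that signals the closure is done:]

```python
import contextlib

class StopClosure(Exception):
    """Raised inside the managed block to end the iteration."""

def closure():
    """Yield one context manager per pass; stop when a block
    raises StopClosure, which the context manager absorbs."""
    done = False
    while not done:
        @contextlib.contextmanager
        def attempt():
            nonlocal done
            try:
                yield
            except StopClosure:
                done = True  # swallow the signal and end the loop
        yield attempt()

matched = []
for c in closure():
    with c:
        matched.append(len(matched) + 1)
        if len(matched) >= 3:
            raise StopClosure
```

This keeps the block in the caller's frame (so break, return, and ordinary tracebacks all behave normally), while the generator stays in control of the context around each pass, which is exactly the property the "within closure():" proposal was after.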
participants (16)
-
Andrew Barnert -
Bruce Leban -
David Mertz -
Devin Jeanpierre -
Greg Ewing -
Haoyi Li -
Joao S. O. Bueno -
Jonathan Eunice -
Juancarlo Añez -
Lakshmi Vyas -
Martin Morrison -
Ned Batchelder -
Nick Coghlan -
Stephen J. Turnbull -
Steven D'Aprano -
Terry Jan Reedy