Re: [Python-Dev] map, filter, reduce, lambda
On Thursday, Jan 2, 2003, at 21:13 US/Eastern, python-dev-request@python.org wrote:
map, reduce, filter, apply: They are just abstractions which take a function and let it work on arguments in certain ways. They are almost obsolete now since they can be replaced by powerful constructs like comprehensions and the very cute asterisk calls.
Short answer: -1 for removing map/reduce/filter/apply unless doing so will vastly improve the Python core (i.e. less size, greater speed, smaller footprint -- some combination in a usefully large fashion).

Personally, I find comprehensions and very cute asterisk calls to be incredibly unreadable. map(), reduce(), filter() and apply() are all easy to read and provide a keyword via which looking up their meaning in the documentation is a trivial task. Comprehensions were incomprehensible the first 10 times I ran into them.

A few years ago, and spanning a number of years before that, I did a huge amount of Python programming. I ended up taking about a 2.5 year break from serious Python programming -- not for any particularly good reason, in hindsight -- and returned to Python in the last year. Somewhere in that break, list comprehensions were added.

This is comprehensible, but looks alien in light of everything Python I had learned in the past (substitute any simple, but real-world, comprehension here):

    [x for x in range(0, 3)]

This hurt when I had to figure it out, and the essence of perl seemed rife within:

    [(x, y) for x in range(0, 3) for y in range(0, 3)]

And this just seemed to be a positively perlescent way of doing things:

    [(x, y) for x in range(0, 4) if x != 2 for y in range(0, 4) if y != 2]

Yuck, yuck, yuck! And it isn't just me! I have a number of peers with deep roots in CS, QA, and/or Project Management with whom I tend to discuss random computing issues... every single one agreed that list comprehensions are mighty powerful, but have gnarly syntax that is not in line with the general "zen" of Python.

The asterisk stuff seems to be a similar bit of syntactic magic that is equally baffling both to the novice and to experienced Python programmers who have missed a year or two of the language's evolution. As a friend would say -- and not in a good way -- "That's positively Perlescent!"
I know there isn't a snowball's chance in hell of these things changing anytime in the future... but before other similar features/extensions are added [dictionary comprehensions, for example], I beg the community to step back and ask how much power is really gained, balanced against the total confusion that may be caused.

Python is a brilliant teaching tool -- I have taught OO programming to a number of people through the use of Python without them feeling overtly challenged by mechanics and esoterica outside the focus of learning. Comprehensions and the asterisk notation have definitely taken away from the elegant simplicity of the language. Both are very powerful constructs that I have grown relatively comfortable with. But with power comes a price... the result of using such constructs is that my code has become significantly less approachable to relative newcomers to Python -- including my co-workers. It also means that /usr/lib/python*/*.py has become less of a source of learning for folks new to Python.

There is a lot of power in the core set of functional APIs found in the builtins. Probably the most powerful feature of things like map/filter/apply/reduce is that they are easily approached and consumed by the Python newcomer, thereby making the language more attractive by making it easier to do powerful things without learning esoteric/alien syntax.

(The other construct that bent my brain for a bit was generators -- only because of what they did, not because of the pythonic implementation. Very cool. I like generators.)

b.bum
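For readers weighing the two styles the thread is comparing, a minimal side-by-side sketch (hypothetical data; note that in modern Python 3, map() and filter() return iterators, so list() is wrapped around them, whereas in the Python of this era they returned lists directly):

```python
# The same two transformations written both ways (hypothetical word list).
words = ["alpha", "beta", "gamma"]

# Functional style, as defended in the thread (lambda supplies the predicate):
upper_map = list(map(str.upper, words))
long_only = list(filter(lambda w: len(w) > 4, words))

# Comprehension style:
upper_comp = [w.upper() for w in words]
long_comp = [w for w in words if len(w) > 4]

print(upper_map == upper_comp)  # True -- both are ['ALPHA', 'BETA', 'GAMMA']
print(long_only == long_comp)   # True -- both are ['alpha', 'gamma']
```

Both spellings compute the same lists; the disagreement in the thread is purely about which one reads better.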
Bill> This hurt when I had to figure it out and the essence of perl seemed
Bill> rife within:
Bill>
Bill> [(x, y) for x in range(0, 3) for y in range(0, 3)]

I find it helps if you indent them:

    [(x, y)
     for x in range(0, 3)
     for y in range(0, 3)]

or something similar.

Bill> [(x, y) for x in range(0, 4) if x != 2 for y in range(0, 4) if y != 2]
Bill>
Bill> Yuck, yuck, yuck! And it isn't just me!

Well, yeah. So don't write list comprehensions that are that complex, or at least not without a little indentation, and not until you see them in your dreams:

    [(x, y)
     for x in range(0, 4) if x != 2
     for y in range(0, 4) if y != 2]

It does seem a bit Perlish with all the trailing conditions when they get that complex. I'm pretty sure I've never written a listcomp with more than a single for and a single if. My list elements (the (x, y) part in your examples) tend to get a bit more complicated than your examples, though, involving function calls or arithmetic expressions.

Bill> Comprehensions and the asterisk notation have definitely taken
Bill> away from the elegant simplicity of the language. Both very
Bill> powerful constructs that I have grown relatively comfortable with.

By "asterisk notation" I assume you mean the alternative to apply(). Note that apply() has been deprecated, so I suspect you are in the minority (at least within the python-dev community). Most people seem to prefer *-notation to apply(). I know I do.

Bill> There is a lot of power in the core set of functional API found in
Bill> the builtins. Probably the most powerful feature of things like
Bill> map/filter/apply/reduce are that they are easily approached and
Bill> consumed by the Python newcomer, thereby making the language more
Bill> attractive by making it easier to do powerful things without
Bill> learning esoteric/alien syntax.
One thing I think people on this list have to keep in mind is that we are not average programmers as a whole (I'm fairly certain I drag down the class average a bit...), so what might seem second nature to us may well not make any sense to a programmer whose entire pre-Python experience was with Excel and VBA.

I tend to think the functional stuff is harder to approach as a newcomer. It certainly was harder for me than many other topics when I took a programming languages course in college that dabbled in Lisp (and SNOBOL, and APL, and Algol). In any case, none of *-notation, list comprehensions, or the functional builtins has to be taught to rank beginners. They are all like espresso, best consumed in fairly small quantities.

Bill> (The other construct that bent my brain for a bit were generators
Bill> -- only because of what they did, not because of the pythonic
Bill> implementation. Very cool. I like generators.)

Which I still have next to no experience with, other than adding the occasional yield to Tim's spambayes tokenizer. I've yet to write one because I needed or wanted it. I guess it's to each his own.

Skip
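For readers who haven't yet met the generators Bill and Skip mention, a minimal sketch (the function name is hypothetical; next() is the Python 3 spelling):

```python
# A minimal generator: the yield keyword makes pairs() produce values
# lazily, one at a time, instead of building the whole list up front.
def pairs(n):
    """Yield (x, y) grid coordinates on demand (illustrative example)."""
    for x in range(n):
        for y in range(n):
            yield (x, y)

g = pairs(3)
print(next(g))  # (0, 0)
print(next(g))  # (0, 1)

# Exhausting a fresh generator gives the same elements as the listcomp
# [(x, y) for x in range(3) for y in range(3)]:
print(len(list(pairs(3))))  # 9
```

The body reads like the equivalent nested for-loops, which is arguably why Bill found the implementation "pythonic" even though the lazy behavior took getting used to.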
On Friday, Jan 3, 2003, at 01:33 US/Eastern, Skip Montanaro wrote:
Well, yeah. So don't write list comprehensions that are that complex, or at least not without a little indentation, and not until you see them in your dreams:
This isn't completely about *my* coding style. One of the reasons why I was so attracted to Python back in, what, 1993 or 1994 or so, was that it didn't matter *who* wrote the code: the syntax of the language and the design of the libraries were such that just about anyone's code was comprehensible unless they actively worked to make it incomprehensible. Given that I had just spent 18 months doing very hard-core OO perl with a relational backing store [Sybase], Python was like a bolt from the blue. Obfuscation was not a feature, the library code was readable, and I could immediately see that I would have a chance of actually being able to read my code six months after I had written it (and, yes, I succumbed to posting a 'python is superior to perl; i have seen the light' type message somewhere not long thereafter).

List comprehensions and the */** notation are very powerful, but they also make the language less approachable to the newcomer to Python. At least, I think they do, and most of the people I have run into think they do as well. I'm not, for a moment, advocating that such features be deprecated or changed -- they are here and here to stay. At my experience level, I am comfortable with said constructs and use them heavily [though I don't use */** as often as I probably should]. But this isn't just about me. Python is being used as a teaching tool, and every experienced Python programmer was once a novice. The language itself is harder to learn now than it was 8 years ago. That is balanced by better documentation.

It would be helpful to have documentation with a syntactic table of contents. I.e. something like the tutorial, but with sections with two-line titles -- the first being a title as found in the Tutorial [List Comprehensions] and a subtitle that is an actual example of a List Comprehension (I don't know how you would do generators in such a fashion).
I would have found this to be an incredibly useful learning tool when returning to Python after a hiatus of a couple of years.

b.bum
Bill Bumgarner wrote:
On Thursday, Jan 2, 2003, at 21:13 US/Eastern, python-dev-request@python.org wrote:
map, reduce, filter, apply: They are just abstractions which take a function and let them work on arguments in certain ways. They are almost obsolete now since they can be replaced by powerful constructs like comprehensions and the very cute asterisk calls.
Short answer: -1 for removing map/reduce/filter/apply unless doing so will vastly improve the Python core (i.e. less size, greater speed, smaller footprint -- some combination in a usefully large fashion).
Personally, I find comprehensions and very cute asterisk calls to be incredibly unreadable.
I think it is a little too late to recognize this *now*. Also, I agree that comprehensions are not my favorite construct, since I personally like the functional approach. Where I absolutely cannot follow you is why you dislike the asterisk notation so much. I see it as one of the most elegant additions to Python of the last years, since it creates a symmetric treatment of argument definition and argument passing.

Anyway, this is not the place to discuss personal taste. My message only tried to spell out that map, filter and reduce can be easily emulated, while lambda cannot. It is a unique feature.

ciao - chris

--
Christian Tismer             :^)   mailto:tismer@tismer.com
Mission Impossible 5oftware  :     Have a break! Take a ride on Python's
Johannes-Niemeyer-Weg 9a     :    *Starship* http://starship.python.net/
14109 Berlin                 :     PGP key -> http://wwwkeys.pgp.net/
work +49 30 89 09 53 34  home +49 30 802 86 56  pager +49 173 24 18 776
PGP 0x57F3BF04  9064 F4E1 D754 C2FF 1619  305B C09C 5A3B 57F3 BF04
     whom do you want to sponsor today?   http://www.stackless.com/
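Christian's claim -- that map, filter and reduce can be easily emulated, while lambda cannot -- can be sketched as follows (illustrative names; note that in Python 3, reduce lives in functools rather than in the builtins):

```python
from functools import reduce  # a builtin in the Python of this thread

nums = [1, 2, 3, 4]

# map and filter are emulated directly by comprehensions:
assert [n * n for n in nums] == list(map(lambda n: n * n, nums))
assert [n for n in nums if n % 2 == 0] == list(filter(lambda n: n % 2 == 0, nums))

# reduce is emulated by an explicit accumulation loop:
total = 0
for n in nums:
    total = total + n
assert total == reduce(lambda a, b: a + b, nums)

# lambda has no comprehension equivalent: it builds a function *value*,
# something no list-building construct can replace.
double = lambda n: 2 * n
print(double(21))  # 42
```

The emulations show why map/filter/reduce were candidates for removal while lambda was not: the first three describe loops, but lambda describes a first-class function.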
On Friday, Jan 3, 2003, at 05:19 US/Eastern, Christian Tismer wrote:
Where I absolutely cannot follow you is why you dislike the asterisk notation so much? I see this as one of the most elegant addition to Python of the last years, since it creates a symmetric treatment of argument definition and argument passing.
Elegance does not always mean easy to understand. What is the old adage: any sufficiently advanced technology will seem like pure magic to the layman?

I just don't want to see the future directions of Python lose the incredible strength of being such a straightforward, yet still very powerful, language.

To the newcomer, map/reduce/filter/apply, list comprehensions, lambda, and */** are all relatively imponderable. An * or ** in the arglist of a function call/definition is extremely easy to gloss over because the newcomer has no idea what it means. Doing so leaves the newcomer without a clue as to what is going on in the call/definition. At least with map/reduce/filter/apply there is a WORD to latch onto, and spoken/read (not computer) languages are generally all about words, with punctuation taking a somewhat secondary role.

Most people do not learn languages by studying the tutorials/documentation at great length -- the first exposure is often under circumstances where too little time is allotted to learning the language.

map/filter/reduce may be less powerful and less elegant, but they are a heck of a lot easier to look up in the documentation. More and more commonly, that means going to google and typing in 'python filter documentation'. Try going to google and typing in 'python [] documentation' or 'python * documentation'....

b.bum
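For reference, the "symmetric treatment" Christian described is that the same * and ** spellings collect arguments in a definition and spread them in a call (the function and variable names below are hypothetical):

```python
# In a *definition*, * collects extra positional arguments into a tuple
# and ** collects extra keyword arguments into a dict:
def describe(*args, **kwargs):
    return (args, kwargs)

positional = (1, 2, 3)           # hypothetical argument data
keywords = {"color": "red"}

# In a *call*, the same symbols unpack containers back into arguments:
args, kwargs = describe(*positional, **keywords)
print(args)    # (1, 2, 3)
print(kwargs)  # {'color': 'red'}
```

Bill's objection stands independently of the mechanics: there is no word here to look up, only punctuation whose meaning flips depending on which side of the call it appears.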
Bill Bumgarner wrote:
On Friday, Jan 3, 2003, at 05:19 US/Eastern, Christian Tismer wrote:
Where I absolutely cannot follow you is why you dislike the asterisk notation so much? I see this as one of the most elegant addition to Python of the last years, since it creates a symmetric treatment of argument definition and argument passing.
Elegance does not always mean easy to understand. What is the old adage: any sufficiently advanced technology will seem like pure magic to the layman?
I just don't want to see the future directions of Python lose the incredible strength of being such a straightforward, yet still very powerful, language.
To the newcomer, map/reduce/filter/apply, list comprehensions, lambda, and */** are all relatively imponderable. An * or ** in the arglist of a function call/definition is extremely easy to gloss over because the newcomer has no idea what it means. Doing so leaves the newcomer without a clue as to what is going on in the call/definition. At least with map/reduce/filter/apply there is a WORD to latch onto, and spoken/read (not computer) languages are generally all about words, with punctuation taking a somewhat secondary role.
Most people do not learn languages by studying the tutorials/documentation at great length -- the first exposure is often under circumstances where too little time is allotted to learning the language.
map/filter/reduce may be less powerful and less elegant, but they are a heck of a lot easier to look up in the documentation. More and more commonly, that means going to google and typing in 'python filter documentation'.
Try going to google and typing in 'python [] documentation' or 'python * documentation'....
I agree with your viewpoint. List comprehensions break the uniformity of the simple original Python syntax. Bigger list comprehension constructs and lambda/map etc. constructs are equally bad to read, IMHO. I say this although I like to do 'one-liners' where people run away screaming.

While the '*/**' notation adds syntactic complexity, it doesn't feel too bad to me (although I am currently refactoring a module to not use '*/**' anymore because it gets hard to read).

holger
On Fri, 3 Jan 2003, holger krekel wrote:
While the '*/**' notation adds syntactic complexity it doesn't feel too bad to me. (although i am currently refactoring a module to not use '*/**' anymore because it gets hard to read).
I agree: */** syntax makes for hard(er)-to-read code. However, I make a point of only using it in code that _should_ be hard(er) to read, because it is _doing_ something hard.

Any feature can be naively overused to the point of becoming a mental blinder. This applies equally to lambda, list comprehensions, dictionaries, objects, print, indentation, chaining methods, etc... The key is to be a smart language-feature consumer and not overdo the sugary features. My rule is to ration out 'cool tricks' to at most one per 4 lines of code or 2 lines of comments.

-Kevin

--
Kevin Jacobs
The OPAL Group - Enterprise Systems Architect
Voice:  (216) 986-0710 x 19    E-mail:  jacobs@theopalgroup.com
Fax:    (216) 986-0714         WWW:     http://www.theopalgroup.com
"KJ" == Kevin Jacobs
writes:
KJ> On Fri, 3 Jan 2003, holger krekel wrote:
While the '*/**' notation adds syntactic complexity it doesn't feel too bad to me. (although i am currently refactoring a module to not use '*/**' anymore because it gets hard to read).
KJ> I agree: */** syntax makes for hard(er) to read code. However,
KJ> I make a point of only using it in code that _should_ be
KJ> hard(er) to read, because it is _doing_ something hard.

I can't recall a necessary use of apply that was easy to read. The problem that these two features solve is just messy. With apply, there's usually an explicit, hard-to-read tuple creation to pass to apply. The extended call syntax eliminates that particular mess, so I'm very happy with it.

Jeremy
While the '*/**' notation adds syntactic complexity it doesn't feel too bad to me. (although i am currently refactoring a module to not use '*/**' anymore because it gets hard to read).
I agree: */** syntax makes for hard(er) to read code. However, I make a point of only using it in code that _should_ be hard(er) to read, because it is _doing_ something hard.
Any feature can be naively overused to the point of becoming a mental blinder. This applies equally to lambda, list comprehensions, dictionaries, objects, print, indentation, chaining methods, etc... The key is to be a smart language-feature consumer and not overdo the sugary features. My rule is to ration out 'cool tricks' to at most one per 4 lines of code or 2 lines of comments.
I recently learned that some people use apply() unnecessarily whenever the function is something computed by an expression, e.g.

    apply(dict[key], (x, y))

rather than

    dict[key](x, y)

More recently I learned that some people use * and ** unnecessarily too:

    f(x, y, *[a, b], **{})

for

    f(x, y, a, b)

Finally, I note that * and ** are slightly more flexible than apply: you can write

    f(a, *args)

where with apply you'd have to write

    apply(f, (a,) + args)

And, of course, * and ** should be no strangers to anyone who has ever coded a function *declaration* using varargs or variable keyword arguments.

--Guido van Rossum (home page: http://www.python.org/~guido/)
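Guido's last comparison, restated as runnable modern Python. Since apply() only exists in Python 2, a small stand-in named apply_ is defined here purely for illustration; f and its arguments are likewise hypothetical:

```python
# Stand-in for Python 2's apply() builtin (defined only for this sketch):
def apply_(func, args):
    return func(*args)

def f(*items):
    return items

a = 1
args = (2, 3)

# apply style: a fresh tuple must be built by hand just to pass it along.
old = apply_(f, (a,) + args)

# Extended call syntax: the prefix argument and the unpacked tail mix freely.
new = f(a, *args)

print(old == new)  # True -- both are (1, 2, 3)
```

The extra flexibility Guido points out is visible in the call site: `f(a, *args)` needs no intermediate `(a,) + args` tuple concatenation.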
You may recall that a while ago I mentioned that GNU binutils 2.13 has bugs in how it handles dynamic linking that can prevent Python from building on Solaris machines, and that 2.13.1 does not correct those bugs completely. Well, I am happy to report that the recently released binutils 2.13.2 appears to have fixed those bugs, and I can now build Python again without having to patch binutils first.

--
Andrew Koenig, ark@research.att.com, http://www.research.att.com/info/ark
Andrew> You may recall that a while ago I mentioned that gnu binutils
Andrew> 2.13 has bugs in how it handles dynamic linking that can prevent
Andrew> Python from building on Solaris machines, and that 2.13.1 does
Andrew> not correct those bugs completely.

Can you narrow down the Solaris version for me? I'll update the warning in the README file of the distribution.

Skip
Andrew> You may recall that a while ago I mentioned that gnu binutils
Andrew> 2.13 has bugs in how it handles dynamic linking that can prevent
Andrew> Python from building on Solaris machines, and that 2.13.1 does
Andrew> not correct those bugs completely.

Skip> Can you narrow down the Solaris version for me? I'll update the
Skip> warning in the README file of the distribution.

I have encountered the problem on Solaris 2.7 and 2.8. I would expect it to be present in 2.9. I don't think it's present in 2.6, but I'm not sure and can no longer test it easily. If you like, I can send you patches to binutils 2.13 and 2.13.1, but I suspect that for almost everyone, upgrading to 2.13.2 will be the easiest course.
Skip> Can you narrow down the Solaris version for me? I'll update the
Skip> warning in the README file of the distribution.

Andrew> I have encountered the problem on Solaris 2.7 and 2.8. I would
Andrew> expect it to be present in 2.9. I don't think it's present in
Andrew> 2.6, but I'm not sure and can no longer test it easily.

Thanks, that will do.

Andrew> If you like, I can send you patches to binutils 2.13 and 2.13.1,
Andrew> but I suspect that for almost everyone, upgrading to 2.13.2 will
Andrew> be the easiest course.

Thanks for the offer, but I wouldn't know what to do with them (thankfully). :-)

Skip
Bill> This hurt when I had to figure it out and the essence of perl seemed
Bill> rife within:
Bill>
Bill> [(x,y) for x in range(0,3) for y in range(0,3)]

Reminds me more of Fortran, except that the loop indices nest in the opposite direction...

--
Andrew Koenig, ark@research.att.com, http://www.research.att.com/info/ark
participants (8)
- Andrew Koenig
- Bill Bumgarner
- Christian Tismer
- Guido van Rossum
- holger krekel
- Jeremy Hylton
- Kevin Jacobs
- Skip Montanaro