[Jeremy]
... That leads me to wonder what exactly the rationale for generator expressions is. The PEP says that "time, clarity, and memory are conserved by using a generator expression", but I can only see how memory is conserved. That is, I don't find them any easier to read than list comprehensions.
They're not, although they can be more clear than code that defines helper generating functions to get some of the same memory benefits.
And I don't understand the performance implications very well. It's not obvious to me that they're faster.
When you've got an iterator producing a billion elements, it becomes obvious at once <wink>. Really, *when* they're faster than listcomps, it's mostly a consequence of not creating in whole, then crawling over, a giant memory object. For short sequences, I expect listcomps are faster (and earlier timings have shown that). genexps require an additional frame suspend/resume per element, and while cheap (esp. compared to a function call) it's not free. For long sequences, avoiding the creation of a giant list becomes an overwhelming advantage.
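The memory half of the tradeoff is easy to demonstrate. A minimal sketch (names and sizes are illustrative, not from the thread): the list comprehension materializes every element up front, while the generator object stays small regardless of how many elements it will eventually yield.

```python
import sys

# A list comprehension builds the whole list in memory at once.
squares_list = [n * n for n in range(1000)]

# A generator expression produces elements one at a time on demand;
# the generator object itself stays tiny no matter the sequence length.
squares_gen = (n * n for n in range(1000))

# The list's footprint grows with the element count; the generator's doesn't.
print(sys.getsizeof(squares_list))
print(sys.getsizeof(squares_gen))

# Both feed an accumulator like sum() identically; only the genexp
# avoids holding all one thousand (or one billion) elements at once.
print(sum(squares_list) == sum(squares_gen))
```

As Tim notes, the speed side is less clear-cut: the genexp pays a frame suspend/resume per element, so for short sequences the listcomp typically wins, and only for long sequences does avoiding the giant intermediate list dominate.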
Does anyone know where to find these numbers? I've done some searching, but I can't find them. It would be good to include something concrete in the PEP.
This varies so wildly across platforms, timing procedure, and test case, that I expect concrete numbers would do more harm than good. The qualitative argument is easy to grasp. BTW, the most recent round of this was in the "genexps slow?" thread, started last month and spilling into April. Here's the start of it: http://mail.python.org/pipermail/python-dev/2004-March/043777.html