On 20.02.2014 23:07, Andrew Barnert wrote:
On Feb 20, 2014, at 13:46, Jan Kaliszewski wrote:
On 20.02.2014 22:33, Yann Kaiser wrote:
The first one could be accomplished like:
x = [line.split() for line in f] with open(thisfile) as f
It would keep f opened while the listcomp is being evaluated. This makes me think however of a likely accident:
x = (line.split() for line in f) with open('name') as f
next(x)  # ValueError: I/O operation on closed file.
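The same accident can be demonstrated with today's syntax. A minimal, self-contained sketch (using io.StringIO in place of open('name'), which is a stand-in of mine, not from the thread): the generator expression is created while the file is open, but first consumed only after the with block has closed it.

```python
import io

# io.StringIO stands in for open('name') so the sketch is runnable.
f = io.StringIO("a b\nc d\n")
with f:
    # The generator is built here, but reads nothing yet.
    x = (line.split() for line in f)

# Only now does the generator touch the file -- and it is closed.
try:
    next(x)
    error = None
except ValueError as exc:
    error = exc

print(error)  # I/O operation on closed file.
```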
This does mirror this mistake though:
with open('name') as f:
    return (line.split() for line in f)
When what was meant was:
with open('name') as f:
    for line in f:
        yield line.split()
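As a runnable sketch of that intended version (split_lines is a hypothetical name, and io.StringIO again stands in for a real file): because the function body suspends at each yield, the with block stays live for as long as the generator is being consumed, and the file is closed the moment the loop finishes.

```python
import io

def split_lines(f):
    # The with block does not exit until the generator body does --
    # i.e. when the loop over the file is exhausted (or the generator
    # is finalized early).
    with f:
        for line in f:
            yield line.split()

f = io.StringIO("a b\nc d\n")
rows = list(split_lines(f))
print(rows)      # [['a', 'b'], ['c', 'd']]
print(f.closed)  # True -- closed when the generator was exhausted
```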
(Which is, unfortunately, a __del__ in disguise, which is frowned upon.)
Why 'unfortunately'?
Relying on __del__ to clean up your files (and other expensive resources) is a bad idea if you only use CPython, and a terrible idea if your code might ever be run on other implementations: it means you don't get deterministic cleanup.
Yup, I see the point. At least, starting with CPython 3.4, the risk of creating an uncollectable reference cycle no longer exists (PEP 442). (That, of course, does not invalidate the concern about deterministic finalization of resources.)
And writing something using a with statement--which was explicitly designed to solve that problem--that still relies on __del__ would be dangerously misleading.
Unless the programmer knows what (s)he is doing. :)
However, as I pointed out in my other message, this is not really a __del__ in disguise, because if you use this in cases where the returned iterator is fully consumed in normal circumstances (e.g., a chain of iterator transformations that feeds into a final listcomp or writerows or executemany or the like), you _do_ get deterministic cleanup.
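A sketch of that point (read_rows is a hypothetical name of mine; io.StringIO stands in for a real file): when a chain of iterator transformations is drained by a final consumer, the with block inside the generator exits at the moment the last item is produced, so the file is closed deterministically, with no reliance on __del__.

```python
import io

def read_rows(f):
    # Generator that owns the file via with; the "__del__ in disguise"
    # concern only applies if this iterator is abandoned mid-way.
    with f:
        for line in f:
            yield line.split()

f = io.StringIO("1 2\n3 4\n")
# A chain of iterator transformations, fully consumed at the end:
rows = read_rows(f)
sums = (sum(map(int, row)) for row in rows)
totals = list(sums)   # the final consumer drains the whole chain

print(totals)    # [3, 7]
print(f.closed)  # True -- the with block exited deterministically
```

If the chain were abandoned before exhaustion, cleanup would instead wait for generator finalization (gen.close() or, failing that, __del__), which is exactly the non-deterministic case discussed above.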
Indeed. Regards. *j