On Mon, 18 Nov 2019 at 13:12, Soni L. email@example.com wrote:
On 2019-11-18 9:10 a.m., Oscar Benjamin wrote:
To me that seems clumsy and awkward compared to nested though:
with nested(*map(open, filenames)) as files: ...
Ideally I would design nested to take an iterable rather than *args and then it would be fine to do e.g.
with nested(open(filename) for filename in filenames) as files: ...
Here nested could take advantage of the delayed evaluation in the generator expression: it could invoke the __enter__ methods one at a time and call __exit__ on the already-opened files if any later open call fails. This would still leave a "trap", though, since using a list comprehension would suffer the same problem as nested taking *args:
with nested([open(filename) for filename in filenames]) as files: ...
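An iterable-accepting nested could be sketched today on top of contextlib.ExitStack (this is an illustrative sketch, not an actual stdlib nested; the name and signature are assumptions):

```python
from contextlib import ExitStack, contextmanager

@contextmanager
def nested(cms):
    """Hypothetical nested() taking an iterable of context managers.

    With a generator expression, each manager is only created as it is
    entered, so a failure partway through leaves only already-entered
    managers to unwind, and ExitStack does that unwinding.
    """
    with ExitStack() as stack:
        yield [stack.enter_context(cm) for cm in cms]
```

Used as `with nested(open(filename) for filename in filenames) as files: ...`, a failing open() would close only the files opened so far.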
If generator expressions (e.g. "(open(filename) for filename in filenames)") had __enter__ and __exit__ methods that deferred to the inner objects' __enter__ and __exit__, this "trap" wouldn't exist:
with (open(filename) for filename in filenames) as files: ... # fine
with [open(filename) for filename in filenames] as files: ... # raises because list doesn't __enter__
mainly because it wouldn't work with arbitrary iterators or iterables.
(and if you need it to, "with (x for x in iterable)" would still be available)
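The proposed semantics can be approximated today with an explicit wrapper class (the name `managed` and the whole wrapper are assumptions for illustration; it uses the documented ExitStack pop_all() pattern to unwind partial entries):

```python
from contextlib import ExitStack

class managed:
    """Hypothetical wrapper: give an iterable of context managers
    __enter__/__exit__ that defer to the inner __enter__/__exit__."""

    def __init__(self, cms):
        self._cms = cms

    def __enter__(self):
        stack = ExitStack()
        with stack:  # unwinds already-entered managers if one fails
            entered = [stack.enter_context(cm) for cm in self._cms]
            # success: transfer the exit callbacks to self so the
            # enclosing "with stack" exits without closing anything
            self._stack = stack.pop_all()
        return entered

    def __exit__(self, *exc):
        return self._stack.__exit__(*exc)
```

With this, `with managed(open(f) for f in filenames) as files: ...` behaves the way the proposal would have the bare generator expression behave.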
Since generators already have a close method, the obvious thing for generator.__exit__ to do (if it existed) would be to call it. That would make it possible to use patterns like:
def cat(filenames):
    for filename in filenames:
        with open(filename) as infile:
            yield from infile
with cat(filenames) as lines:
    for line in lines:
        if key in line:
            return line
Note that the return there stops iterating over the generator while it is suspended, so the with block inside cat never finishes normally. Making the generator a context manager whose __exit__ calls close would ensure that the context manager inside the generator is finalised promptly.
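The same finalisation is available today by wrapping the generator in contextlib.closing, whose __exit__ calls close(); this sketch combines the two snippets above (first_match is a hypothetical name chosen for illustration):

```python
from contextlib import closing

def cat(filenames):
    for filename in filenames:
        with open(filename) as infile:
            yield from infile

def first_match(filenames, key):
    # closing() gives the generator the proposed behaviour today:
    # leaving the block calls lines.close(), which raises GeneratorExit
    # inside cat() and so runs the with block's cleanup, closing the
    # currently open file even though iteration stopped early.
    with closing(cat(filenames)) as lines:
        for line in lines:
            if key in line:
                return line
```

Without the closing() wrapper, the suspended generator would only be finalised whenever the garbage collector got to it.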