Make tuple a context manager
Could we add __enter__ and __exit__ to the tuple type?

Look at the following code:

    a = open('a.tmp', 'w')
    b = open('b.tmp', 'w')

    with (a, b) as (af, bf):
        af.write("1")
        bf.write("2")

Even better example:

    with tuple(open(str(_n) + '.tmp', 'w') for _n in range(1000)) as f:
        for n, fn in enumerate(f):
            f.write(str(n))

Tuple as context manager would invoke __enter__ for each of its elements and return a tuple of the results. On exit, the __exit__ method would be invoked for every element. We could even generalize it to every kind of iterable.

This is somewhat consistent with treatment of exception types in 'except' clause:

    try:
        something()
    except Exception1 as error:
        handlerA(error)
    except (Exception2, Exception3) as error:
        handlerB(error)

Tuple of exception types is accepted in 'except' clause, as well as a single exception type. We could apply that rule to the 'with' clause.
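For concreteness, here is a minimal sketch of how the proposed semantics could be emulated today with a small wrapper class. The name TupleContext is hypothetical and not part of the proposal, and, like the proposal as written, the sketch makes no attempt to handle a failure in one of the __enter__ or __exit__ calls (a point that comes up later in the thread):

    class TupleContext(tuple):
        """A tuple whose elements are entered and exited together."""

        def __enter__(self):
            # Enter each element and return a tuple of the results.
            return tuple(item.__enter__() for item in self)

        def __exit__(self, exc_type, exc_value, traceback):
            # Exit every element, naively in entry order.
            for item in self:
                item.__exit__(exc_type, exc_value, traceback)
            return False

    with TupleContext((open('a.tmp', 'w'), open('b.tmp', 'w'))) as (af, bf):
        af.write("1")
        bf.write("2")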
You should expand a bit. How is that better than

    with open(..) as a, open(..) as b:

?
On 12 Jul 2019, at 15:27, haael <haael@interia.pl> wrote:
Could we add __enter__ and __exit__ to the tuple type?
Look at the following code:
    a = open('a.tmp', 'w')
    b = open('b.tmp', 'w')

    with (a, b) as (af, bf):
        af.write("1")
        bf.write("2")
Even better example:
    with tuple(open(str(_n) + '.tmp', 'w') for _n in range(1000)) as f:
        for n, fn in enumerate(f):
            f.write(str(n))
Tuple as context manager would invoke __enter__ for each of its elements and return a tuple of the results.
On exit, the __exit__ method would be invoked for every element.
We could even generalize it to every kind of iterable.
This is somewhat consistent with treatment of exception types in 'except' clause.
    try:
        something()
    except Exception1 as error:
        handlerA(error)
    except (Exception2, Exception3) as error:
        handlerB(error)
Tuple of exception types is accepted in 'except' clause, as well as a single exception type. We could apply that rule to the 'with' clause.
Modifying the fundamental tuples for doing that is certainly overkill - but maybe a context-helper function in contextlib could properly handle all the corner cases, like the code I've just pasted at:
https://gist.github.com/jsbueno/53c059380be042e2878c08b5c10f36bf

(the link above actually has working code implementing the OP's suggestion as a generator function; a rough sketch in that spirit appears after the quoted messages below)

Note it is easier to use than contextlib.ExitStack:
https://docs.python.org/3/library/contextlib.html#contextlib.ExitStack

Note that for literal, hard-coded tuples of known size this is not needed at all - the `with resource as bla, resource2 as ble:` syntax just works - and, for an arbitrary number of resources, a tuple would hardly be the most appropriate type to use anyway.

js
-><-

On Fri, 12 Jul 2019 at 10:57, Anders Hovmöller <boxed@killingar.net> wrote:
You should expand a bit. How is that better than
with open(..) as a, open(..) as b:
?
On 12 Jul 2019, at 15:27, haael <haael@interia.pl> wrote:
Could we add __enter__ and __exit__ to the tuple type?
Look at the following code:
    a = open('a.tmp', 'w')
    b = open('b.tmp', 'w')

    with (a, b) as (af, bf):
        af.write("1")
        bf.write("2")
Even better example:
    with tuple(open(str(_n) + '.tmp', 'w') for _n in range(1000)) as f:
        for n, fn in enumerate(f):
            f.write(str(n))
Tuple as context manager would invoke __enter__ for each of its elements and return a tuple of the results.
On exit, the __exit__ method would be invoked for every element.
We could even generalize it to every kind of iterable.
This is somewhat consistent with treatment of exception types in 'except' clause.
    try:
        something()
    except Exception1 as error:
        handlerA(error)
    except (Exception2, Exception3) as error:
        handlerB(error)
Tuple of exception types is accepted in 'except' clause, as well as a single exception type. We could apply that rule to the 'with' clause.
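Here is the rough sketch referred to above of what a generator-function helper in the spirit of the linked gist might look like. It is only an illustration, written under the assumption that the gist works roughly this way (the actual code in the gist may differ); its naive error handling is exactly what the next reply criticizes:

    from contextlib import contextmanager

    @contextmanager
    def itercontext(*managers):
        # Enter every manager up front and hand the results to the caller.
        entered = [m.__enter__() for m in managers]
        try:
            yield entered
        finally:
            # Naive cleanup: exits run in entry order, a failing __exit__
            # skips the rest, and if one of the __enter__ calls above raised,
            # the managers entered before it are never exited at all.
            for m in managers:
                m.__exit__(None, None, None)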
On Friday, July 12, 2019, 07:48:52 AM PDT, Joao S. O. Bueno <jsbueno@python.org.br> wrote:
Modifying the fundamental tuples for doing that is certainly overkill - but maybe a context-helper function in contextlib could properly handle all the corner cases, like the code I've just pasted at:
https://gist.github.com/jsbueno/53c059380be042e2878c08b5c10f36bf ("Python recipe to enter an arbitrary number of contexts at once")
(the link above actually has working code implementing the OP's suggestion as a generator function)

But that code doesn't clean up properly if any of the enters fails; it just leaks the already-entered contexts. And if any of the exits fails, the remainder don't get cleaned up (and the original exception gets lost, too). And it exits them in entry order instead of reverse entry order.

You could fix all of that by writing it around ExitStack, but in that case it's just a one-liner, and in fact the same one-liner that's the first thing in the ExitStack docs:

    with ExitStack() as stack:
        files = [stack.enter_context(open(fname)) for fname in filenames]

Or, for your exact example:

    with ExitStack() as stack:
        files = [stack.enter_context(open(f"file_{i}.bin", "wb")) for i in range(5)]
        for i, file_ in enumerate(files):
            file_.write(bytes(i.to_bytes(1, "little")))

Or, you don't even need to make a list here:

    with ExitStack() as stack:
        files = (stack.enter_context(open(f"file_{i}.bin", "wb")) for i in range(5))
        for i, file_ in enumerate(files):
            file_.write(bytes(i.to_bytes(1, "little")))

And that has the advantage that it can be easily rewritten to be more unwieldy but easier for novices to follow, the same way as any other comprehension:

    with ExitStack() as stack:
        for i in range(5):
            file_ = stack.enter_context(open(f"file_{i}.bin", "wb"))
            file_.write(bytes(i.to_bytes(1, "little")))
Anyway, as the docs for ExitStack say:
    This is a relatively low level API that takes care of the details of correctly unwinding the stack of exit callbacks. It provides a suitable foundation for higher level context managers that manipulate the exit stack in application specific ways.

And your desired API can be written by subclassing or delegating to an ExitStack without needing any advanced code, or any thought about failure handling:

    class itercontext(ExitStack):
        def __init__(self, *cms):
            self._contexts = []
            super().__init__()
            self._contexts.extend(self.enter_context(cm) for cm in cms)

        def __iter__(self):
            yield from self._contexts

… or, if you prefer to think about failure handling in the @contextmanager style:

    @contextmanager
    def itercontext(*cms):
        with ExitStack() as stack:
            contexts = [stack.enter_context(cm) for cm in cms]
            # It may take a comment to convince readers that there's nothing for try/finally to do here?
            yield contexts
There are a few convenience helpers that could be added to ExitStack to make these even easier to write, or even to make it usable out of the box for a wider range of scenarios. An enter_contexts(cms) function could make it clear exactly what failure does to the rest of the cms iterable. Or enter_contexts(*cms), which forces an iterator to be consumed before entering anything (as your example does). Or even __init__(*cms). It could expose its managers and/or contexts as an attribute. It could even be an iterable or sequence of its contexts. Then, you could just use ExitStack directly as your desired itercontext.

But I don't know that those are worth adding to ExitStack, or even to a higher-level wrapper in the stdlib. I think the real problem isn't that it's too hard for novices to do this themselves as needed; it's that it's too hard for them to discover ExitStack, to grok what it does, and to realize how easy it is to expand on. Once they get that, they can write itercontext, and anything else they need, themselves. But until they do, they'll try to write what you wrote, and either get it wrong in far worse ways or just give up.

I'm not sure how to fix that. (More links to it in the docs, more examples in its own docs, rewrite the "low-level" sentence so it sounds more like an invitation than a warning, a HOWTO, vigorous proselytizing…?) But I don't think adding one or two wrappers (or, worse, less-powerful partial workalikes) to the same module would help.
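As a rough illustration of the kind of convenience helper being mused about here (the class name IterExitStack and the enter_contexts method are hypothetical, not an existing contextlib API):

    from contextlib import ExitStack

    class IterExitStack(ExitStack):
        """An ExitStack built from, and iterable as, its entered contexts."""

        def __init__(self, *cms):
            super().__init__()
            self.contexts = []
            try:
                self.enter_contexts(cms)
            except BaseException:
                # If any __enter__ fails, unwind whatever was already entered.
                self.close()
                raise

        def enter_contexts(self, cms):
            # Enter each manager in order; if one raises, the managers entered
            # before it stay registered and are unwound by the stack as usual.
            for cm in cms:
                self.contexts.append(self.enter_context(cm))
            return self.contexts

        def __iter__(self):
            return iter(self.contexts)

Used as `with IterExitStack(*cms) as stack:`, iterating the stack yields the entered contexts, and all of the failure handling is inherited from ExitStack.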
On Jul 12, 2019, at 06:27, haael <haael@interia.pl> wrote:
Tuple as context manager would invoke __enter__ for each of its elements and return a tuple of the results.
On exit, the __exit__ method would be invoked for every element.
We could even generalize it to every kind of iterable.
So instead of exiting them in reverse order, it would exit them in order of entry?

And how would you generalize it to every kind of iterable? Even iterators? Then the enter will exhaust the iterator, so the exit won't see anything.

Also, what does it do if one of the elements raises on entry, or on exit?

ExitStack answers all of these problems. Maybe the solution is just to make it slightly easier to use ExitStack with an iterable of context managers, and a lot easier for novices to discover it?
On 12.07.19 16:27, haael wrote:
Could we add __enter__ and __exit__ to the tuple type?
Look at the following code:
    a = open('a.tmp', 'w')
    b = open('b.tmp', 'w')

    with (a, b) as (af, bf):
        af.write("1")
        bf.write("2")
Even better example:
    with tuple(open(str(_n) + '.tmp', 'w') for _n in range(1000)) as f:
        for n, fn in enumerate(f):
            f.write(str(n))
Tuple as context manager would invoke __enter__ for each of its elements and return a tuple of the results.
On exit, the __exit__ method would be invoked for every element.
Are you aware of contextlib.nested()? And why it was deprecated and removed?
Serhiy Storchaka wrote:
Are you aware of contextlib.nested()? And why it was deprecated and removed?
Was contextlib.nested() removed primarily due to the inconsistencies mentioned in https://bugs.python.org/issue5251, or was it something else?
On 13.07.19 23:25, Kyle Stanley wrote:
Serhiy Storchaka wrote:
Are you aware of contextlib.nested()? And why it was deprecated and removed?
Was contextlib.nested() removed primarily due to the inconsistencies mentioned in https://bugs.python.org/issue5251, or was it something else?
Yes, it was removed because of the inherent flaw of such a design. In `with nested(open('a'), open('b'))`, if the second open fails, the first file will be leaked. No nested() object would even be created, so this can't be fixed inside it. There is exactly the same problem with using a tuple as a context manager.
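A minimal illustration of that evaluation-order problem; the helper name take_all is hypothetical and stands in for nested(), a tuple display, or any other wrapper, and the second path is assumed not to exist:

    def take_all(*managers):
        # Stand-in for nested()/tuple(...): it never even runs if building
        # one of its arguments raises.
        return managers

    try:
        # Arguments are evaluated left to right, *before* take_all is called:
        # the first open() succeeds, the second raises FileNotFoundError, and
        # no wrapper object exists that could ever close the first file.
        take_all(open('a.tmp', 'w'), open('/no/such/dir/b.tmp', 'w'))
    except FileNotFoundError:
        pass  # the file object created by the first open() has leaked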
On Jul 12, 2019, at 06:27, haael <haael@interia.pl> wrote:
    with tuple(open(str(_n) + '.tmp', 'w') for _n in range(1000)) as f:
        for n, fn in enumerate(f):
            f.write(str(n))
Another thought here: There may be good examples for what you want—although I suspect every such example will be much better handled by ExitStack—but the one you gave is actually an argument against the idea, not for it. We don't want to make it easier to write code like this, because people shouldn't be writing code like this.

Having 1000 files open at a time may exhaust your resource limits. Writing to another 999 files before closing the first will confuse the OS cache and slow everything down. If someone trips over the power cable in the middle, you have up to 1000 written but unflushed files that are in an indeterminate state. It forces you to come up with two names for each variable, leading to mistakes like the one at the end, where you try to write to the tuple.

When all of this is necessary, you have to deal with those problems, but it usually isn't, and it isn't here. You're just writing to one file, then writing to the next, etc.; there's no reason not to close each one immediately. Which you can just write like this:

    for n in range(1000):
        with open(str(n) + '.tmp', 'w') as f:
            f.write(str(n))
participants (6)
- Anders Hovmöller
- Andrew Barnert
- haael
- Joao S. O. Bueno
- Kyle Stanley
- Serhiy Storchaka