[Python-Dev] PEP 343 rewrite complete
Guido van Rossum
gvanrossum at gmail.com
Sun Jun 5 18:33:28 CEST 2005
On 6/5/05, "Martin v. Löwis" <martin at v.loewis.de> wrote:
> Guido van Rossum wrote:
> >> @with_template
> >> def closing(obj):
> >>     try:
> >>         yield obj
> >>     finally:
> >>         obj.close()
> >>
> > I just realized this has a race condition. The bytecode for the
> > expression closing(open("...")) must necessarily contain a bytecode
> > that calls open() followed by another bytecode that calls closing().
>
> This is not a convincing point. That race condition always existed,
> e.g. in the traditional
>
> f = open(filename)
> try:
>     process(f)
> finally:
>     f.close()
>
> as you could always get an async exception between the moment open()
> returns and f is assigned. This isn't much of an issue, since CPython
> would always
> release the file immediately as the stack frame is cleared due to
> the exception.
>
> I think we should explicitly weaken our guarantees for
> "asynchronous" exceptions (of which we currently only have
> KeyboardInterrupt). The PEP should point out that an async
> exception between the beginning of the with statement and
> the assignment to the variable may or may not cause __exit__
> to be called (depending on how far you are into __enter__)
That is pretty clear from the translation given in the PEP.
What is not so clear is that the race can be completely avoided *if*
the call to __enter__ and the try-finally setup are combined in one
opcode, *and* __enter__ is implemented in C, *and* the reversible
actions are all done by __enter__.
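For reference, the expansion given in the PEP is roughly of this shape
(a simplified sketch, not the PEP's exact text; mgr, value and exc stand
in for internal temporaries, and EXPR, VAR and BLOCK for the pieces of
the with statement):

    mgr = EXPR                  # e.g. closing(open("..."))
    value = mgr.__enter__()     # an async exception landing here, before
                                # the try/finally below is set up, means
                                # __exit__ is never called
    exc = (None, None, None)
    try:
        try:
            VAR = value         # only if "as VAR" is present
            BLOCK
        except:
            exc = sys.exc_info()
            raise
    finally:
        mgr.__exit__(*exc)      # reached only once the try/finally
                                # frame actually exists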
This is also why I don't like giving file objects __enter__ and
__exit__ methods.
I know all this doesn't matter in most situations, but sometimes it
does, and it would be good if there was a solution -- today, there
really isn't one except to rely on CPython's reference counting.
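To make that concrete with the closing() template quoted above (a sketch
using the proposed with-statement syntax; the filename is hypothetical):

    # open("data.txt") is evaluated first, then closing(), then
    # __enter__; only after that is the try/finally that guarantees
    # close() in place. An async exception anywhere in that span leaves
    # the file to be reclaimed only by CPython's reference counting.
    with closing(open("data.txt")) as f:
        for line in f:
            pass                # ... work with the line ...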
--
--Guido van Rossum (home page: http://www.python.org/~guido/)