[Python-3000] Removing __del__
Jim Jewett
jimjjewett at gmail.com
Fri Sep 22 15:53:27 CEST 2006
On 9/22/06, Chermside, Michael <mchermside at ingdirect.com> wrote:
> the code in your __close__ method (or
> __del__) must assume that it might have been in a reference loop
> which was broken in some arbitrary place. As a result, it cannot
> assume that all references it holds are still valid.
Most close methods already assume this; how well they defend against it varies.
> A second problem I know of is, what if the code stores a reference
> to self someplace? The ability for __del__ methods to resurrect
> the object being finalized is one of the major sources of
> complexity in the GC module, and changing the semantics to
> __close__ doesn't fix this.
Even if this were forbidden, __close__ could still create a new object
that revived some otherwise-dead subobjects. Needing those exact
subobjects (as opposed to a newly created equivalent) is the only
justification I've seen for keeping the original __del__ semantics.
(And even then, I think we should have __close__ as well, for the
normal case.)
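Using today's __del__ as a stand-in for the proposed __close__, the escape
hatch looks something like this (a hypothetical sketch; the names are mine):

```python
revived = []

class Outer:
    """Finalizer that revives its *subobject* rather than itself."""
    def __init__(self, sub):
        self.sub = sub

    def __del__(self):
        # Even if resurrecting self were forbidden, the finalizer can
        # hand an otherwise-dead subobject to a new, live container.
        revived.append(self.sub)
```

So forbidding resurrection of the object being finalized would not, by
itself, stop subobjects from escaping.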
> We will modify the original example by adding a flush()
> method which flushes the resources and calling it in close():
The more careful close methods already either check a flag attribute
or use try-except.
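For instance, a defensive close in that style might look like the following
(a minimal sketch; the Resource class and attribute names are hypothetical):

```python
class Resource:
    """Hypothetical stand-in for an acquired resource."""
    def __init__(self):
        self.released = 0

    def release(self):
        self.released += 1

class Careful:
    """close() guarded by a flag and by try-except."""
    def __init__(self):
        self.resource = Resource()
        self._closed = False

    def close(self):
        if self._closed:        # flag attribute: release at most once
            return
        self._closed = True
        try:
            self.resource.release()
        except AttributeError:
            # The reference may already be gone if a cycle was broken
            # in an arbitrary place before close() ran.
            pass
```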
> class MyClass2(object):
>     def __init__(self, resource1_name, resource2_name):
>         self.resource1 = acquire_resource(resource1_name)
>         self.resource2 = acquire_resource(resource2_name)
>     def flush(self):
>         self.resource1.flush()
>         self.resource2.flush()
>         if hasattr(self, 'next'):
>             self.next.flush()
Do the two resources need to be as correct as possible, or as in-sync
as possible?
If they need to be as correct as possible, this would be
    def flush(self):
        try:
            self.resource1.flush()
        except Exception:
            pass
        try:
            self.resource2.flush()
        except Exception:
            pass
        try:
            # no need to check for self.next -- just eat the exception
            self.next.flush()
        except Exception:
            pass
Note that this is an additional motivation for exception expressions.
(Or, at least, some way to write "This may fail -- I don't care" in
less than four lines.)
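Pending such a language feature, the four lines can at least be hidden in a
tiny helper (hypothetical, for illustration):

```python
def attempt(func, *args):
    """Call func(*args), swallowing any exception -- a one-line way
    to say "this may fail; I don't care"."""
    try:
        return func(*args)
    except Exception:
        return None
```

With it, the flush body shrinks to three calls like
`attempt(self.resource1.flush)`. (Modern Python spells much the same idea
as `contextlib.suppress`.)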
>     def close(self):
>         self.resource1.release()
>         self.resource2.release()
>     def __close__(self):
>         self.flush()
>         self.close()
If the resources instead need to be as in-sync as possible, then keep
the original flush, but replace __close__ with
    def __close__(self):
        try:
            self.flush()
        except Exception:
            pass
        self.close()  # exceptions here will be swallowed anyhow
> The other problem I discussed is illustrated by the following
> malicious code:
> evil_list = []
> class MyEvilClass(object):
>     def __close__(self):
>         evil_list.append(self)
> Do the proponents of __close__ propose a way of prohibiting
> this behavior? Or do we continue to include complicated
> logic in the GC module to support it? I don't think anyone
> cares how this code behaves so long as it doesn't segfault.
I'll again point to the standard library module subprocess, where
MyEvilClass ~= subprocess.Popen
MyEvilClass.__close__ ~= subprocess.Popen.__del__
evil_list ~= subprocess._active
It does the append only conditionally -- if it is still waiting for
the subprocess *and* Python as a whole is not shutting down.
People do care how that code behaves. If the decision is not to
support it (or to require that it be written in a more complicated
way), that may be a reasonable tradeoff, but there would be a cost.
-jJ