I've been poking into reference counting, circular references, etc., trying to understand how the garbage collector works.
In practice, I'm trying to figure out a way to keep a couple of modules I work with from leaking memory. One is an extension to wxPython that I wrote, which creates a circular reference involving wx objects -- and these, as is often the case with wrappers, have a custom __del__ method.
The other is the Python netCDF4 lib, which has a circular reference built in, so that Datasets (which map to a file on disk) know about their Variables (which map to data arrays within the file) and vice versa. We tried using weak references here, but it turns out that some users like to work with Variables without keeping a reference to the Dataset, so that didn't work.
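To make the weakref failure concrete, here's a rough sketch of the shape of the problem -- these Dataset/Variable classes are stand-ins I made up for illustration, not the actual netCDF4 implementation:

```python
import gc
import weakref

class Variable:
    def __init__(self, dataset):
        # A weak reference back to the parent breaks the cycle...
        self._dataset = weakref.ref(dataset)

    @property
    def dataset(self):
        ds = self._dataset()
        if ds is None:
            # ...but the parent can vanish while the Variable lives on.
            raise RuntimeError("parent Dataset has been garbage collected")
        return ds

class Dataset:
    def __init__(self):
        self.variables = {}

    def create_variable(self, name):
        var = Variable(self)
        self.variables[name] = var
        return var

# The usage pattern that broke the weakref approach: hold only a Variable.
var = Dataset().create_variable("temp")
gc.collect()
# var.dataset now raises RuntimeError, because nothing kept the Dataset alive.
```

Nothing keeps the Dataset alive once the user drops it, so any Variable-only workflow blows up -- which is exactly why we went back to the strong circular reference.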
Anyway, in my poking around, I think I understand that the gc won't delete "unreachable" objects if they have a custom __del__ method, because __del__ methods often release resources that may depend on other objects, bad (really bad) things can happen if objects are deleted out of order, and the gc has no idea what order is safe.
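Here's a minimal reproduction of the kind of cycle I mean (how it plays out depends on the interpreter version -- see the comment at the end):

```python
import gc

finalized = []

class Cyclic:
    """An object that participates in a reference cycle and has __del__."""
    def __del__(self):
        finalized.append("finalized")

# Build an unreachable cycle: the object refers to itself.
obj = Cyclic()
obj.me = obj
del obj

gc.collect()

# Older CPythons refuse to guess a safe teardown order and park the whole
# cycle in gc.garbage, uncollected (a leak unless you clean it up by hand);
# interpreters that can finalize cycles just call __del__ and free it.
# Either way, the collector has no way to know this __del__ is
# order-independent.
print(gc.garbage, finalized)
```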
However -- some __del__ methods are perfectly safe regardless of deletion order.
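For example (RawFile here is just something I cooked up to illustrate the point), a __del__ that only closes an OS-level file descriptor it owns never touches another Python object, so deletion order can't matter:

```python
import os
import tempfile

class RawFile:
    """A wrapper whose __del__ is order-safe: it only releases a resource
    this object owns and never reaches out to other Python objects."""

    def __init__(self, path):
        self.fd = os.open(path, os.O_RDONLY)

    def __del__(self):
        # An integer fd doesn't care whether neighboring objects in a
        # cycle have already been torn down.
        try:
            os.close(self.fd)
        except OSError:
            pass

# Quick demonstration:
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    path = tmp.name
f = RawFile(path)
fd = f.fd
del f  # CPython's refcounting runs __del__ immediately here
os.unlink(path)
```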
So it would be nice to be able to somehow let the gc know that a particular object is safe to delete at any time.
Is there a technical reason this can't be done? It seems something as simple as a __delete_in_any_order__ attribute would do it. Yes, this is dangerous, but authors of such extension objects need to know what they are doing anyway.
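To be clear, __delete_in_any_order__ doesn't exist -- this is purely a sketch of what opting in might look like:

```python
class SafeWrapper:
    # Hypothetical flag -- NOT a real CPython feature. The idea is that it
    # would tell the collector this finalizer is order-independent, so
    # cycles containing SafeWrapper instances could be collected normally.
    __delete_in_any_order__ = True

    def __init__(self, handle):
        self.handle = handle

    def __del__(self):
        # Touches only state owned by this object.
        self.handle = None
```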
Is there a technical complication I'm not thinking of?
It seems that if we could pull this off, we could eliminate one long-standing ugly wart in Python memory management.