automatic reclaiming of limited resources (was Re: [Python-Dev] Product iteration)
Sun, 30 Jul 2000 14:55:09 -0400
> Only if the "constructor().method()" idiom is not contained in
> a loop. Ancient Unix systems allowed only 20 files open at the
> same time. Although this has been increased to 120 or so in
> the meantime, you will easily run out of file descriptors with
> the simple throw-away script posted in my first rant in this thread.
> I believe scripts like this one are very common in the industry,
> since these idioms were advertised by several books (I still have
> to look up Martin v.Loewis "Das Pythonbuch", but IIRC it contained
> a subsection comparing Python's garbage collection with Java's GC and
> advertised the advantages of refcount GC).
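The idiom in question is easy to reproduce. Here's a minimal sketch (function and file names are illustrative, not from the thread): each `open(...)` creates a file object that is never explicitly closed, so the descriptor is released only when the object is reclaimed. Under CPython's refcounting that happens immediately; under a tracing collector the descriptors can pile up until collection runs.

```python
def first_lines(paths):
    """Return the first line of each file in `paths`.

    This relies on the file objects being reclaimed promptly:
    nothing ever calls close(), so on an implementation without
    refcounting, a long `paths` list can exhaust the process's
    file-descriptor limit before the collector gets around to them.
    """
    return [open(p).readline() for p in paths]
```

On CPython each temporary file object's refcount drops to zero as soon as `readline()` returns, closing the descriptor right away; the code only *looks* safe because of that implementation detail.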
You don't have to dig that deep: they're also common in the standard
distribution, including at least two of mine (checkappend.py and
tabnanny.py). There's a long copy/paste/modify tradition of not bothering
to close file objects in multi-file tools, dating back at least to Guido's
1991 eptags.py <wink>.
These aren't hard to "fix", but that isn't the point. A lot of code out
there would start failing in data- and platform-dependent ways if CPython
stopped cleaning up dead file objects "real soon" after they die, and
there's no easy way to identify in advance which code that may be. It would
piss off a lot of people!
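The "fix" is deterministic cleanup that doesn't depend on any particular reclamation strategy. A sketch of the same illustrative function made implementation-independent with `try`/`finally` (the `with` statement, which does the same thing more concisely, postdates this thread):

```python
def first_lines(paths):
    """Return the first line of each file in `paths`,
    closing each descriptor deterministically on every
    Python implementation, refcounted or not."""
    results = []
    for p in paths:
        f = open(p)
        try:
            results.append(f.readline())
        finally:
            f.close()  # released here, not "whenever the GC runs"
    return results
```

The point of the thread stands, though: the hard part isn't writing this, it's finding every place in existing code that silently relies on prompt reclamation.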
But I don't know why we're arguing about it. Nobody (AFAIK) has announced
plans to take refcounting out of CPython, but that you can't *rely* on
refcounting across Python implementations is ancient news (and a reality in
practice since JPython).
Guido certainly can't *require* that all implementations use refcounting!
The Reference Manual has been clear about this for years, and other
implementations already rely on what they were promised there: it's already
too late to stop this from being a basis on which implementations will differ.
as-if-microsoft-had-the-resources-to-take-on-beopen-anyway<wink>-ly y'rs -