Python's "only one way to do it" philosophy isn't good?

Chris Mellon arkanes at gmail.com
Wed Jun 27 11:35:54 EDT 2007


On 6/27/07, Douglas Alan <doug at alum.mit.edu> wrote:
> Paul Rubin <http://phr.cx@NOSPAM.invalid> writes:
>

> Gee, that's back to the future with 1975 Lisp technology.  Destructors
> are a much better model for dealing with such things (see, not *all*
> good ideas come from Lisp -- a few come from C++) and I am dismayed
> that Python is deprecating their use in favor of explicit resource
> management.  Explicit resource management means needlessly verbose
> code and more opportunity for resource leaks.
>
> The C++ folks feel so strongly about this, that they refuse to provide
> "finally", and insist instead that you use destructors and RAII to do
> resource deallocation.  Personally, I think that's taking things a bit
> too far, but I'd rather it be that way than lose the usefulness of
> destructors and have to use "when" or "finally" to explicitly
> deallocate resources.
>

This totally misrepresents the case. The with statement and context
managers are a superset of the RAII functionality. They don't
overload object lifetimes; rather, they make the intent (code
execution upon entrance and exit of a block) explicit. You use them in
almost exactly the same way you use RAII in C++ (creating new blocks
as you need new scopes), and they perform exactly the same function.
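
For instance, a minimal sketch (the file name and the counter are made
up for illustration) of the with statement doing the job a
stack-allocated RAII object does in C++:

    from __future__ import with_statement  # only needed on Python 2.5
    import threading

    lock = threading.Lock()
    shared_total = 0

    # The file is guaranteed to be closed when the block exits, whether
    # normally or via an exception -- the same guarantee a C++ destructor
    # gives a stack-allocated fstream.
    with open("data.txt", "w") as f:
        f.write("hello\n")

    # Open a new block wherever you need a new "scope", much as you
    # would open a brace-delimited block in C++.
    with lock:
        shared_total += 1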

Nobody in their right mind has ever tried to get rid of explicit
resource management - explicit resource management is exactly what you
do every time you create an object, use RAII, or open a file.
*Manual* memory management, where the tracking of references and
scopes is placed upon the programmer, is what people are trying to get
rid of, and the with statement contributes to that goal; it doesn't
detract from it. Before the with statement, you could do the same
thing, but you needed nested try/finally blocks and you had to
carefully keep track of the scopes, the order of object creation,
which objects were created, all that. The with statement removes the
manual, error-prone work from that and lets you more easily write your
intent - which is *precisely* explicit resource management.
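
To make that concrete, here's a rough before-and-after sketch (the
lock and the file name are made up):

    import threading

    lock = threading.Lock()

    # Before the with statement: acquire things by hand and unwind them
    # in the right order with nested try/finally blocks.
    lock.acquire()
    try:
        f = open("data.txt", "w")
        try:
            f.write("hello\n")
        finally:
            f.close()
    finally:
        lock.release()

    # With the with statement: the same guarantees, stated as intent.
    with lock:
        with open("data.txt", "w") as f:
            f.write("hello\n")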

RAII is a good technique, but don't get caught up on the
implementation details. The fact that it's implemented via stack
objects with ctors and dtors is a red herring. The significant feature
is that you've got explicit, predictable resource management with
(and this is the important bit) a guarantee that code will be called
in all cases of scope exit.
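
In Python, the analogue of the ctor/dtor pair is the context manager
protocol (__enter__/__exit__); a minimal sketch, with an imaginary
resource standing in for something real:

    class ManagedResource(object):
        """Imaginary resource: __enter__/__exit__ play the roles that a
        constructor/destructor pair plays in C++ RAII."""

        def __enter__(self):
            print("acquired")   # acquisition happens on block entry
            return self

        def __exit__(self, exc_type, exc_value, traceback):
            # Runs on *every* exit from the block: normal fall-through,
            # return, break, or an exception propagating out.
            print("released")
            return False        # don't swallow exceptions

    with ManagedResource() as res:
        pass                    # use the resource here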

The with statement does exactly the same thing, but is actually
superior because

a) It doesn't tie the resource management to object creation. This
means you can use, for example, "with lock:" instead of the C++-style
Locker(lock)

and

b) You can tell whether you exited with an exception, and what that
exception is, so you can take different actions based on error
conditions vs. expected exit. This is a significant benefit; it allows
the application of context managers to cases where RAII is weak - for
example, controlling transactions (see the sketch below).
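
As a sketch of point b) (the connection class here is a stand-in; a
real one would come from your database module), a transaction manager
can commit or roll back depending on how the block exited:

    class transaction(object):
        """Commit on normal exit, roll back if the block raised.
        conn is anything with commit()/rollback() methods."""

        def __init__(self, conn):
            self.conn = conn

        def __enter__(self):
            return self.conn

        def __exit__(self, exc_type, exc_value, traceback):
            if exc_type is None:
                self.conn.commit()      # block finished normally
            else:
                self.conn.rollback()    # an exception escaped the block
            return False                # let the exception propagate

    class FakeConnection(object):       # stand-in for a real DB connection
        def commit(self):    print("commit")
        def rollback(self):  print("rollback")

    with transaction(FakeConnection()):
        pass                            # work that might raise goes here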

> > Python object lifetimes are in fact NOT predictable because the ref
> > counting doesn't (and can't) pick up cyclic structure.
>
> Right, but that doesn't mean that 99.9% of the time, the programmer
> can't immediately tell that cycles aren't going to be an issue.
>
> I love having a *real* garbage collector, but I've also dealt with C++
> programs that are 100,000+ lines long and I wrote plenty of Python
> code before it had a real garbage collector, and I never had any
> problem with cyclic data structures causing leaks.  Cycles are really
> not all that common, and when they do occur, it's usually not very
> difficult to figure out where to add a few lines to a destructor to
> break the cycle.
>

They can occur in the most bizarre and unexpected places, to the point
where I suspect that the reality is simply that you never noticed your
cycles, not that they didn't exist.
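
A cycle is easy to create without noticing; a minimal sketch:

    import gc

    class Node(object):
        def __init__(self):
            self.other = None

    a, b = Node(), Node()
    a.other = b
    b.other = a          # a and b now refer to each other

    del a, b             # the refcounts never reach zero...
    print(gc.collect())  # ...but the cycle collector reclaims them
                         # (prints the number of unreachable objects found)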

> > And the refcounts are a performance pig in multithreaded code,
> > because of how often they have to be incremented and updated.
>
> I'm willing to pay the performance penalty to have the advantage of
> not having to use constructs like "when".
>

"with". And if you think you won't need it because python will get
"real" GC you're very confused about what GC does and how.

> Also, I'm not convinced that it has to be a huge performance hit.
> Some Lisp implementations had a 1,2,3, many (or something like that)
> reference-counter for reclaiming short-lived objects.  This bypassed
> the real GC and was considered a performance optimization.  (It was
> probably on a Lisp Machine, though, where they had special hardware to
> help.)
>
> > That's why CPython has the notorious GIL (a giant lock around the
> > whole interpreter that stops more than one interpreter thread from
> > being active at a time), because putting locks on the refcounts
> > (someone tried in the late 90's) to allow multi-cpu parallelism
> > slows the interpreter to a crawl.
>
> All due to the ref-counter?  I find this really hard to believe.
> People write multi-threaded code all the time in C++ and also use
> smart pointers at the same time.  I'm sure they have to be a bit
> careful, but they certainly don't require a GIL.
>

A generic threadsafe smart pointer, in fact, is very nearly a GIL. The
GIL isn't just for refcounting, though; it's also about access to the
Python interpreter's internal state.

For the record, the vast majority of multithreaded C++ code is
incorrect or inefficient or both.

> I *would* believe that getting rid of the GIL will require some
> massive hacking on the Python interpreter, though, and when doing that
> it may be significantly easier to switch to having only a real GC than
> having two different kinds of automatic memory management.
>
> I vote, though, for putting in that extra work -- compatibility with
> Jython be damned.
>

Get cracking then. You're hardly the first person to say this.
However, of the people who say it, hardly anyone actually produces any
code, and the one person I know of who did produce code dropped it
when performance went through the floor. Maybe you can do better.

> > Lisp may always be around in some tiny niche but its use as a
> > large-scale systems development language has stopped making sense.
>
> It still makes perfect sense for AI research.  I'm not sure that
> Lisp's market share counts as "tiny".  It's certainly not huge, at
> only 0.669% according to the TIOBE metric, but that's still the 15th
> most popular language and ahead of Cobol, Fortran, Matlab, IDL, R, and
> many other languages that are still in wide use.  (Cobol is probably
> still around for legacy reasons, but that's not true for the other
> languages I mentioned.)
>

There's no particular reason why Lisp is any better for AI research
than anything else. I'm not familiar with the TIOBE metric, but I can
pretty much guarantee that, regardless of what it says, there is far
more COBOL code in the wild, being actively maintained (or at least
babysat), than there is Lisp code.

> > If you want to see something really pathetic, hang out on
> > comp.lang.forth sometime.  It's just amazing how unaware the
> > inhabitants there are of how irrelevant their language has become.
> > Lisp isn't that far gone yet, but it's getting more and more like that.
>
> Forth, eh.  To each his own, but I'd be willing to bet that most
> Forth hackers don't believe that Forth is going to make a huge
> resurgence and take over the world.  And it still has something of a
> place as the core of Postscript and maybe in some embedded systems.
>
> Re Lisp, though, there used to be a joke (which turned out to be
> false), which went, "I don't know what the most popular programming
> language will be in 20 years, but it will be called 'Fortran'".  In
> reality, I don't know what the most popular language will be called 20
> years from now, but it will *be* Lisp.
>

And everyone who still uses the language actually called Lisp will
continue to explain how that future language isn't a "real" Lisp, for
a laundry list of reasons that nobody who gets work done actually
cares about.

> |>oug


