Python's "only one way to do it" philosophy isn't good?

Douglas Alan doug at alum.mit.edu
Thu Jun 28 05:37:18 CEST 2007


"Chris Mellon" <arkanes at gmail.com> writes:

> On 6/27/07, Douglas Alan <doug at alum.mit.edu> wrote:

>> The C++ folks feel so strongly about this, that they refuse to provide
>> "finally", and insist instead that you use destructors and RAII to do
>> resource deallocation.  Personally, I think that's taking things a bit
>> too far, but I'd rather it be that way than lose the usefulness of
>> destructors and have to use "with" or "finally" to explicitly
>> deallocate resources.

> This totally misrepresents the case. The with statement and the
> context manager is a superset of the RAII functionality.

No, it isn't.  C++ allows you to define smart pointers (one of many
RAII techniques), which can use refcounting or other tracking
techniques.  Refcounting smart pointers are part of Boost and have
made it into TR1, which means they're on track to be included in the
next standard library.  One needn't have waited for Boost, though,
since a refcounting smart pointer can be implemented in about a page
of code.

The standard library also has auto_ptr, a different sort of smart
pointer, which allows for somewhat fancier RAII than the purely
scope-based variety.

> It doesn't overload object lifetimes, rather it makes the intent
> (code execution upon entrance and exit of a block) explicit.

But I don't typically wish for this sort of intent to be made
explicit.  TMI!  I used "with" for *many* years in Lisp, since this is
how non-memory resource deallocation has been dealt with in Lisp since
the dawn of time.  I can tell you from many years of experience that
relying on Python's refcounter is superior.

Shouldn't you be happy that there's something I like more about Python
than Lisp?

> Nobody in their right mind has ever tried to get rid of explicit
> resource management - explicit resource management is exactly what you
> do every time you create an object, or you use RAII, or you open a
> file.

This just isn't true.  For many years I have not had to explicitly
close files in Python.  Nor have I had to do so in C++.  They have
been closed for me implicitly.  "With" is not implicit -- or at least
not nearly as implicit as was previous practice in Python, or as is
current practice in C++.
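A minimal sketch of the implicit style being described, assuming
CPython's refcounting behavior (other implementations such as Jython
may defer the close until a GC pass):

```python
import os
import tempfile

# Under CPython, when the last reference to a file object disappears,
# the refcounter destroys it immediately: the file is flushed and
# closed with no explicit close() and no "with" statement.
path = os.path.join(tempfile.mkdtemp(), "demo.txt")

def write_greeting(path):
    f = open(path, "w")
    f.write("hello")
    # No close(): f's refcount hits zero on return, so CPython
    # closes (and flushes) the file right here.

write_greeting(path)

with open(path) as f:   # the explicit style, for contrast
    print(f.read())     # prints "hello" -- the implicit close flushed it
```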

> *Manual* memory management, where the tracking of references and
> scopes is placed upon the programmer, is what people are trying to
> get rid of and the with statement contributes to that goal, it
> doesn't detract from it.

As far as I am concerned, memory is just one resource amongst many,
and the programmer's life should be made easier in dealing with all
such resources.

> Before the with statement, you could do the same thing but you
> needed nested try/finally blocks

No, you didn't -- you could just encapsulate the resource acquisition
into an object and allow the destructor to deallocate the resource.
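A sketch of that encapsulation, with illustrative names (nothing here
is a real library API), relying on CPython's refcounting to run the
destructor as soon as the last reference goes away:

```python
# Instead of nested try/finally blocks, wrap the acquisition in an
# object and let its destructor release the resource.
class ManagedResource:
    def __init__(self, name, released):
        self.name = name
        self.released = released    # list we append to, to observe release

    def __del__(self):
        # Runs when the last reference disappears -- immediately,
        # under CPython's refcounting.
        self.released.append(self.name)

released = []

def use_resources():
    a = ManagedResource("a", released)
    b = ManagedResource("b", released)
    # ... work with a and b; no try/finally needed ...
    # On return, both destructors fire and the resources are released.

use_resources()
print(sorted(released))   # both resources have been released
```

(The relative order of the two destructor calls isn't something to
rely on, which is why the example sorts before printing.)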

> RAII is a good technique, but don't get caught up on the
> implementation details.

I'm not -- I'm caught up in the loss of power and elegance that will
be caused by deprecating the use of destructors for resource
deallocation.

> The with statement does exactly the same thing, but is actually
> superior because
>
> a) It doesn't tie the resource management to object creation. This
> means you can use, for example, with lock: instead of the C++ style
> Locker(lock)

I know all about "with".  As I mentioned above, Lisp has had it since
the dawn of time.  And I have nothing against it, since it is at times
quite useful.  I'm just dismayed at the idea of deprecating reliance
on destructors in favor of "with" for the majority of cases when the
destructor usage works well and is more elegant.
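For reference, the "with lock:" form mentioned above looks like this;
the lock is released on block exit even if the body raises, just as a
C++-style Locker object's destructor would release it:

```python
import threading

lock = threading.Lock()
counter = 0

def bump():
    global counter
    with lock:          # acquired here, released when the block exits
        counter += 1

threads = [threading.Thread(target=bump) for _ in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # prints 10
```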

> b) You can tell whether you exited with an exception, and what that
> exception is, so you can take different actions based on error
> conditions vs expected exit. This is a significant benefit, it
> allows the application of context managers to cases where RAII is
> weak. For example, controlling transactions.

Yes, for the case where you want fancy handling of exceptions raised
during resource deallocation, "with" is superior -- which is why it is
good to have in addition to the traditional Python mechanism, not as a
replacement for it.
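A sketch of the transaction case: __exit__ receives the exception (if
any), so a context manager can commit on success and roll back on
failure -- a distinction a plain destructor cannot make.  The
Transaction class here is illustrative, not a real database API:

```python
class Transaction:
    def __init__(self):
        self.state = "open"

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        # exc_type is None on a clean exit, and the exception's
        # class when the body raised.
        if exc_type is None:
            self.state = "committed"
        else:
            self.state = "rolled back"
        return False   # don't swallow the exception

ok = Transaction()
with ok:
    pass               # clean exit: committed

bad = Transaction()
try:
    with bad:
        raise ValueError("boom")
except ValueError:
    pass               # exception propagated: rolled back

print(ok.state, bad.state)   # prints "committed rolled back"
```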

>> Right, but that doesn't mean that 99.9% of the time, the programmer
>> can't immediately tell that cycles aren't going to be an issue.

> They can occur in the most bizarre and unexpected places. To the point
> where I suspect that the reality is simply that you never noticed your
> cycles, not that they didn't exist.

Purify tells me that I know more about the behavior of my code than
you do: in large C++ programs using refcounted smart pointers, I have
*never* had a memory leak caused by cycles in my data structures that
I didn't know about.

> And if you think you won't need it because python will get "real" GC
> you're very confused about what GC does and how.

Ummm, I know all about real GC, and I'm quite aware that Python has
had it for quite some time now.  (Though, last I checked, the
implementation is rather different from what it would be in a
language that didn't also have refcounted GC.)

> A generic threadsafe smart pointer, in fact, is very nearly a GIL.

And how's that?  I should think that modern architectures have an
efficient way of atomically adding to and subtracting from an int.
If they don't, I have a hard time seeing how *any* multi-threaded
application is going to make good use of multiple processors.
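For illustration, here is a sketch of the "threadsafe smart pointer"
being debated: a refcount guarded by a per-object lock.  The question
at issue is whether a hardware atomic add/sub could replace that lock;
pure Python exposes no atomic int, so a Lock stands in here, and all
names are illustrative:

```python
import threading

class SharedHandle:
    """A hypothetical refcounted handle to some shared resource."""

    def __init__(self, resource, release):
        self._resource = resource
        self._release = release   # called when the count hits zero
        self._count = 1
        self._lock = threading.Lock()

    def incref(self):
        with self._lock:          # the contested per-object lock
            self._count += 1

    def decref(self):
        with self._lock:
            self._count -= 1
            last = self._count == 0
        if last:
            self._release(self._resource)

released = []
h = SharedHandle("conn", released.append)
h.incref()                # a second owner appears
h.decref()                # one owner lets go: count back to 1
h.decref()                # last owner lets go: resource released
print(released)           # prints ['conn']
```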

> Get cracking then. You're hardly the first person to say this.
> However, of the people who say it, hardly anyone actually produces
> any code and the only person I know of who did dropped it when
> performance went through the floor. Maybe you can do better.

I really have no desire to code in C, thank you.  I'd rather be coding
in Python.  (Hence my [idle] desire for macros in Python, so that I
could do even more of my work in Python.)

> There's no particular reason why Lisp is any better for AI research
> than anything.

Yes, there is.  It's a very flexible language that can adapt to the
needs of projects that push the boundaries of what computer
programmers typically do.

> I'm not familiar with the TIOBE metric, but I can pretty much
> guarantee that regardless of what it says there is far more COBOL
> code in the wild, being actively maintained (or at least babysat)
> than there is lisp code.

I agree that there is certainly much more Cobol code being maintained
than there is Lisp code, but that doesn't mean that there are more
Cobol programmers writing new code than there are Lisp programmers
writing new code.  Only a madman would begin a new project in Cobol.

>> Re Lisp, though, there used to be a joke (which turned out to be
>> false), which went, "I don't know what the most popular programming
>> language will be in 20 years, but it will be called 'Fortran'".  In
>> reality, I don't know what the most popular language will be called 20
>> years from now, but it will *be* Lisp.

> And everyone who still uses the language actually called Lisp will
> continue to explain how it isn't a "real" lisp for a laundry list of
> reasons that nobody who gets work done actually cares about.

And where are you getting this from?  I don't know anyone who claims
that any commonly used dialect of Lisp isn't *really* Lisp.

|>oug
