"no variable or argument declarations are necessary."
mwm at mired.org
Mon Oct 3 21:27:47 CEST 2005
Steven D'Aprano <steve at REMOVETHIScyber.com.au> writes:
> On Mon, 03 Oct 2005 06:59:04 +0000, Antoon Pardon wrote:
> Declared variables have considerable labour costs, and only marginal
> gains. Since the steps you take to protect against other errors will also
> protect against mistyping variables, declaring variables is of
> little practical benefit.
As far as I can tell, this is as much hearsay and personal experience
as the alternate claim that not having them costs you lots of
debugging time and errors. If anyone has pointers to real research
into this area (I've heard the TRAK folks did some, but haven't been
able to turn any up), I'd love to hear it.
My gut reaction is that it's a wash. The time taken to declare
variables in well-written code in a well-designed language - meaning
the declarations and use will be close together - isn't all that
great, but neither are the savings.
The other win from declaring variables is that if you compile the code
you can make assumptions about the types of variables and thus save
doing (some of) the type determination at run time. But there are type
inferencing languages and systems - and they've been around since the
70s - that can do that for you without having to declare the
variables, so that doesn't buy you much.
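For concreteness, here is a minimal Python sketch (an editor's illustration, not from the original post) of the run-time type determination being discussed: the same function body dispatches on the operand types at every call, which is precisely the work a compiler could skip if types were declared up front or inferred.

```python
# Python determines types at run time: "+" dispatches on the actual
# operand types each time the function is called, using one bytecode.
def add(a, b):
    return a + b

print(add(1, 2))        # integer addition -> 3
print(add("ab", "cd"))  # string concatenation -> "abcd"
```

A type-inferring compiler in the ML tradition would deduce a single type for each use of such a function at compile time, without any declarations, and could then drop the per-call dispatch.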
If I'm going to get compiler support for semantic checking like this,
I want it taken to serious levels. I want function pre/post conditions
checked. I want loop and class invariants checked. I want subsumption
in my inheritance tree. Nuts - I want a complete, well-designed
inheritance tree. Duck typing is great stuff, but if I'm going to be
doing the work to declare everything, I want *everything* that can be
checked, checked.
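The pre/post condition checking asked for above can be approximated in Python today. This is a hedged sketch using hypothetical `precondition`/`postcondition` decorators built on plain assertions (the names and approach are an editor's illustration, not a real library, and nothing like compiler support):

```python
import functools

def precondition(check):
    """Assert check(*args, **kwargs) before the call."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            assert check(*args, **kwargs), "precondition failed"
            return fn(*args, **kwargs)
        return inner
    return wrap

def postcondition(check):
    """Assert check(result) after the call."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            result = fn(*args, **kwargs)
            assert check(result), "postcondition failed"
            return result
        return inner
    return wrap

@precondition(lambda n: n >= 0)
@postcondition(lambda r: r >= 1)
def factorial(n):
    # Recursive calls go through the decorated name, so the
    # contracts are re-checked at every level of the recursion.
    return 1 if n < 2 else n * factorial(n - 1)
```

A call like `factorial(-1)` fails the precondition immediately instead of recursing forever, which is the kind of checking a contract-aware compiler could do statically rather than at run time.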
Mike Meyer <mwm at mired.org> http://www.mired.org/home/mwm/
Independent WWW/Perforce/FreeBSD/Unix consultant, email for more information.