REPOST: Re: Python and Ruby: a comparison

Edward Diener eldiener at earthlink.net
Wed Jan 2 17:12:21 EST 2002


Alex Martelli wrote:

> "Edward Diener" <eldiener at earthlink.net> wrote in message
> news:3C322751.9070404 at earthlink.net...


>>I still feel this leads to language subsets and people who learn a
>>computer language and only use, and know, a portion of that language. It
>>
> 
> I think your feelings are misleading you on this subject.  Language
> subsets &c are inevitably caused by languages being "larger" than what
> is needed by some substantial group of people, and happen for every
> language with reasonably-long history and wide use.
> 
> Some languages just accept that: they AIM to be big and cover a lot
> of bases, thus subsetting is taken in stride.  C++, Perl, O'CAML, Common
> Lisp: nobody has 100% of them in their everyday "active vocabulary",
> and very few users indeed reach 100% even in "passive vocabulary"
> (portion of the language that is understood if met in code being
> read, even though not normally used in code produced by the person).
> 
> But even languages which aim to be small, unless they are highly
> restricted, end up subsetted anyway.  Few Haskell users routinely
> write their own monads: much can be done in Haskell without the
> very substantial conceptual jump needed to fully grasp monads and
> author your own, thus, enter subsetting.  Very few Python users
> routinely write their own metaclasses: ditto ditto.  It's anything
> but rare even for a highly productive and competent Python coder
> to have problems grasping metaclasses even in "passive vocabulary"
> terms.  And, why not?  The average Pythonista does not need them,
> they're a hard step up on the conceptual ladder, so WHY bemoan their
> lack from either active or passive vocabularies of Pythonistas?


You make a cogent argument and I generally agree that many programmers 
do not use a large part of their respective programming languages.


> 
> Subsetting becomes even more prevalent as soon as you accept some
> "standard library" features as part of a language.  If you never
> do (e.g.) CGI web programming, why should you care about that
> subset of a language, or its libraries, that exists strictly for
> the benefit of CGI authors?  Yet the language (cum libraries) is
> better for having those modules too, widening its usability and
> applicability (and similarly for monads, metaclasses, etc etc).


I make a distinction between libraries and language features. Not using a
library involves a totally different mindset from not using a language
feature, even though both kinds of "not using" may be a logical result of
the programmer's project(s) and are certainly valid in many given
situations.


> 
> It's certainly mistaken to think that a totally new phenomenon
> (the introduction of .NET Framework and its CLR) "leads to" an
> old, existing, and inevitable one.  Post hoc does not necessarily
> mean propter hoc, but the reverse implication IS indeed necessary
> (as long as time's arrows don't start flipping around randomly,
> causes MUST come before effects).


The scenario I have experienced most goes like this:

Programmer A, let's say that is me, uses a feature which has been added to
a language to make it easier and more elegant to use. The language still
supports earlier constructs which offer the same functionality, but in a
more contorted way that generally impedes good design and ease of coding
( however one may define good design ). Programmer B has been nurtured on
some subset of the same language which does not support the added feature
or, if it does, deprecates it in favor of other, more homogenized goals.
Programmer A works with Programmer B and has to design and implement with
Programmer B. When Programmer A sees the way Programmer B uses some element
of the language and tries to point out a more advanced way of doing
something, Programmer B says either:

1) That construct is not a part of the language, or
2) I have seen that, but Software Corporation X doesn't implement that
feature in their subset, so I certainly don't have any time or inclination
to learn to use it, or
3) Yes, I know all about it, but I have always done things this way because
very few implementations support this new feature, so I haven't bothered to
learn it.

> 
> 
>>has already happened before .NET was even created with VC++, where a
>>great many programmers who know only MS's subset of C++ and even some of
>>its incorrect syntax, assume they are using C++ effectively and
>>correctly and often they are not.
>>
> 
> Extremely similar phenomena prevailed even before Mr Gates knew how
> to tell a bit from a byte: most scientists I met in the '70s, who
> thought they knew and were using Fortran effectively and correctly,
> were just as sadly mistaken -- they actually knew and used (and at
> times with far from optimal efficiency) some specific dialect of the
> Fortran language as supplied by, e.g., Digital Equipment, or IBM, or
> some other purveyor yet (or rather, almost invariably, some specific
> SUBSET of that specific dialect; few people who learned Fortran on
> boxes where you could write, e.g., a literal constant as 'CIAO', ever
> knew that the standard way of writing it was 4HCIAO, or could easily
> recognize the latter form; just as one example...).


I am not blaming MS personally for this, but huge and successful
companies carry much weight, and very often they do not have the technical
merit of software ideas in mind when they promote their solutions to
programming problems.


> 
> 
> 
>>Managed C++ in .NET is also an
>>abortion of C++ which only a computer programming masochist could love.
>>
> 
> The ability to play havoc with pointers and memory management in C++,
> while inevitably and inextricably part of that language, is hardly a
> plus for a vast majority of the application uses to which C++ is (maybe
> inappropriately) put on a daily basis.  Doing away with that is the
> single highest factor in productivity enhancement when moving, e.g., to
> Java.


The language which you ( and so many others ) identify as C++, with its
ability to "play havoc with pointers and memory management", has been
superseded over the past seven years by a language whose modern constructs
make it nearly impossible to have these problems. The fact that many
so-called C++ programmers do not want to use these very simple and
elegant constructs, preferring instead the more error-prone techniques still
supported by the language for backward compatibility, does not prove that
these problems still exist for practiced programmers in the language.
Java's doing away with those problems is a red herring, since these
problems no longer exist for professional C++ programmers.
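
A minimal sketch of what I mean by those constructs ( my own illustration,
using nothing beyond the 1998 standard library ): the containers and string
class manage their own storage, so ordinary application code never touches
new or delete at all.

#include <iostream>
#include <string>
#include <vector>

// A hypothetical record type: no raw pointers, no manual allocation.
struct Person {
    std::string name;   // std::string manages its own memory
    int age;
};

int main() {
    std::vector<Person> people;          // the vector owns its elements
    Person p = { "Alice", 35 };          // aggregate initialization
    people.push_back(p);                 // growth is handled automatically

    for (std::vector<Person>::size_type i = 0; i != people.size(); ++i)
        std::cout << people[i].name << " is " << people[i].age << '\n';

    return 0;
}   // everything is released here: no delete, no leaks, no dangling pointers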

Nor am I sold on productivity enhancement when moving to Java simply 
because you claim it. And I am a Java programmer also. Productivity is 
not just the spewing forth of the maximum lines of code in the minimum 
time. But I am sure you know that.


> 
> Yet the loss of templates (generic programming) hurts productivity most
> grievously (when imposed upon programmers who have learned to make good
> use of templates, of course).  I believe that, today still, "Managed
> C++" is the only .NET language that lets you use templates (as the
> architects of .NET seem to share with those of Java a horrible blind spot
> regarding generic programming -- I keep hearing that Java is due to gain
> Generic Programming features any day now, but I still can't see them in
> Javasoft's released SDKs).


I am guessing that the "generic programming" of Java and C# will be 
created to allow flexibly specified algorithms to operate against any 
"collection" or parts of a "collection" of objects.


> 
> Much as I might prefer "C# with templates" or whatever, therefore, I'm
> quite liable to choose "Managed C++" today if tasked to develop some
> largish subsystem in and for .NET (given that no "Python .NET" is at
> hand: Python programming, as it hinges on signature based polymorphism,
> just like templates, gives similar productivity advantages to templates
> in even smoother and more general ways, of course).


Save me! No, if I do .NET I will be using C# ( aka Microsoft's version of 
Java ), not some managed abortion.


> 
> 
>>I anticipate this happening with most .NET versions of languages and
>>that many programmers of these languages will only know and use the .NET
>>subset and not the full language.
>>
> 
> If most programmers will indeed start eschewing "unmanaged" memory access
> for application-level programming, this will no doubt reduce bugs and
> increase productivity.  But giving up on extremely low-level features
> (unsuitable for application programming needs) is something that basically
> only touches on C++ and similar system-level languages, since application
> oriented languages don't offer those anyway (not the sensible ones!-).


Being "sensible" in programming is not my inclination. I prefer 
creativity at its expense. That is why I heavily prefer Python over 
sensible and pragmatic Perl or safe and limited Javascript ( or VBScript  ).


> 
> Apart from "unmanaged memory access" issues, I totally disagree with your
> thesis.  If a given programmer or programming shop wants the semantics
> of C#/VB.NET, they will mostly be using C# or VB.NET depending on syntax
> sugar tastes.  When somebody goes to the trouble of using the .NET versions
> of, say, APL, or Haskell, or Mercury, it definitely will NOT be just in
> order to get peculiar syntax sugar on the same semantics: rather, it will
> be because SOME supplementary features of those respective languages are
> of interest (maybe array operations, typeclasses, backtracking
> respectively).
> 
> It will therefore be an EXTREMELY RARE phenomenon for programmers to be
> using "strange" (non C#/VB.NET) languages in the .NET versions and "only
> know and use the .NET subset", assuming that, by the latter, you mean the
> semantics supported "at the interface between separate components" by CLR.


I hope you are right, but I anticipate too many full-blooded languages 
becoming watered down and weak from bathing in the .NET stream. Have I not 
already seen posts on this NG about Python .NET?

Eddie



