Python vs. C#

Heiko Wundram heikowu at ceosg.de
Tue Aug 12 12:17:55 EDT 2003


On Tue, 2003-08-12 at 06:45, Brandon J. Van Every wrote:
> Yes your rendering code is nice looking.  Is it fast?  Were you working on a
> problem where it needed to be fast?  I haven't been using C++ out of love.
> I've been using it for performance.  And I don't think "what I can do in 72
> hours" is a valid test for big industrial system architectures.  Is your
> code going to hold up in the face of dozens of programmers banging on it,
> and hundreds or thousands of programmers using it?  And still be fast?

Yes, I was working on problems where the rendering code needed to be
fast. A simple "import psyco" laid to rest my earlier fears about
lagging behind, performance-wise, the people I had to compete with.
And is an interpreted language really such a major concern in the age
of 2000+ MHz processors? The final goal of the project was a 3D
first-person shooter built on the engine we had written, and while it
didn't run blazingly fast, it ran fast enough on the machine I tested
it on.
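
Just to give an idea, this is roughly all the "tuning" amounted to
(psyco only works on x86, and transform_vertices() below is a made-up
stand-in for one of the inner rendering loops, not code from my
engine):

def transform_vertices(vertices, matrix):
    # pure-Python hot loop: multiply every (x, y, z) vertex by a 3x3 matrix
    out = []
    for x, y, z in vertices:
        out.append((
            x * matrix[0][0] + y * matrix[0][1] + z * matrix[0][2],
            x * matrix[1][0] + y * matrix[1][1] + z * matrix[1][2],
            x * matrix[2][0] + y * matrix[2][1] + z * matrix[2][2],
        ))
    return out

try:
    import psyco
    psyco.bind(transform_vertices)   # JIT-specialize just this hot function...
    # psyco.full()                   # ...or let psyco compile whatever it can
except ImportError:
    pass                             # no psyco -> plain CPython speed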

If I had had the time, I'd have pushed most of the rendering code
down into C and wrapped it with SWIG or the like, but I was surprised
by how fast it ran even as plain Python with psyco. And I don't
really want to express game logic, such as computer-player AI, in
C++. Do you?

(genetic algorithms are plain cool here...)
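
To make that concrete, here's a toy GA loop of the kind that stays
readable in Python; the three-weight "genome" and the fitness
function are invented purely for illustration, not taken from our
actual game:

import random

def fitness(genome):
    # pretend the ideal bot weights (aggression, caution, accuracy) are known
    target = (0.7, 0.2, 0.9)
    return -sum((g - t) ** 2 for g, t in zip(genome, target))

def crossover(a, b):
    return tuple(random.choice(pair) for pair in zip(a, b))

def mutate(genome, rate=0.1):
    return tuple(g + random.uniform(-rate, rate) for g in genome)

population = [tuple(random.random() for _ in range(3)) for _ in range(20)]
for generation in range(100):
    population.sort(key=fitness, reverse=True)
    parents = population[:5]                  # keep the fittest bots
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(len(population) - len(parents))]
    population = parents + children

best = max(population, key=fitness)           # the evolved AI parameters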

Just by the way, my code held up to other programmers banging on it,
and it also held up to heavier usage. Having static typing alone
doesn't mean the code survives an "evil programmer" (casting is lots
of fun...). And because my code was understandable, the other
programmers could actually read it before using my library. That is
impossible with an overly complex C++ library, in my eyes.

> It is?  Then I'm confused, because around here people keep talking about the
> beauty of avoiding types.

static typing = you have to declare a type for every slot
(slots == variables).
dynamic typing = a slot can hold an object of any type (Python's
slots only hold references; everything is a reference).
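
In Python terms (just a toy illustration):

slot = 42            # the name "slot" refers to an int
slot = "forty-two"   # now it refers to a str
slot = [4, 2]        # now a list -- no declaration needed anywhere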

strong typing = types are checked at the point of use. If you use an
object of the wrong type (as with an expression like (float*)&"1234"
in C, which makes no sense at all), you get an error instead of the
bits being silently reinterpreted.
weak typing = types aren't checked on conversion, so things like
(float*)&"1234" go through.

Python is certainly strongly typed, but unlike C it is dynamically
typed. People also describe C as statically but weakly typed, since
you can cast anything into anything else (without checks), which you
can't do in Python. C++ is (or was) the same, at least when I learnt
it.

> What you are saying is Python excels at prototyping, where speed and
> flexibility are paramount.  You are not saying that Python excels as a big
> system architecture language, where stability and safety are paramount.

Aren't a renderer and a game engine a big system architecture? I
didn't just prototype it in Python; I delivered a complete, very
extensible system. I could never have managed that in any other
language without the usual headaches.

> That "except" is, like, 1/2 to 2/3 of industry.  I think you Python guys
> need to wake up that there's a much bigger, scarier, and more threatening
> world out there than the UNIX world of "engineering done right."  That world
> is also not sitting still, it's on the move.  For interoperability of
> langauges, UNIX has nothing to offer like the .NET Framework.  DirectX is
> now ahead of OpenGL on vertex/pixel shader API capability and stability.
> The black hole, if not taken seriously, will swallow you.  Either that or
> you're forced into "clone and conquer" in reverse.

I don't really know why you compare OpenGL to DirectX; they are just
API specifications. Maybe DirectX is faster than OpenGL on your
Windows box (and offers support for more tasks), but that is a
driver-vendor problem: most graphics drivers on Windows are developed
with DirectX in mind, not OpenGL. NVidia is, IMHO, an honorable
exception, as their OpenGL path sometimes runs faster than their
DirectX one.

And as a side note: Unix has always had its own ".NET Framework".
The libraries .NET tries to standardize have been available on Unix
for a long time. I also don't especially like the thought of having
to choose a single framework for everything I need, as that only
makes me dependent on the original implementor (Microsoft, in this
case). And if you need a "portable" virtual machine, Python, Java,
and Perl (just to name a few) have offered that for ages. .NET isn't
as big an innovation as it claims to be...

Heiko.





