python suitability for large critical run forever apps
bedge at troikanetworks.com
Tue Mar 20 16:53:08 CET 2001
I started using Python for some user CLI stuff. It has worked out so
well that we're thinking about extending its use in our project.
I am not experienced enough to answer some of the issues that
came up. Does anyone have any thoughts on these?
> > *data* size of each Python interpreter plus however much space
> > each object takes, versus C++'s minimal data resource requirements.
> > Any rough numbers (shared interpreter code, shared data, unshared data,
> > and object overhead)?
> > Multi-threading and mutual exclusion: are these available and reliable?
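Both are in the standard library: the `threading` module provides threads plus `Lock`, `RLock`, `Condition`, and `Semaphore` primitives (note that CPython's global interpreter lock serializes bytecode execution, so threads help more with I/O-bound concurrency than CPU parallelism). A minimal mutual-exclusion sketch:

```python
import threading

counter = 0
lock = threading.Lock()

def bump(n):
    global counter
    for _ in range(n):
        with lock:          # mutual exclusion around the shared update
            counter += 1

threads = [threading.Thread(target=bump, args=(10000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 40000: every increment happened under the lock
```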
> > Is Python a "toy" language or is it industrial
> > strength? For instance, does it facilitate recovering from exceptions
> > and leaving objects in good states (the strong guarantee)? When you
> > are low on memory, will the fact that more Python operations require
> > dynamically allocated memory make implementations more brittle? Is it
> > even possible to write a line of Python where you don't have to
> > worry about out-of-memory exceptions (the no-throw guarantee)?
> > IOW, if I'm in a low-memory situation, can I perform any operations
> > at all, even to fail gracefully?
> > Regarding large-scale apps again: is there anything written in Python
> > that "runs forever"? That is, something that doesn't have to die
> > periodically just to clean up and reset itself?
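On the runs-forever question: long-running Python services typically wrap their work in a supervisor loop that catches, logs, and continues, relying on garbage collection rather than periodic restarts. A toy sketch of the pattern (the job stream and the failure rule here are made up for illustration):

```python
import logging

logging.basicConfig(level=logging.ERROR)
log = logging.getLogger("daemon")

def handle(job):
    if job % 3 == 0:
        raise ValueError("bad job %d" % job)  # simulated per-job failure
    return job * 2

def serve(jobs):
    """Process every job; one failure must not kill the loop."""
    results = []
    for job in jobs:
        try:
            results.append(handle(job))
        except Exception:
            log.exception("job %r failed; continuing", job)
    return results

print(serve(range(1, 7)))  # [2, 4, 8, 10] -- jobs 3 and 6 failed but were survived
```

In a real daemon `serve` would loop over a socket or queue instead of a finite range; the structure, one `try`/`except` per unit of work, is the same.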