On Thu, 2007-09-20 at 11:26 -0700, Ed Suominen wrote:
Here I go again touting my AsynQueue package. Sorry, but it just seems to be a very appropriate solution to many of the problems being raised recently.
Well, to be fair it's an excellent bit of code.
I've recently added a "processworker" module that does just what it sounds like. You can now queue up jobs to be run on a separate Python interpreter. If the interpreter crashes due to a segfault or anything else, you just construct a new worker instance and attach it to the queue, and the jobs continue merrily along.
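(AsynQueue's real API is Deferred-based and Twisted-specific; the snippet below is only a rough synchronous sketch of the "worker crashed, hire a new one" idea described above, with hypothetical names, not AsynQueue's actual interface.)

```python
import subprocess

def run_job(code):
    """Run a snippet of Python source in a fresh interpreter and return
    its stdout. If the child segfaults or otherwise dies, only the child
    is lost -- the caller can simply retry the job on a new interpreter."""
    proc = subprocess.run(
        ["python3", "-c", code],
        capture_output=True, text=True, timeout=30,
    )
    if proc.returncode != 0:
        # The worker died; a real queue would construct a new worker
        # and re-assign the pending jobs to it.
        raise RuntimeError("worker died: %r" % proc.stderr)
    return proc.stdout

# Jobs queued as source strings, each run in its own interpreter:
jobs = ["print(2 + 2)", "print('ok')"]
results = [run_job(j) for j in jobs]
```

The key property is that jobs survive any single interpreter's death: state lives in the queue, not in the worker.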
Interesting. I see that for the process worker jobs you pass in a Python string. One of the conceptual difficulties I've always had with creating a farm of subprocesses is ensuring the module import status would be valid, so that you could pass a function and class instances across the pickle boundary (or whatever). Did you consider this approach?
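(For anyone following along, the import-status difficulty comes from how pickle handles functions: it serializes only a reference, so the receiving interpreter must be able to import the defining module. A small demonstration:)

```python
import pickle

def greet():
    return "hello"

# Pickling a function stores only its module and qualified name,
# never the function's code:
payload = pickle.dumps(greet)
assert b"greet" in payload       # the name travels...
assert b"return" not in payload  # ...the body does not

# Unpickling re-imports the module and looks the name up, so a
# subprocess can only unpickle `greet` if it can import the module
# that defines it. Passing source strings sidesteps that entirely.
restored = pickle.loads(payload)
assert restored() == "hello"
```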
In addition to deferred-based priority queuing, the queue object has powerful capabilities for hiring and firing workers, letting workers resign when they can't perform their duties any more, assigning tasks to appropriate workers, and re-assigning tasks from terminated workers.
See http://tinyurl.com/349k2o (http://foss.eepatents.com/AsynQueue/browser/projects/AsynQueue/trunk/asynque...)
By the way, AsynQueue (without the new processworker stuff) is now available in Debian testing, thanks to the efforts of Eric Evans. Just apt-get install python-asynqueue.
Best regards, Ed
_______________________________________________ Twisted-Python mailing list Twisted-Python@twistedmatrix.com http://twistedmatrix.com/cgi-bin/mailman/listinfo/twisted-python