[pypy-dev] pre-emptive micro-threads utilizing shared memory message passing?

Kevin Ar18 kevinar18 at hotmail.com
Thu Jul 29 02:59:21 CEST 2010


> I don't know if it can be a solution to your problem but for my Master
> Thesis I'm working on making Stackless Python distributed.

It might be of use.  Thanks for the heads up.  I do have a couple of questions:

1) Is it PyPy's stackless module or Stackless Python (stackless.com)?  Or are they the same module?
2) Do you have a non-https version of the site, or one with a publicly signed certificate?

P.S. You can send your reply over private email if you want, so as to not bother the list. :)

Date: Wed, 28 Jul 2010 15:32:38 -0400
Subject: Re: [pypy-dev] pre-emptive micro-threads utilizing shared memory message passing?
From: glavoie at gmail.com
To: kevinar18 at hotmail.com
CC: pypy-dev at codespeak.net

Hello Kevin,
     I don't know if it can be a solution to your problem, but for my Master's thesis I'm working on making Stackless Python distributed. What I did works but is not complete, and I'm now in the process of writing the thesis (in French, unfortunately). My code currently works with PyPy's "stackless" module only and uses some PyPy-specific things. Here's what I added to Stackless:

- Possibility to move tasklets easily (ref_tasklet.move(node_id)). A node is an instance of an interpreter.
- Each tasklet has its own global namespace (to avoid sharing of data). The state is also easier to move to another interpreter this way.
- Distributed channels: all requests are known by all nodes using the channel.
- Distributed objects: when a reference is sent to a remote node, the object is not copied; a reference is created using PyPy's proxy object space (see the sketch below).
- Automated dependency recovery when an object or a tasklet is loaded on another interpreter.
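The distributed objects point builds on PyPy's transparent proxies (the tputil module, available in a pypy built with transparent proxy support). As a rough sketch of that mechanism only: make_remote_ref and the "forward to the owning node" part below are made up for illustration and are not taken from the code linked further down.

    # Illustration of PyPy's transparent proxies; make_remote_ref and the
    # forwarding logic are hypothetical, only tputil.make_proxy is real.
    from tputil import make_proxy

    def make_remote_ref(obj, owner_node):
        def controller(operation):
            # A real implementation would serialize `operation` and send it
            # to owner_node; here it is simply performed locally.
            print("forwarding %r to node %s" % (operation, owner_node))
            return operation.delegate()
        return make_proxy(controller, obj=obj)

    ref = make_remote_ref([], owner_node=0)
    ref.append(42)        # every operation goes through the controller
    print(len(ref))       # -> 1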
With a proper scheduler, many tasklets could automatically be spread across multiple interpreters to use multiple cores, or across multiple computers. It is a bit like the N:M threading model, where N lightweight threads/coroutines are executed on M threads.
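Putting the pieces together, here is a hypothetical usage sketch; only tasklet.move(node_id) comes from the list above, while the node id and the way the channel reaches the remote node are assumptions.

    # Hypothetical sketch -- only tasklet.move(node_id) is from the
    # description above; REMOTE_NODE and the channel semantics are assumed.
    import stackless

    REMOTE_NODE = 1    # assumed: identifier of another interpreter (node)

    def worker(ch):
        # Runs with its own global namespace, so nothing is shared implicitly;
        # the result comes back over the (distributed) channel.
        ch.send(sum(range(1000)))

    ch = stackless.channel()
    t = stackless.tasklet(worker)(ch)
    t.move(REMOTE_NODE)     # migrate the tasklet to the remote interpreter
    print(ch.receive())     # blocks here until the worker sends its result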

The API is described here (in French, but it's pretty straightforward): https://w3.mutehq.net/wiki/maitrise/API_DStackless

The code is available here (just click on the Download link next to the trunk folder): https://w3.mutehq.net/websvn/wildchild/dstackless/trunk/

You need a pypy-c built with --stackless. The code is a bit buggy right now, though...
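For anyone unfamiliar with the stackless module itself, the plain (single-interpreter) API that such a pypy-c exposes is the usual tasklet/channel one, which the distributed features above extend. A standard example, not taken from the linked code:

    # Plain stackless usage, runnable on a pypy-c built with --stackless;
    # this is the standard tasklet/channel API the distributed layer extends.
    import stackless

    def ping(ch, n):
        for i in range(n):
            ch.send("ping %d" % i)   # blocks until a receiver is ready
        ch.send(None)                # sentinel: no more messages

    def pong(ch):
        while True:
            msg = ch.receive()       # blocks until a sender is ready
            if msg is None:
                break
            print(msg)

    ch = stackless.channel()
    stackless.tasklet(ping)(ch, 3)
    stackless.tasklet(pong)(ch)
    stackless.run()                  # run the scheduler until all tasklets finish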

