[stdlib-sig] Concurrent/Jython (was: futures - a new package for asynchronous execution)

Jesse Noller jnoller at gmail.com
Sat Nov 7 19:40:26 CET 2009

On Sat, Nov 7, 2009 at 11:37 AM, Frank Wierzbicki <fwierzbicki at gmail.com> wrote:

> If it can be done in pure Python I'd certainly be up for taking a
> crack at such a patch.  If it involves significant work with C and
> threading it might be a little out of my scope.  If pure python is
> out, I may end up implementing those parts missing in threading.py in
> Java for Jython, and then circling back to see if doing it in C for
> CPython makes sense.
> -Frank

Figured I'd start a new thread rather than overloading the existing one.

If we were to seriously go down the path of building out a concurrent
package for Python's stdlib, I think 99% of the work would be in pure
Python. Right now multiprocessing and threading expose the primitives
needed to do most of the work without diving into C.
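To illustrate (my sketch, not stdlib code): threading.Thread and
multiprocessing.Process already share the same constructor shape
(target=..., args=...), which is what makes a pure-Python pool that
takes the worker class as a parameter practical:

```python
import threading
import queue

def work(q, label):
    # Hand a result back through a queue, the pattern both modules share.
    q.put(label)

# threading.Thread and multiprocessing.Process accept the same
# target=/args= constructor arguments, so higher-level code can treat
# the worker class as a plug-in parameter.
q = queue.Queue()
t = threading.Thread(target=work, args=(q, "hello from a thread"))
t.start()
t.join()
msg = q.get()
print(msg)  # -> hello from a thread
```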

For example, the pool.py module [1] that multiprocessing ships could
easily have threads swapped in - doing so would probably change the
API from this:


class Pool(object):
    '''Class which supports an async version of the `apply()` builtin'''
    Process = Process

    def __init__(self, processes=None, initializer=None, initargs=()):
        ...


to something like this:


class Pool(object):
    '''Class which supports an async version of the `apply()` builtin'''
    worker = threading.Thread

    def __init__(self, worker_type=multiprocessing.Process,
                 workers=None, initializer=None, initargs=()):
        ...
There are some other internals that would need adjusting, but just
modifying the pool to accept the worker type to use would make it easy
to swap between threads and processes. My personal preference would be
to hide the above behind a _ prefix and expose two classes:

class ThreadPool
class ProcessPool

which just call the hidden _Pool with the appropriate worker type, but
that's a matter of taste. I think explicitness in APIs dealing with
threads and processes is preferable to highly flexible generic APIs
(the restriction with multiprocessing being that objects must be
picklable).
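A minimal sketch of that split (the names and internals are mine, not
a proposed stdlib API): a private _Pool parameterized on the worker
class, with two thin, explicit public wrappers. The demo below only
exercises ThreadPool, but ProcessPool reuses the same machinery:

```python
import threading
import multiprocessing

def _worker_loop(inbox, outbox):
    # Pull (func, args) tasks until the None sentinel arrives.
    while True:
        task = inbox.get()
        if task is None:
            break
        func, args = task
        outbox.put(func(*args))

class _Pool:
    worker_type = threading.Thread  # overridden by subclasses

    def __init__(self, workers=2):
        # multiprocessing.Queue is safe for both thread and process workers.
        self._inbox = multiprocessing.Queue()
        self._outbox = multiprocessing.Queue()
        self._workers = [
            self.worker_type(target=_worker_loop,
                             args=(self._inbox, self._outbox))
            for _ in range(workers)
        ]
        for w in self._workers:
            w.start()

    def apply_async(self, func, args=()):
        self._inbox.put((func, args))

    def get(self):
        return self._outbox.get()

    def close(self):
        for _ in self._workers:
            self._inbox.put(None)  # one sentinel per worker
        for w in self._workers:
            w.join()

class ThreadPool(_Pool):
    worker_type = threading.Thread

class ProcessPool(_Pool):
    worker_type = multiprocessing.Process

pool = ThreadPool(workers=2)
pool.apply_async(pow, (2, 10))
result = pool.get()
print(result)  # -> 1024
pool.close()
```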

Then again, passing in the worker type means less work for
IronPython/Jython - users could just be told "just accept the default
thread worker type" - they would only need to change process-based
code to remove the worker_type argument. Or Jython could swap it under
the covers.

So, my pony list would look something like:

1> Add a concurrent package
2> Put futures in concurrent
3> Refactor multiprocessing.pool into concurrent, adding deprecation
notes to the multiprocessing APIs in the docs

A simple producer/consumer implementation could also be added here
(one day).

This work should only target Python 3, imho.


[1]: http://svn.python.org/view/python/trunk/Lib/multiprocessing/pool.py?revision=74023&view=markup
