[New-bugs-announce] [issue3735] allow multiple threads to efficiently send the same requests to a processing.Pool without incurring duplicate processing

David Decotigny report at bugs.python.org
Sat Aug 30 01:07:09 CEST 2008


New submission from David Decotigny <com.d2 at free.fr>:

I posted a recipe on ASPN: http://code.activestate.com/recipes/576462/
and Jesse, cheerleader for the inclusion of (multi)processing into
python-core, suggested that it could be worthwhile to add this feature
to future Python releases.
This recipe is based on version 0.52 of the standalone "processing"
package. It avoids redundant work when multiple threads send identical
job requests to a pool of background worker processes: each distinct
request is computed only once, and all requesting threads share the
result. The recipe details the why and the how; a minimal sketch of
the idea follows.
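
For illustration, here is a minimal sketch of the idea expressed
against the multiprocessing.Pool API (the recipe itself targets
processing 0.52 and is more complete). The DedupPool name and its
exact interface are assumptions made for this sketch, not the recipe's
code: a thread that submits a (function, args) request while an
identical one is already pending shares the same AsyncResult instead
of triggering a separate worker computation.

    import threading
    import multiprocessing


    def slow_square(x):
        # Stand-in for an expensive job.
        import time
        time.sleep(1)
        return x * x


    class DedupPool(object):
        """Wrap a worker pool so that identical concurrent requests
        are computed only once (hypothetical name, for illustration).
        """

        def __init__(self, processes=None):
            self._pool = multiprocessing.Pool(processes)
            self._lock = threading.Lock()
            # (func, args) -> AsyncResult shared by all duplicate callers
            self._pending = {}

        def apply(self, func, args=()):
            key = (func, args)
            with self._lock:
                result = self._pending.get(key)
                if result is None:
                    # First caller: actually submit the job to the pool.
                    result = self._pool.apply_async(func, args)
                    self._pending[key] = result
            # Duplicate callers all block on the same pending job.
            value = result.get()
            with self._lock:
                # Safe to call repeatedly: another duplicate caller may
                # already have removed the entry.
                self._pending.pop(key, None)
            return value

With this sketch, ten threads calling pool.apply(slow_square, (3,))
concurrently trigger a single worker run and all receive its result;
a request arriving after the job has completed is computed afresh.
Keying on (func, args) assumes hashable arguments; the recipe on ASPN
is the authoritative version.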
Some notes on the implementation, though:
 - There is a "Begin/End workaround" section in the code, which works
around a limitation of processing 0.52 (see the comments and docstring
for details). I filed issue #014431 on the processing issue tracker at
BerliOS; fixing it would make this workaround unnecessary.
 - See my comment #2 on the recipe, which covers my thoughts on using
weak references.

----------
components: Library (Lib)
messages: 72170
nosy: DavidDecotigny, jnoller
severity: normal
status: open
title: allow multiple threads to efficiently send the same requests to a processing.Pool without incurring duplicate processing
type: feature request
versions: Python 2.6, Python 3.0

_______________________________________
Python tracker <report at bugs.python.org>
<http://bugs.python.org/issue3735>
_______________________________________

