[issue8426] multiprocessing.Queue fails to get() very large objects

Vinay Sajip report at bugs.python.org
Sat Aug 27 17:13:03 CEST 2011


Vinay Sajip <vinay_sajip at yahoo.co.uk> added the comment:

I think it's just a documentation issue. The problem with documenting limits is that they are system-specific and, even if the current limits that Charles-François has mentioned are documented, these could become outdated. Perhaps a suggestion could be added to the documentation:

"Avoid sending very large amounts of data via queues, as you could come up against system-dependent limits according to the operating system and whether pipes or sockets are used. You could consider an alternative strategy, such as writing large data blocks to temporary files and sending just the temporary file names via queues, relying on the consumer to delete the temporary files after processing."

----------
nosy: +vinay.sajip

_______________________________________
Python tracker <report at bugs.python.org>
<http://bugs.python.org/issue8426>
_______________________________________
