[New-bugs-announce] [issue23278] multiprocessing maxtasksperchild=1 + logging = task loss

Nelson Minar report at bugs.python.org
Tue Jan 20 03:02:52 CET 2015

New submission from Nelson Minar:

I have a demonstration of a problem where the combination of multiprocessing with maxtasksperchild=1 and the Python logging library causes tasks to occasionally get lost. The bug might be related to issue 22393 or issue 6721, but I'm not certain. Issues 10037 and 9205 may also be relevant. I've attached sample code; it can also be found at https://gist.github.com/NelsonMinar/022794b6a709ea5b7682

My program uses Pool.imap_unordered() to execute 200 tasks. Each worker task writes a log message and sleeps a short time. The master process uses a timeout on next() to log a status message occasionally.

When it works, all 200 jobs complete quickly. When it breaks, roughly 195 of the 200 jobs complete and next() never raises StopIteration.

If everything logs to logging.getLogger() and maxtasksperchild=1, it usually breaks. It appears that some jobs simply get lost and never complete. We've observed that with maxtasksperchild=1, a new worker process is sometimes created but never assigned any work. When that happens the task queue never runs to completion.

If we log straight to stderr or don't set maxtasksperchild, the run completes.

The bug has been observed in Python 2.7.6 and Python 3.4.0 on Ubuntu 14.04.

This is a distillation of much more complex application-specific code. Discussion of the bug and original code can be found at


Thank you, Nelson

components: Library (Lib)
files: bug-demo.py
messages: 234336
nosy: nelson
priority: normal
severity: normal
status: open
title: multiprocessing maxtasksperchild=1 + logging = task loss
type: behavior
versions: Python 2.7, Python 3.4
Added file: http://bugs.python.org/file37781/bug-demo.py

Python tracker <report at bugs.python.org>
