[issue19575] subprocess.Popen with multiple threads: Redirected stdout/stderr files still open after process close

Bernt Røskar Brenna report at bugs.python.org
Wed Nov 13 22:02:53 CET 2013


New submission from Bernt Røskar Brenna:

Running the following task using concurrent.futures.ThreadPoolExecutor works with max_workers == 1 and fails when max_workers > 1:

import os
import shutil
import subprocess
import tempfile

def task():
    dirname = tempfile.mkdtemp()
    # Write handles are passed to the child for redirection;
    # read handles are kept open to inspect the output.
    f_w = open(os.path.join(dirname, "stdout.txt"), "w")
    f_r = open(os.path.join(dirname, "stdout.txt"), "r")
    e_w = open(os.path.join(dirname, "stderr.txt"), "w")
    e_r = open(os.path.join(dirname, "stderr.txt"), "r")

    with subprocess.Popen("dir", shell=True, stdout=f_w, stderr=e_w) as p:
        pass

    f_w.close()
    f_r.close()
    e_w.close()
    e_r.close()

    shutil.rmtree(dirname)

When max_workers > 1, shutil.rmtree raises: "The process cannot access the file because it is being used by another process".

See also this Stack Overflow question about what seems to be a similar problem: http://stackoverflow.com/questions/15966418/python-popen-on-windows-with-multithreading-cant-delete-stdout-stderr-logs  The discussion on SO suggests this might be an XP-only problem.

The attached file reproduces the problem on my Windows XP VM.
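A possible workaround, sketched below under the assumption that the leak comes from one thread's inheritable file handles being duplicated into a sibling process spawned concurrently by another thread: serialize process creation with a lock so that only one Popen call is in flight at a time. The command run here is a portable placeholder rather than the original shell "dir":

```python
import os
import shutil
import subprocess
import sys
import tempfile
import threading
from concurrent.futures import ThreadPoolExecutor

# Only one thread may create a child process at a time, so no other
# Popen call can inherit this thread's open write handle.
popen_lock = threading.Lock()

def task():
    dirname = tempfile.mkdtemp()
    with open(os.path.join(dirname, "stdout.txt"), "w") as f_w:
        with popen_lock:
            p = subprocess.Popen([sys.executable, "-c", "print('hello')"],
                                 stdout=f_w)
        p.wait()
    shutil.rmtree(dirname)  # no stray handle should keep the file open
    return not os.path.exists(dirname)

with ThreadPoolExecutor(max_workers=4) as ex:
    results = list(ex.map(lambda _: task(), range(8)))
```

Note that since Python 3.4 (PEP 446), file descriptors and Windows handles created by Python are non-inheritable by default, which addresses this class of race without a lock.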

----------
components: Library (Lib), Windows
files: repro.py
messages: 202779
nosy: Bernt.Røskar.Brenna
priority: normal
severity: normal
status: open
title: subprocess.Popen with multiple threads: Redirected stdout/stderr files still open after process close
versions: Python 3.3
Added file: http://bugs.python.org/file32602/repro.py

_______________________________________
Python tracker <report at bugs.python.org>
<http://bugs.python.org/issue19575>
_______________________________________