[New-bugs-announce] [issue1241] subprocess.py stdout of childprocess always buffered.

Jason Kim report at bugs.python.org
Fri Oct 5 22:09:30 CEST 2007

New submission from Jason Kim:


I am currently using subprocess.py (2.4.4 and above) to try to have a
portable way of running a subtask on Linux and Windows.

I ran into a strange problem - a program runs and is "timed out",
but the subprocess's stdout and stderr are not fully "grabbed".
So I've been trying various ways to force a "flush" of the child
process's stdout and stderr so that at least partial results can be saved.

The "problem" app being spawned off is very simple:
#include <stdio.h>

int main() {
  int i = 0;
  for (i = 0; i < 1000; ++i) {
    printf("STDOUT boo %d\n", i);
    fprintf(stdout, "STDOUT sleeping %d\n", i);
    fprintf(stderr, "STDERR sleeping %d\n", i);
    /* fflush(stdout); */
  }
  return 0;
}
i.e. it just dumps its output to both stdout and stderr. The issue that
I am seeing is that no matter what options I pass to Popen(), the ONLY
output I see from the executed process is the "STDERR sleeping" lines,
UNLESS I uncomment the fflush(stdout) line in the application.

Executing the script with python -u doesn't seem to help either.
Now, if the task completes normally, then I am able to grab the entire
stdout and stderr produced by the subprocess. The issue is that I can't
seem to grab the partial output for stdout, and there does not seem to
be a way to make the file descriptors returned by pipe() unbuffered.
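For what it's worth, the closest I have gotten to partial capture is reading the pipes incrementally with select() instead of blocking in communicate(). This is only a sketch: it is POSIX-only (select() does not accept pipe handles on Windows), and it still cannot unbuffer the child's stdio - that buffering lives inside the child's C library, so a fully-buffered child only delivers data when its buffer fills or it exits:

```python
import os
import select
import subprocess
import sys

# Spawn a child (here a trivial Python one-liner standing in for the C app)
# and drain its stdout/stderr pipes as data arrives, instead of waiting
# for communicate() to return everything at once.
proc = subprocess.Popen(
    [sys.executable, "-c", "import sys; sys.stdout.write('partial\\n')"],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE)

collected = []
fds = [proc.stdout, proc.stderr]
while fds:
    # Block until at least one pipe has data (or has hit EOF).
    ready, _, _ = select.select(fds, [], [])
    for f in ready:
        chunk = os.read(f.fileno(), 4096)
        if not chunk:
            fds.remove(f)       # EOF: child closed this end
        else:
            collected.append(chunk)
proc.wait()
```

Even so, the chunks only show up as fast as the child's stdio hands them to the pipe.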

So the question is: what is the preferred method of forcing the pipe()
file descriptors created by Popen.__init__() to be fully unbuffered?

Second, is there a better way of doing this?
i.e. a portable way to spawn off a task, with an optional timeout, grab
any partial results from the task's stdout and stderr, and grab the
return code from the child task?
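Concretely, the shape I keep reaching for is something like the sketch below. It is hedged: run_with_timeout is just a name I'm using here, os.kill() with SIGKILL is POSIX-only, and a Windows version would need win32api.TerminateProcess instead:

```python
import os
import signal
import subprocess
import sys
import threading

def run_with_timeout(cmd, timeout):
    """Run cmd; kill it after `timeout` seconds; return (rc, stdout, stderr).

    POSIX-only sketch: the kill step uses os.kill()/SIGKILL.
    """
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)
    # Arm a timer that hard-kills the child if it overstays its welcome.
    timer = threading.Timer(timeout,
                            lambda: os.kill(proc.pid, signal.SIGKILL))
    timer.start()
    try:
        # communicate() returns whatever made it into the pipes,
        # even if the child was killed mid-run.
        out, err = proc.communicate()
    finally:
        timer.cancel()          # child finished (or was killed); disarm
    return proc.returncode, out, err

rc, out, err = run_with_timeout(
    [sys.executable, "-c", "print('done')"], timeout=5)
```

The remaining gap is exactly the buffering problem above: if the child is killed before its stdio buffer is flushed, the unflushed portion is lost.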

Any hints and advice will be greatly appreciated.

Thank you.

The relevant snippet of Python code is:

import os
import threading
import time
from signal import *
from subprocess import PIPE, Popen

class task_wrapper:
    def run(s):
        if s.timeout > 0:
            # print "starting timer for", s.timeout
            s.task_timer = threading.Timer(s.timeout,
                                           task_wrapper.cleanup, [s])
            s.task_timer.start()
        s.task_start_time = time.time()
        s.task_end_time = s.task_start_time
        s.subtask = Popen(s.cmd, bufsize=0, env=s.env,
                          stdout=PIPE, stderr=PIPE)
        s.task_out, s.task_err = s.subtask.communicate()

    def kill(s, subtask):
        """Attempt a portable way to kill things."""
        # print "killing", subtask.pid
        # print "s.subtask.stdout.fileno()=", s.subtask.stdout.fileno()
        # print "s.subtask.stderr.fileno()=", s.subtask.stderr.fileno()
        if os.name == "posix":
            os.kill(subtask.pid, SIGKILL)
        elif os.name == "nt":
            import win32api
            win32api.TerminateProcess(subtask._handle, 9)

    def cleanup(s, mode="TIMEOUT"):
        if s.task_result is None:
            if mode == "TIMEOUT":
                s.msg(""" Uhoh, subtask took too long""")

messages: 56247
nosy: jason.w.kim
severity: normal
status: open
title: subprocess.py stdout of childprocess always buffered.
type: behavior
versions: Python 2.4, Python 2.5

Tracker <report at bugs.python.org>
