multiple processes, private working directories
kar1107 at gmail.com
Thu Sep 25 05:14:43 CEST 2008
On Sep 24, 6:27 pm, Tim Arnold <a_j... at bellsouth.net> wrote:
> I have a bunch of processes to run and each one needs its own working
> directory. I'd also like to know when all of the processes are finished.
> (1) First thought was threads, until I saw that os.chdir was process-wide.
> (2) Next thought was fork, but I don't know how to signal when each
> child is finished.
> (3) Current thought is to break the process from a method into a
> script; call the script in separate threads. This is the only way I can see
> to give each process a separate dir (external process fixes that), and I can
> find out when each process is finished (thread fixes that).
> Am I missing something? Is there a better way? I hate to rewrite this
> as a script since I've got a lot of object metadata that I'll have to
> regenerate with each call of the script.
Use subprocess; it supports a cwd argument to provide the given
directory as the child's working directory.
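To see the effect, here is a minimal sketch: the child process prints its own current directory, which matches the cwd= value rather than the parent's directory (the temporary directory is just a stand-in for your real working dirs).

```python
import os
import subprocess
import sys
import tempfile

# A throwaway directory to act as the child's working directory
# (any existing directory would do).
workdir = tempfile.mkdtemp()

# The child reports its own cwd; because cwd= is set, it starts in workdir,
# not in the parent's current directory.
out = subprocess.check_output(
    [sys.executable, '-c', 'import os; print(os.getcwd())'],
    cwd=workdir, universal_newlines=True)

print(out.strip())
```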
Help on class Popen in module subprocess:
 |  Methods defined here:
 |  __init__(self, args, bufsize=0, executable=None, stdin=None,
 |      stdout=None, stderr=None, preexec_fn=None, close_fds=False,
 |      shell=False, cwd=None, env=None, universal_newlines=False,
 |      startupinfo=None, creationflags=0)
 |      Create new Popen instance.
You want to provide the cwd argument above.
Then once you have launched all your n processes, run thru' a loop
waiting for each one to finish.
# cmds is a list of dicts providing details on what processes to run
# and what each one's cwd should be
runs = []
for c in cmds:
    run = subprocess.Popen(c['cmd'], cwd=c['cwd'])  # ..... etc
    runs.append(run)
# Now wait for all the processes to finish
for run in runs:
    run.wait()
Note that if any of the processes generates a lot of stdout/stderr, you
will get a deadlock in the above loop (the child blocks once the pipe
buffer fills while the parent is blocked in wait()). Then you may want to
go for threads, or use run.poll() and do the reading of the output yourself.
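One deadlock-free variant is to run each process from its own thread and let communicate() drain the pipes while waiting. A sketch, assuming the same cmds list of {'cmd': ..., 'cwd': ...} dicts as above (the commands here are placeholders):

```python
import subprocess
import sys
import threading

# Placeholder commands; substitute your real ones.
cmds = [
    {'cmd': [sys.executable, '-c', 'print("hello from a")'], 'cwd': '.'},
    {'cmd': [sys.executable, '-c', 'print("hello from b")'], 'cwd': '.'},
]

results = {}

def run_one(index, spec):
    # communicate() reads stdout/stderr to EOF while waiting for the
    # child, so the child can never block on a full pipe buffer.
    proc = subprocess.Popen(spec['cmd'], cwd=spec['cwd'],
                            stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE,
                            universal_newlines=True)
    out, err = proc.communicate()
    results[index] = (proc.returncode, out, err)

threads = [threading.Thread(target=run_one, args=(i, spec))
           for i, spec in enumerate(cmds)]
for t in threads:
    t.start()
for t in threads:
    t.join()   # all processes are done once every thread has joined
```

After the joins, results maps each command's index to (returncode, stdout, stderr), so you know both when everything finished and how each run went.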
> thanks for any suggestions,
> --Tim Arnold