reading multiple files

Mag Gam magawake at
Fri Sep 10 04:35:06 CEST 2010

Thanks for your response.

I was going by this thread; it makes
you wonder whether it's even possible.

I will try your first suggestion and create the FIFOs with mkfifo.

On Thu, Sep 9, 2010 at 9:19 PM, Alain Ketterlin
<alain at> wrote:
> Mag Gam <magawake at> writes:
>> I have 3 files which are constantly being updated therefore I use tail
>> -f /var/log/file1, tail -f /var/log/file2, and tail -f /var/log/file3
>> For one file I can manage with
>> tail -f /var/log/file1 | python
>> where looks like this:
>> import sys
>> f = sys.stdin
>> for line in f:
>>     print line
>> But how can I get data from /var/log/file2 and /var/log/file3 ?
> Use shell tricks, e.g., with bash:
> yourpythonprog <(tail -f .../file1) <(tail -f .../file2) <(...)
> and let your prog open its three parameters like regular files (they are
> fifos actually). If your shell doesn't support <(...), create the fifos
> and redirect tail output before launching your prog.
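[Editor's note: the reader side of Alain's process-substitution suggestion can be sketched as follows. This is a minimal illustration, not code from the thread; it assumes the FIFO (or file) paths arrive as command-line arguments and simply reads each in turn, which only suits the one-file-at-a-time case.]

```python
import sys

def read_files(paths):
    """Read each path in turn, yielding its lines.

    With FIFOs created by <(tail -f ...) each open() blocks until the
    writer produces data, and one file is drained before the next is
    touched -- so this naive version does not interleave the three logs.
    """
    for path in paths:
        with open(path) as f:
            for line in f:
                yield line

if __name__ == "__main__":
    # e.g. yourpythonprog <(tail -f .../file1) <(tail -f .../file2)
    for line in read_files(sys.argv[1:]):
        print(line, end="")
```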
> If you want "purer" python, launch the three "tail -f" with subprocess,
> and use the select module to get input (you didn't explain the logic you
> will follow to track three files---you may not need select if you expect
> one line from each file before waiting for the next line of any).
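[Editor's note: the "purer Python" route Alain describes -- spawn three `tail -f` processes with subprocess and multiplex their pipes with select -- might look like the sketch below. The `cmd` parameter is an illustrative convenience, not part of the thread; with the default it runs `tail -f`.]

```python
import select
import subprocess

def follow(paths, cmd=("tail", "-f")):
    """Spawn cmd for each path and yield (path, line) pairs as lines
    arrive on any of the pipes, using select to avoid blocking on a
    quiet file while another one is active."""
    procs = {}
    for path in paths:
        p = subprocess.Popen(list(cmd) + [path],
                             stdout=subprocess.PIPE, text=True)
        procs[p.stdout.fileno()] = (path, p)
    while procs:
        # Wait until at least one pipe has data (or EOF) to report.
        ready, _, _ = select.select(list(procs), [], [])
        for fd in ready:
            path, p = procs[fd]
            line = p.stdout.readline()
            if not line:        # the child exited; stop watching it
                p.wait()
                del procs[fd]
                continue
            yield path, line
```

With `tail -f` the generator never ends (the children never exit), so the consumer is expected to loop over it forever.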
>> I prefer a native python way instead of doing tail -f
> Emulating tail will require a lot of stat/seeks, and finding lines will
> require an additional level of complexity.
> Also, tail -f has a cost [*]. The only way to avoid it is to use
> inotify, which seems to have a python interface, available at
> (I've never used it). Again, emulating
> tail -f with inotify is significant work.
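[Editor's note: a rough picture of what emulating `tail -f` in pure Python involves -- the seek-and-poll loop Alain alludes to. This is a sketch only; it ignores file truncation and log rotation, which is exactly the extra complexity he warns about. The `from_start` flag is an illustrative addition, not from the thread.]

```python
import os
import time

def tail_follow(path, interval=1.0, from_start=False):
    """A minimal pure-Python 'tail -f': seek to the end of the file,
    then poll for new lines, sleeping between attempts.

    Rotation/truncation handling (re-stat the path, reopen on inode
    change) is deliberately omitted."""
    with open(path) as f:
        if not from_start:
            f.seek(0, os.SEEK_END)      # start at the current end
        while True:
            line = f.readline()
            if line:
                yield line
            else:
                time.sleep(interval)    # no new data yet; poll again
```

The polling interval is the cost Alain mentions: too short burns CPU, too long delays lines, which is why inotify-based notification is attractive.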
> -- Alain.
> [*] Paul Rubin is one of the authors, I think he reads this group.

More information about the Python-list mailing list