python logging multiple processes to one file (via socket server)

Gelonida N gelonida at gmail.com
Thu Oct 27 06:09:29 EDT 2011


Hi,

I have a rather 'simple' problem:
log from multiple processes to the same file AND be sure that no
log message is lost.




1.) Log multiple processes to one file:
------------------------------------------

I have a Python program that I want to add logging to, but which
forks several times.

Due to the forking, logging to a file with the default
logging.FileHandler seems out of the question.


It seems that I could use a SocketHandler in each process, plus a
socket server that collects the records from all processes and logs
them to one file.


Does anybody have a working example?
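On the client side I imagine something like this (a minimal sketch,
assuming a receiver is already listening on the default logging port;
'myapp' is just a placeholder name):

import logging
import logging.handlers
import os

def setup_logging():
    # call this in each process *after* forking, so that every
    # process gets its own connection to the log server
    root = logging.getLogger()
    root.setLevel(logging.DEBUG)
    root.addHandler(logging.handlers.SocketHandler(
        'localhost', logging.handlers.DEFAULT_TCP_LOGGING_PORT))

setup_logging()
logging.getLogger('myapp').info('hello from pid %d', os.getpid())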


2.) Ensure that no log message is lost.
------------------------------------------
If I understood the SocketHandler documentation correctly, it will
silently drop messages if the log server is not reachable.


However, for my current application I would prefer that it abort if
it cannot connect to the socket, and that it block if the log server
doesn't handle the sent data fast enough.

Is this possible?
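
To make the intent concrete, here is the kind of subclass I'm
imagining (an untested sketch that relies on SocketHandler's
internals; StrictSocketHandler is just a name I made up):

import logging.handlers

class StrictSocketHandler(logging.handlers.SocketHandler):
    # raise instead of silently dropping records
    def emit(self, record):
        # no try/except around send(): errors propagate instead of
        # being routed to handleError() and ignored
        self.send(self.makePickle(record))
        if self.sock is None:
            # the stock send() swallows socket errors and just sets
            # self.sock to None, which means the record was dropped
            raise RuntimeError('log server unreachable; record dropped')

As far as I can tell, the blocking part should come for free: send()
uses sendall() on a blocking TCP socket, so once the server stops
reading and the buffers fill up, the logging call simply stalls.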

Thanks a lot in advance.


What I found so far:
http://docs.python.org/howto/logging-cookbook.html#logging-to-a-single-file-from-multiple-processes

It states:
"The following section documents this approach in more detail and
includes a working socket receiver which can be used as a starting point
for you to adapt in your own applications."

Somehow I have a mental block, though, and fail to see the
'following section'.
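
Based on the cookbook's general approach, I assume the receiver
boils down to roughly this (my own condensed, untested sketch;
'combined.log' is just a placeholder):

import cPickle   # 'pickle' on Python 3
import logging
import logging.handlers
import SocketServer   # 'socketserver' on Python 3
import struct

class LogRecordStreamHandler(SocketServer.StreamRequestHandler):
    # unpickle each incoming LogRecord and hand it to the local
    # logging machinery, which writes everything to one file
    def handle(self):
        while True:
            # each record is preceded by its length as 4 big-endian bytes
            header = self.connection.recv(4)
            if len(header) < 4:
                break
            slen = struct.unpack('>L', header)[0]
            data = self.connection.recv(slen)
            while len(data) < slen:
                data += self.connection.recv(slen - len(data))
            record = logging.makeLogRecord(cPickle.loads(data))
            logging.getLogger(record.name).handle(record)

class LogRecordSocketReceiver(SocketServer.ThreadingTCPServer):
    allow_reuse_address = True

if __name__ == '__main__':
    logging.basicConfig(
        filename='combined.log',
        format='%(asctime)s %(process)d %(name)s %(levelname)s %(message)s')
    server = LogRecordSocketReceiver(
        ('localhost', logging.handlers.DEFAULT_TCP_LOGGING_PORT),
        LogRecordStreamHandler)
    server.serve_forever()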


I also found http://code.google.com/p/python-loggingserver/ and ran
first tests.

However, it seems that this server stops logging my application's
messages after about 128 log entries (the number varies and is not
always exactly 128), whereas the console logger keeps going.

I'm not really sure why, and would prefer to start from a simpler
example anyway.


Thanks in advance for any code example, idea, link, comment.




