shared logs and multiple configurations
ccurvey at gmail.com
Thu Jun 8 15:17:52 CEST 2006
I've apparently tied myself up a bit using the logging package.
In my project, I have a core set of model and controller classes that
set up their logging using logging.config.fileConfig(). So far, so good.
But I use these core classes from a bunch of different places.
Sometimes from within a CherryPy server, sometimes batch jobs run from
a command line, sometimes from Windows services (also written in
Python). Ah, the beauty of OO design.
I would like to have a series of logs: a shared log for the core
classes (which would log messages from the core classes, regardless of
how they were invoked), a different one for CherryPy related stuff, a
third one for batch jobs run from the command line, a fourth for the
windows services. I'd also like to have the logs rotate automatically
using the TimedRotatingFileHandler class. Oh, and I'd like to do as
much setup as possible through the configuration files and
logging.config.fileConfig().
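For reference, a minimal fileConfig-style ini wiring one logger to a TimedRotatingFileHandler could look like the sketch below (the logger name "core", the file name core.log, and the midnight/7-backups rollover policy are illustrative choices, not anything from my actual setup):

```ini
[loggers]
keys=root,core

[handlers]
keys=coreHandler

[formatters]
keys=simple

[logger_root]
level=WARNING
handlers=

[logger_core]
level=DEBUG
handlers=coreHandler
qualname=core
propagate=0

; class is resolved relative to the logging package, so
; "handlers.TimedRotatingFileHandler" means logging.handlers.*
[handler_coreHandler]
class=handlers.TimedRotatingFileHandler
level=DEBUG
formatter=simple
args=('core.log', 'midnight', 1, 7)

[formatter_simple]
format=%(asctime)s %(name)s %(levelname)s %(message)s
```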
So my questions are:
1) Can I share log files between processes? Log messages seem to get
written, but when it comes time to roll over a log, I generally get an
"I/O operation on closed file" error. (I'm thinking that I may have to
set up a logging service if I want to do this, but I'm hoping there's a
simpler way.)
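If a logging service does turn out to be necessary, the standard-library route is to let exactly one process own the file (and its rollover) and have every other process ship records to it with a SocketHandler. A minimal sketch of the sending side, assuming a listener on localhost and reusing the "core" logger name for illustration:

```python
import logging
import logging.handlers

# Each process sends pickled LogRecords over TCP instead of touching the
# log file itself; only the listener process opens and rotates the file,
# so the "closed file" race between processes goes away.
core = logging.getLogger('core')
core.setLevel(logging.DEBUG)
sock = logging.handlers.SocketHandler(
    'localhost', logging.handlers.DEFAULT_TCP_LOGGING_PORT)
core.addHandler(sock)

# Safe to call from any process; if the listener is down, SocketHandler
# quietly drops the record and retries the connection later.
core.info('emitted from a worker process')
```

The receiving end is a small TCP server that unpickles each record and hands it to a locally configured logger with the TimedRotatingFileHandler attached; the Python logging documentation includes an example of such a listener.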
2) When I want to use logging from within a multi-threaded server
(built using the Threading module), do I create one global logger
reference, and then have each thread refer to that instance, or do I
have each thread grab its own reference using getLogger()?
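As far as I can tell, either style works, because getLogger() with a given name always hands back the same Logger object, and the logging module takes its own locks around handler I/O. A quick sketch demonstrating that per-thread getLogger() calls all see one shared instance (the logger name "core" is again just an example):

```python
import logging
import threading

results = []

def worker():
    # getLogger('core') returns the one-and-only Logger registered under
    # that name, so threads never end up with private copies.
    results.append(logging.getLogger('core'))

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Every thread got the identical object.
assert all(lg is results[0] for lg in results)
```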
3) It seems like you can't make multiple calls to
logging.config.fileConfig(). If I call this twice with different ini
files, it appears that any handlers set up in the first call are
silently dropped when the second call is made. (Strangely, log
messages sent to loggers set up in the first call just seem to
disappear.) Am I diagnosing this behavior properly? Is there a way
to work around it?
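That diagnosis matches the documented behavior: each fileConfig() call disables every logger that isn't named in the new ini file (later Python versions grew a disable_existing_loggers=False flag for exactly this, but that isn't available on 2.4). One workaround on 2.4 is to keep the per-context handler setup programmatic instead of ini-driven, since addHandler() calls never touch loggers configured earlier. A hedged sketch, with the names and rollover policy as placeholder assumptions:

```python
import logging
import logging.handlers

def add_rotating_log(name, filename):
    # Programmatic equivalent of one fileConfig handler section.
    # Calling this repeatedly for different subsystems only *adds*
    # configuration; it never disables previously set-up loggers.
    logger = logging.getLogger(name)
    logger.setLevel(logging.DEBUG)
    handler = logging.handlers.TimedRotatingFileHandler(
        filename, when='midnight', backupCount=7)
    handler.setFormatter(logging.Formatter(
        '%(asctime)s %(name)s %(levelname)s %(message)s'))
    logger.addHandler(handler)
    return logger

# Both loggers stay live, each with its own rotating file.
core = add_rotating_log('core', 'core.log')
batch = add_rotating_log('batch', 'batch.log')
```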
Platform is Python 2.4.2 on Windows.