Logging: how to suppress default output when adding handlers?

Vinay Sajip vinay_sajip at yahoo.co.uk
Tue Jun 5 18:13:15 EDT 2007


On Jun 5, 8:38 pm, Chris Shenton <c... at shenton.org> wrote:
> Yeah, I think this is the cause.  Unfortunately I'm using a couple
> dozen files and a bunch more libraries and if they're doing a logging.debug() or whatnot they're creating this.

I wouldn't have thought that well-written third party libraries (like
SQLAlchemy) would log to the root logger. It's by logging to an
application or library-specific logger that people can see where the
events are being generated - logging to the root logger means a lot of
useful information is lost.
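The difference shows up directly in the output: a format string containing %(name)s tags each record with the logger that produced it, so you can see which library an event came from. A minimal sketch of the contrast (the "sqlalchemy.engine" name is just an illustrative example):

```python
import io
import logging

# Capture log output in a string so we can inspect it.
stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter("%(name)s:%(levelname)s:%(message)s"))

root = logging.getLogger()
root.addHandler(handler)
root.setLevel(logging.DEBUG)

# A well-behaved library logs to its own named logger...
logging.getLogger("sqlalchemy.engine").debug("BEGIN")
# ...while logging to the root logger loses that context.
logging.debug("BEGIN")

print(stream.getvalue())
# sqlalchemy.engine:DEBUG:BEGIN
# root:DEBUG:BEGIN
```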

> Do you have any ideas how I can trace where the first call is made?
> This seems a newbie question but if I have a bunch of other files
> which do stuff like "from sqlalchemy import *" they might be invoking
> a logging call so I'm not sure how to chase these down.

I presume you are starting the ball rolling using one or more scripts.
As long as no logging calls are made at import time, you can configure
logging in your main script, e.g.

if __name__ == "__main__":
  logging.basicConfig(...)

>
> I'm dreading having to be so verbose with my (copious) loggers, which
> is why I was curious if there was a way to nuke any auto-created
> ones.  I thought calling logging.shutdown() before configuring my
> loggers might do this but it didn't.
>

No need to be particularly verbose. One convention is for each module
to:

# for module foo/bar/baz.py

# Near the top of the module...
import logging

logger = logging.getLogger("foo.bar.baz")

# Then, in the rest of the module...

logger.debug("Test debug output")  # no more verbose than
# logging.debug("Test debug output")

Even if you have a dozen files, a search/replace across them should
take care of the change without too much trouble.
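The dotted names also give you the logger hierarchy for free: a record logged to foo.bar.baz propagates up to handlers attached to foo (or to the root), so a single handler covers a whole package. A minimal sketch (the package names here are invented):

```python
import io
import logging

# One handler attached at the package level...
stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter("%(name)s: %(message)s"))

pkg = logging.getLogger("foo")
pkg.addHandler(handler)
pkg.setLevel(logging.DEBUG)

# ...catches records from every module logger beneath it, because
# "foo.bar.baz" has no level or handler of its own and defers to "foo".
logging.getLogger("foo.bar.baz").debug("Test debug output")

print(stream.getvalue())  # foo.bar.baz: Test debug output
```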

Regards,

Vinay Sajip




More information about the Python-list mailing list