[BangPypers] How to create Debug and Release code in Python

Jeff Rush jeff at taupro.com
Mon Jun 15 10:15:53 CEST 2009


Vishal wrote:
> 
> *Is there a way to create a conditionally compilable Python
> script?* Some facility that would prevent code from getting compiled
> into ".pyc"....something like the old #ifdef #endif preprocessor
> directives....is there a Python preprocessor available :)

The way to create conditionally compiled Python is to use Python. ;-)
You can place if-then statements in places you might not expect if you
are coming from something like C/C++.

if PRODUCTION:
    def function():
        pass    # lean production version
else:
    def function():
        pass    # instrumented debug version

The if-then is executed only once at the time the module is imported,
minimizing the overhead cost.  If you place the if-then -inside- a
function, then of course you pay the price each time the function is
called, which you expressed a wish to avoid.
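
For contrast, this is the shape to avoid, where the flag is re-tested on
every call rather than once at import (a minimal sketch, with PRODUCTION
assumed to be a module-level boolean you define yourself):

def function():
    # The flag is checked each time the function runs.
    if PRODUCTION:
        pass    # production behaviour
    else:
        pass    # debug behaviour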

You can also use this with the methods of a class:

class SomeClass:

    if PRODUCTION:
        def __init__(self):
            pass    # production initializer
    else:
        def __init__(self):
            pass    # debug initializer

This check, too, runs only once, when the class body executes at module
import time, not each time the class is instantiated or its methods are
called.

For some kinds of changes it can be useful to apply decorators
conditionally, either method decorators:

class SomeClass:
    def methodA(self):
        pass
    def methodB(self):
        pass
    if not PRODUCTION:
        methodA = DebuggingWrapper(methodA)
        methodB = DebuggingWrapper(methodB)

Or class decorators:

class SomeClass:
    def methodA(self):
        pass
    def methodB(self):
        pass
if not PRODUCTION:
    SomeClass = DebuggingWrapper(SomeClass)
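
DebuggingWrapper here is just a stand-in name.  A minimal sketch for the
method form, assuming all you want is a trace of each call, could be:

import functools

def DebuggingWrapper(func):
    # Print each call before delegating to the real function.
    @functools.wraps(func)
    def wrapped(*args, **kw):
        print 'calling %s' % func.__name__
        return func(*args, **kw)
    return wrapped

For the class form you generally want a wrapper that modifies the class
and returns it, rather than replacing the class with a plain function,
as the memoizer example below does.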

An example of a wrapper for a class could be:

import inspect

def memoize_get_methods(klass):
    # Wrap every method whose name starts with 'get' in the memoizer below.
    for _, method in inspect.getmembers(klass, inspect.ismethod):
        if method.__name__.startswith('get'):
            setattr(klass, method.__name__, memoized_method(method))
    return klass    # so it works in the SomeClass = wrapper(SomeClass) style

Such a wrapper can manipulate or register selected methods, functions
or attributes once, at import time.  An example is a memoizer: when you
call a method again with the same arguments, it remembers what it
returned last time and hands that back instead of re-calculating the
result.

def memoized_method(method):
    def wrapped(self, *args, **kw):
        # Build a hashable key from both positional and keyword arguments.
        key = args + tuple(sorted(kw.items()))
        try:
            return method.cache[key]
        except KeyError:
            result = method(self, *args, **kw)
            method.cache[key] = result
            return result
    method.im_func.cache = {}    # attach the cache to the underlying function (Python 2)
    return wrapped
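
Tying the two together (the class and the values below are invented
purely for illustration):

class DataSource:
    def get_record(self, key):
        print 'computing', key    # stands in for an expensive calculation
        return key * 2

DataSource = memoize_get_methods(DataSource)

ds = DataSource()
ds.get_record(21)    # computed and cached on the first call
ds.get_record(21)    # answered from the cache on the second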

These are just some ideas of what is possible since I'm not familiar
with what kind of changes your developers experiment with, whether
high-level (which these approaches are good for) or low-level (where
#ifdef works better).

I gave a talk at PyCon 2009 about namespaces and code blocks that uses
diagrams to illustrate the difference in Python between import time and
run time.  You can find the video, slides and tools used at:

    http://us.pycon.org/2009/conference/schedule/event/7/

> If that's doable I can ask them to make all experimental modifications
> within the conditional directives. I could still ask them to do that
> using simple ifs and bool parameters; however, putting too many ifs
> might degrade the performance of these scripts. ("the lesser the
> branches the better it is for processor performance...more so for
> superscalar processors")

The "less branches the better" concept is not really relevant for an
interpreted language like Python.  The interpreter is doing a lot of
subroutine calling and branching for you as it executes each bytecode in
software.  Less branching is only relevant when the code is executed in
hardware.

You will see a performance overhead in Python if if-then clauses are
sprinkled through the bodies of functions and methods, simply because
they are executed each time control passes through that point.  Quite
apart from performance, it is best from a maintainability viewpoint not
to do that: many programming books, including some of the C/C++
philosophy books, discuss how such small pieces of debug-specific logic
become hard to remove later without introducing bugs.
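
If you want to see the size of that overhead for yourself, timeit gives
a rough picture; the absolute numbers vary by machine, only the
comparison matters (a quick sketch, the names are invented):

import timeit

setup = """
PRODUCTION = True

def checked():
    if PRODUCTION:    # flag tested on every call
        return 1
    return 2

def plain():
    return 1
"""

print timeit.Timer('checked()', setup).timeit(1000000)
print timeit.Timer('plain()', setup).timeit(1000000)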


> We cannot use the, often unknown, __debug__ flag, because our users
> would like to simply double-click on a Python file, which means that
> to use the -O or -OO options (which set __debug__ to False) we'd have
> to add them to the Windows Python command line, and then
> double-clicking would be of no use if you want to run the stuff inside
> the __debug__ blocks.

I'm confused - the "production" users want to click on an icon to start
the program, but they don't need the "debug" flag, so that isn't a
problem.  Your developers, however, run from the command line, so they
-can- specify a debug or other flag at the time they invoke it.  Both
are happy, so what is the problem with using the __debug__ flag again?
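
For reference, the usual __debug__ pattern is the same import-time
switch as above (trace here is just an invented example); running with
-O or -OO flips it to the production branch and also strips assert
statements:

if __debug__:    # True by default, False under -O and -OO
    def trace(msg):
        print 'DEBUG:', msg
else:
    def trace(msg):
        pass

# python script.py      -> the printing trace() is used
# python -O script.py   -> the no-op trace() is used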

-Jeff

