Newbie: splitting dictionary definition across two .py files

Karthik Gurusamy kar1107 at
Fri Mar 31 22:46:22 CEST 2006

Ben Finney wrote:
> kar1107 at writes:
> > I'm fairly new to python. I'd like to define a big dictionary across
> > two files and use it in my main file.
> >
> > I want the definition to go into those two files.
> That sounds like a very confusing architecture, and smells very much
> like some kind of premature optimisation. What leads you to that
> design? It's very likely a better design can be suggested to meet your
> actual requirements.

I work in a source tree with hundreds of .c files. The makefile builds
the various .o files from the .c files, which takes a while on the
first run. When I make a change to a .c file, I need to recompile to
get the .o and quickly fix any compilation errors. I don't want to go
through the make infrastructure, as it takes a while to resolve the
dependencies.

So effectively I'm writing a Python script as a poor man's makefile.
On the first run of make, I capture the complete stdout. Then I use
another Python script to grep through the makefile's log and find the
exact gcc command used to build foo.o from its foo.c. That script also
captures the current working directory; make is kind enough to print
lines like 'Entering directory /blah/blah/path/to/binary'. The script
writes its results to stdout, which I redirect to a file.

With this I can rebuild a .o file in about 0.2 seconds, where the
makefile easily takes about 20 seconds; that's a 100x speedup.

I'm interested in only a few dozen .o files that I manage, so I run
the script to generate a dictionary of the form:

target_db['foo.o'] = {
    'cmd_cwd': r'/blah/blah/path/to/binary',
    'cmd_str': r'/path/to/gcc <tons of options> -o obj-xyz/foo.o ../blah/foo.c',
    #'redirect': 1,
    # I can add any other flags I may think of
    # In fact I'm planning to make cmd_str a list so that I can
    # run a series of commands
}
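
The log-scraping step that generates entries of this shape can be
sketched roughly as below, assuming make was run with -w so it prints
the 'Entering directory' lines. The regexes and the sample log are
illustrative, not the poster's actual script:

```python
import re

def parse_make_log(log_text):
    """Scan a captured make log for gcc commands and the directory
    each one ran in, keyed by the .o file each command produces."""
    target_db = {}
    cwd = None
    for line in log_text.splitlines():
        # make -w prints lines like: Entering directory `/path/to/dir'
        m = re.search(r"Entering directory [`']([^']+)'", line)
        if m:
            cwd = m.group(1)
            continue
        # Look for a compile command that produces a .o file
        m = re.search(r"\S*gcc\b.*-o\s+(\S+\.o)\b", line)
        if m:
            obj = m.group(1).split('/')[-1]  # strip any obj-xyz/ prefix
            target_db[obj] = {'cmd_cwd': cwd, 'cmd_str': line.strip()}
    return target_db

log = """\
make[1]: Entering directory `/blah/blah/path/to/binary'
/path/to/gcc -O2 -Wall -o obj-xyz/foo.o ../blah/foo.c
"""
db = parse_make_log(log)
```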

In my main script, based on the target I give on the command line
(e.g. foo.o), I find the dictionary entry and, using popen2.Popen3,
execute the corresponding cmd_str.
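
For the execution step, a minimal sketch: popen2.Popen3 was the Python
2.x API of the day; subprocess.Popen (available since Python 2.4) does
the same job and also takes the working directory directly via cwd.
The db entry below is a hypothetical stand-in using echo instead of a
real gcc command line:

```python
import subprocess

def build_target(target, target_db):
    """Look up a target (e.g. 'foo.o') and run its recorded command
    from the directory make originally ran it in."""
    entry = target_db[target]
    proc = subprocess.Popen(entry['cmd_str'], shell=True,
                            cwd=entry['cmd_cwd'],
                            stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT)
    out, _ = proc.communicate()
    return proc.returncode, out.decode()

# Hypothetical entry: echo stands in for the real gcc command line
db = {'hello.o': {'cmd_cwd': '/tmp', 'cmd_str': 'echo compiling hello.o'}}
rc, out = build_target('hello.o', db)
```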

Now for every new source tree I pull, or whenever the makefile
changes, I want to rerun the script to generate a new dictionary.

I found the script can also be used to automate other tasks, say
pulling a source tree; in general, running any command (or list of
commands) that you would issue from the shell prompt. These other
tasks' dictionary entries are static; they don't change when the
makefile options change. That is the reason I want to split the
dictionary contents across two files: then I only have to regenerate
one of them every time the makefile changes or I use a new source
tree.

I'm not really worried about optimizations at this time; I just want a
cleaner solution to my problem.
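
One common way to get that split is to keep each half of the
dictionary in its own module and merge them with dict.update() in the
main script. A single-file sketch (the module names and entries here
are hypothetical):

```python
# In practice each dictionary lives in its own .py file, e.g.:
#   static_targets.py    -> defines static_db (hand-maintained)
#   generated_targets.py -> defines generated_db (rewritten by the
#                           log-parsing script when the makefile changes)
# and the main script does:
#   from static_targets import static_db
#   from generated_targets import generated_db

static_db = {
    'pull-tree': {'cmd_cwd': '/tmp', 'cmd_str': 'echo pulling a source tree'},
}

generated_db = {
    'foo.o': {'cmd_cwd': '/tmp', 'cmd_str': 'echo rebuilding foo.o'},
}

# Merge the two halves into the single dictionary the rest of the
# script uses for target lookups.
target_db = {}
target_db.update(static_db)
target_db.update(generated_db)
```

Only the generated module needs rewriting when the makefile changes;
the static entries stay untouched.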

> --
>  \     "If you can't annoy somebody there is little point in writing." |
>   `\                                                  -- Kingsley Amis |
> _o__)                                                                  |
> Ben Finney

More information about the Python-list mailing list