[Distutils] High level options interface

Martijn Faassen M.Faassen@vet.uu.nl
Tue, 22 Dec 1998 19:00:00 +0100

John Skaller wrote:
> At 18:57 21/12/98 +0100, Martijn Faassen wrote:
> >class extension_compiler:
> >    def compile_C(filename):
> >        pass
> >    def compile_C++(filename):
> >        pass
> There's no need for a base class, just a specification.

Not much of a difference here. An abstract base class has the advantage
that you can do:

   def compile_ObscureLanguage(filename):
       print "Sorry, this language is not supported on this platform."

or something like that. But I realize below you modelled the entire
thing differently anyhow.
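
To make the base class idea concrete, a minimal sketch (all class and
method names invented here, just for illustration):

```python
# Hypothetical sketch: an abstract base class whose default methods
# report unsupported languages, so a platform's compiler subclass only
# overrides what that platform actually supports.
class ExtensionCompiler:
    def unsupported(self, filename):
        print("Sorry, this language is not supported on this platform.")

    # every language gets a default that just complains
    compile_c = unsupported
    compile_cpp = unsupported
    compile_obscure_language = unsupported

class GccCompiler(ExtensionCompiler):
    def compile_c(self, filename):
        # pretend we ran 'gcc -c filename' and produced an object file
        return filename[:-2] + ".o"

compiler = GccCompiler()
object_file = compiler.compile_c("spam.c")    # 'spam.o'
compiler.compile_obscure_language("spam.ob")  # falls through to the default
```

The subclass only has to fill in the languages it knows about; the rest
degrade gracefully.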

> Please examine the following interface and see
> a gcc implementation below that. At least criticize
> the interface if you don't like it.
[actual code snipped]

I've looked through the interface, thought about it (later on in my
post) and I have some questions (also later in my post). I notice you
took the module/application to be the central object that we're doing
things to (compiling, linking, etc). My previous interface took the
compiler as the central object instead.

I'm pondering the consequences of this difference right now. Hmm. I'm
just trying to write up some code that uses the two abstractions.

Basic scenario:

We have code written. Some of it is in Python, some in C, and some in
C++. It all needs to be compiled and linked together, on various
platforms.

The compiler centric abstraction would be used somewhat like this:

   # platform dependent initialisation code
   import platforms
   # perhaps 'platform' is instead a module like os, already
   # configured for this platform (what about cross-compilation, though?)
   platform = platforms.NormalLinux2Platform()

   # platform independent code
   c_compiler = platform.get_c_compiler()
   cpp_compiler = platform.get_cpp_compiler()
   for filename in c_filenames:
       c_compiler.compile(filename)
   for filename in cpp_filenames:
       cpp_compiler.compile(filename)
   # note that the compiler objects know that we compile to '.o' files
   # on this platform (not for instance to '.obj')
   # ? what if we need to link c and cpp objects together?
   # something needs to be done to get an actual module name

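
Filled out with stub classes (everything below is invented, just to
check that the shapes fit together), the compiler centric version might
run like this:

```python
# Stub compiler-centric sketch; no real compiler is invoked, the
# 'compile' method only derives the object file name.
class StubCompiler:
    def __init__(self, object_suffix=".o"):
        # the compiler, not the caller, knows the platform's suffix
        self.object_suffix = object_suffix

    def compile(self, filename):
        base = filename[:filename.rindex(".")]
        return base + self.object_suffix

class NormalLinux2Platform:
    def get_c_compiler(self):
        return StubCompiler(".o")

    def get_cpp_compiler(self):
        return StubCompiler(".o")

platform = NormalLinux2Platform()
c_compiler = platform.get_c_compiler()
cpp_compiler = platform.get_cpp_compiler()

objects = [c_compiler.compile(f) for f in ["foo.c", "bar.c"]]
objects += [cpp_compiler.compile(f) for f in ["baz.cpp"]]
# objects is now ['foo.o', 'bar.o', 'baz.o']
```

Note how the '.o' versus '.obj' decision lives entirely inside the
compiler object.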
The module/application centric abstraction:
   # platform dependent initialisation
   import compilers.c      # already configured for gcc on this platform
   import compilers.cpp    # already configured for g++ on this platform
   some_config = "--overdrive" # extra flags (in this case nonsense)

   # platform independent code

   module = compilers.c.python_module(some_config)
   # ignoring extra compilation flags for now
   objectnames = []
   for filename in filenames:
       objectnames.append(module.compile(filename))
   # alternatively something like:
   # objectnames = map(module.compile, filenames)

   # link it together producing module filename
   module_filename = module.link(module_name, objectnames)
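
Again filled out with stubs (invented names and behaviour), the module
centric version:

```python
# Stub module-centric sketch: the python_module object carries the
# configuration and does both compiling and linking itself.
class python_module:
    def __init__(self, extra_flags=""):
        self.extra_flags = extra_flags   # e.g. "--overdrive"

    def compile(self, filename):
        # would invoke the configured compiler here
        return filename[:filename.rindex(".")] + ".o"

    def link(self, module_name, objectnames):
        # would invoke the linker over objectnames here
        return module_name + ".so"

module = python_module("--overdrive")
objectnames = [module.compile(f) for f in ["spam.c", "eggs.c"]]
module_filename = module.link("spammodule", objectnames)
# module_filename is 'spammodule.so'
```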

Hmm. Am I using this right?

* What if you have to link C, C++, other languages together? (this is an
issue with both approaches)

* Why is the module name only provided at linking in the module centric
interface? Why not initialize the python_module object with the module
name from the start?

* There's some passing of filenames around, more in the module centric
interface. Is this necessary? Are we losing abstraction here, or is it
needed for flexibility? Passing around base names seems to be enough (or
just the .c names, stripping away .c each time you need to construct the
.o name, or whatever you need).
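
The stripping itself is trivial with os.path, for what it's worth
('object_name' is a made-up helper name):

```python
import os.path

def object_name(source_name, object_suffix=".o"):
    # strip the source suffix, attach the platform's object suffix
    base, _ext = os.path.splitext(source_name)
    return base + object_suffix

object_name("spam.c")          # 'spam.o'
object_name("eggs.cpp")        # 'eggs.o'
object_name("win.c", ".obj")   # 'win.obj'
```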

* In the module centric abstraction, the 'compile' method is rather
disconnected from the rest of the module. I can see that using the
'link' functionality we produce a nice shared library in the end, which
is what we want. But 'compile' doesn't 'feel' like it belongs to the
module.

* I'm not sure if the extra passing of compiler options around
(especially to 'compile' and to 'link') makes the thing more flexible,
or alternatively more difficult to maintain platform independently.

Perhaps a combination of the two approaches would be best. Modules
should ideally know how to compile themselves, using simple compiler
abstractions. As long as we can keep this simple enough.
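
Roughly what I have in mind for the combination (all names invented, of
course):

```python
# Sketch of the combined approach: the module object drives the build,
# but delegates per-language work to small compiler objects.
class StubCompiler:
    def compile(self, filename):
        return filename[:filename.rindex(".")] + ".o"

# a per-suffix compiler table that a platform module could provide
COMPILERS = {".c": StubCompiler(), ".cpp": StubCompiler()}

class Module:
    def __init__(self, name, sources):
        self.name = name
        self.sources = sources

    def build(self):
        # the module knows *what* to build, the compilers know *how*
        objects = []
        for source in self.sources:
            suffix = source[source.rindex("."):]
            objects.append(COMPILERS[suffix].compile(source))
        return self.link(objects)

    def link(self, objects):
        # would invoke the platform linker here
        return self.name + ".so"

module_filename = Module("spammodule", ["spam.c", "helper.cpp"]).build()
# module_filename is 'spammodule.so'
```

Mixing C and C++ sources in one module then becomes a matter of looking
up the right compiler per source file, which also touches the mixed
linking question above.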

So far my musings. What do you all think?