I have a patch for distutils-20000204.tar.gz.
patch for command/build_ext.py:
1. use correct name 'libs' instead of 'libraries' in build_ext options
2. coerce a string argument in BuildExt.libs to a list
> # because fancy_getopt assigns strings
> if type(self.libs) == StringType:
>     self.libs = [self.libs]
< libraries = build_info.get ('libraries')
> libraries = build_info.get ('libs')
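For illustration, patch item 2 boils down to this coercion (written in modern Python; the patch itself uses the 1.5-era StringType check):

```python
libs = "ssl"               # what fancy_getopt hands us: a bare string
if isinstance(libs, str):  # the patch spells this type(...) == StringType
    libs = [libs]          # build code expects a list of library names
print(libs)  # ['ssl']
```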
The second item is a suggestion.
Bug number 2 occurs in several more places in the distutils code:
whenever an option's type is a list, the fancy_getopt machinery stores a
plain string anyway.
So there are two restrictions for options:
1) An option is allowed only once and
2) an option value type has to be a string.
Neither restriction is necessary. What I want is that if I
supply multiple --libs options, each option value gets appended to a list.
In the current implementation, subsequent option calls override the old
value. If I call
# ./setup.py build_ext --libs=ssl --libs=crypto
the ssl library value gets overridden.
Here is the code snippet from fancy_getopt.py that's responsible:
attr = attr_name[opt]
setattr (object, attr, val)
Solution: instead of setattr, you should call the "set_option" method
of the object with "val" as argument.
So in the example above, this becomes
attr = attr_name[opt]
object.set_option (attr, val)
Let's look now at the set_option function. The initial implementation
is given in class Command:
def set_option (self, option, value):
    """Set the value of a single option for this command.  Raise
       DistutilsOptionError if 'option' is not known."""
    if not hasattr (self, option):
        raise DistutilsOptionError, \
              "command '%s': no such option '%s'" % \
              (self.get_command_name(), option)
    if value is not None:
        setattr (self, option, value)
This is okay if I am happy with the above-mentioned restrictions. To
extend this function, you override it in a subclass, for example in the
build_ext command:
# somewhere in the init function is this statement:
self.list_value_options = ["libs", "include_dirs"] # and some more
# now the new set_option
def set_option (self, option, value):
    if option in self.list_value_options:
        # accumulate values instead of overwriting
        if getattr (self, option):
            getattr (self, option).append (value)
        else:
            setattr (self, option, [value])
    else:
        Command.set_option (self, option, value)
Of course the above changes need some further refinement: you have to
initialize each list-valued option with an empty list, but then a
check like "self.libs is None" will fail; use "not self.libs" instead.
Then you have to ensure that every object we supply to fancy_getopt has
this set_option method.
And so on.
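Put together, the scheme might look like this runnable sketch (the Command and BuildExt classes here are simplified stand-ins for the real distutils classes, not the actual API):

```python
class Command:
    def set_option(self, option, value):
        # default behavior: one value per option, last one wins
        if not hasattr(self, option):
            raise ValueError("no such option '%s'" % option)
        if value is not None:
            setattr(self, option, value)

class BuildExt(Command):
    def __init__(self):
        self.libs = None
        self.list_value_options = ["libs", "include_dirs"]

    def set_option(self, option, value):
        # list-valued options accumulate; everything else as before
        if option in self.list_value_options:
            if getattr(self, option):
                getattr(self, option).append(value)
            else:
                setattr(self, option, [value])
        else:
            Command.set_option(self, option, value)

cmd = BuildExt()
cmd.set_option("libs", "ssl")
cmd.set_option("libs", "crypto")
print(cmd.libs)  # ['ssl', 'crypto']
```

With this in place, "./setup.py build_ext --libs=ssl --libs=crypto" would collect both libraries instead of keeping only the last one.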
What do you think of my suggestion?
The package meta-info patch I sent out a while ago includes some immature
facilities for tracking project dependencies and project compatibilities,
two ideas that I'm very interested in and would like to seed some discussion
on, in the hopes that they will be integrated in some form or another into
the distutils.
Definition of terms:
_Dependencies_ are versions of external packages that a given package
requires in order to function.
Version A of a package is _Compatible_ with version B of the same package
iff A provides the same _Public Interface_ as B: A may extend B's
interface, but it may not omit any portion of it. The _Public Interface_ of
a package includes all methods and attributes that are available and not
designated as private, either explicitly in the documentation of the
package, or implicitly through the use of standard naming conventions (i.e.
names preceded by an underscore). It also includes the documented behavior
of methods, particularly with respect to parameter types and return values.
This subject is worthy of a much longer document, but I want to keep this
brief.
_Compatible Versions_ are versions of a package that are Compatible with a
given version of the package.
I define the _Compatibility Set_ as the inverse form of Compatible Versions:
it is the set of all versions that a package is compatible with. If p-v1 is
compatible with p-v0, p-v0 is in p-v1's compatibility set, and p-v1 is in
p-v0's compatible versions.
If we track the dependencies and the compatibility sets of each package, it
should be possible to provide a fairly flexible system to determine whether or
not all of the required dependencies for a particular package are present, and
at least advise the user if they are not, perhaps allowing them to download
the required packages at the time of installation.
The problem is, a dependent may be compatible with many versions of a
particular dependency. Rather than enumerating every compatibility, it
would seem desirable to define a regular-expression like syntax for
describing sets of versions, for example:
1.20-? matches all versions with a major number of 1 and a minor
number > 20
1* matches all versions with a major number of 1
1.3-4* matches 1.3.6, 1.4.1, and 1.4, but not 1.5 or 1.40
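One way to read that grammar, sketched in Python (the parsing rules are my interpretation of the examples above, not a spec; in particular "-?" is treated as strictly greater-than, per the "> 20" wording):

```python
def component_matches(spec, value):
    # one component spec: "4", "3-4" (inclusive range), "20-?" (open-ended)
    if "-" in spec:
        lo, hi = spec.split("-")
        if hi == "?":
            return value > int(lo)  # "-?" read as "greater than", as above
        return int(lo) <= value <= int(hi)
    return value == int(spec)

def version_matches(pattern, version):
    # a trailing '*' means any further version components are accepted
    prefix_ok = pattern.endswith("*")
    if prefix_ok:
        pattern = pattern[:-1].rstrip(".")
    pparts = pattern.split(".")
    vparts = [int(p) for p in version.split(".")]
    if len(vparts) < len(pparts):
        return False
    if not prefix_ok and len(vparts) > len(pparts):
        return False
    return all(component_matches(s, v) for s, v in zip(pparts, vparts))

print(version_matches("1.3-4*", "1.4.1"))   # True
print(version_matches("1.3-4*", "1.40"))    # False
print(version_matches("1.0.5-?", "1.0.9"))  # True
```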
Here's an example of how such a system might function in a particular case:
Package A version 1.2 is installed on a system and has a compatibility set
= [ '1.0.5-?', '1.1*' ] (i.e. 1.2 is compatible with all 1.0.n versions after
1.0.5, and all 1.1 versions).
Package B version 2.0 has a dependency on package A version '1.0.9'.
In this case, the dependency is satisfied by the installed version.
This model is pretty straightforward; however, it gets worse. Sometimes a
compatibility break in a service package doesn't affect a particular
dependent: it doesn't matter to my package if you remove method foo() if I
don't use method foo(). For this reason, each dependency should be described
as a /set/ of compatible versions of a package, rather than a single
version.
Let's consider another example:
Package A version 1.0.8 is installed on a system and has a compatibility
set = [ '1.0.5-7' ]
Package A version 1.0.4 (which is not installed) has a compatibility set =
[ ... ]
Package B version 2.0 (which is not affected by the interface change between
1.0.4 and 1.0.5) has a dependency on package A versions 1.0.1 _or_ 1.0.5.
Again, the dependency is satisfied.
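The check behind both examples can be reduced to a few lines (a sketch only: the data layout is invented, and compatibility-set entries are written out as literal versions rather than the wildcard patterns they would be in practice):

```python
installed = {
    # package name -> (installed version, compatibility set)
    "A": ("1.0.8", ["1.0.5", "1.0.6", "1.0.7"]),
}

def satisfied(package, acceptable_versions):
    # a dependency names a set of acceptable versions of a package; it is
    # satisfied if the installed version either *is* one of them or
    # declares one of them in its compatibility set
    if package not in installed:
        return False
    version, compat = installed[package]
    return any(v == version or v in compat for v in acceptable_versions)

# Package B 2.0 depends on A version 1.0.1 _or_ 1.0.5:
print(satisfied("A", ["1.0.1", "1.0.5"]))  # True: 1.0.5 is in 1.0.8's set
```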
Some open questions:
- given the variations among the ways people version software, is my notion
of "version wildcarding" conceptually sound?
- is it desirable (and possible) for a package to be able to specify
wildcard dependency versions? e.g. should B be able to specify its
dependency on A as "1.0.1-?"?
- is it desirable for a package to be able to explicitly ignore the
compatibility rules of its dependencies (perhaps using wild-card version
numbers to describe _exactly_ which versions of another package it can
accept)?
michaelMuller = mmuller(a)enduden.com | http://www.cloud9.net/~proteus
There is no concept that is more demeaning to the human spirit than the
notion that our freedom must be limited in the interests of our own
good.
I was just looking at sysconfig.py again, and I'm not sure I really
like the interfaces to get_python_*(). In particular, the
plat_specific parameter seems strange, especially since it will
produce the same result in most cases.
Perhaps the Right Way To Do It(TM) would be to drop the parameter
and return a list containing all the dirs that apply. To locate
config.h, for instance, search each location in the list.
I'd also rename get_python_inc() to get_python_include(), but
perhaps that's just me. ;)
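A sketch of what the list-returning interface might look like (function names and paths here are invented for illustration, not the actual sysconfig API):

```python
import os

def get_python_include_dirs():
    # instead of a plat_specific flag, return every directory that
    # applies, ordered: platform-independent headers first, then
    # platform-specific ones (paths invented for the example)
    return ["/usr/local/include/python1.5",
            "/usr/local/include/python1.5/plat-linux2"]

def find_header(name):
    # to locate config.h etc., search each location in the list
    for d in get_python_include_dirs():
        candidate = os.path.join(d, name)
        if os.path.exists(candidate):
            return candidate
    return None
```

Callers that only care about one kind of directory can still pick it out of the list, while callers searching for a file no longer need to guess which flag to pass.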
Fred L. Drake, Jr. <fdrake at acm.org>
Corporation for National Research Initiatives
Hi all --
a question for the Windows crowd: in the current distutils MSVCCompiler
class, the 'link_static_lib()' method takes arguments for libraries and
library search directories -- this implies that, when building static
libraries under Windows, the library-building tool needs to be told
which other libraries this one depends on, and where to find them.
This is different from the Unix world, where a static library just has
dangling references that are resolved at link-time. This also differs
from the "canonical compiler interface" defined by the CCompiler class;
so, if it is needed on Windows, I'll have to change this canonical
interface.
So: is this true? Does a static library under Windows need to be built
with library and library search directories? Or is this an unnecessary
appendage that can be jettisoned?
(Yep, I'm on a simplify-the-code vendetta this weekend...)
Greg Ward - just another Python hacker gward(a)python.net
Cheops' Law: Nothing *ever* gets built on schedule or within budget.
Hi all --
as I mentioned last night, I've been hacking on the build_* commands
lately, with the goal of putting all temporary compiler by-products
under build/temp.<plat>, and allowing "in-place" building of extensions.
Everything is done now, and for the most part It Works For Me (TM). (I
just remembered that I haven't tested the new --inplace option on the
build_ext command -- arg!) Windows support isn't there yet -- I have to
hack on the MSVCCompiler class a bit, and I think I will take advantage
of this to reduce overlap and increase uniformity between UnixCCompiler
and MSVCCompiler -- ie. it'll take parallel printouts and a couple of
hours, so I'll save it for tomorrow.
Anyways, if you're on a Unix-y platform, please check out the latest
CVS and try building some distributions with extensions and/or C
libraries. NumPy and PIL are as usual the canonical examples (setup
script free with the Distutils!), but if you're feeling adventurous you
might try distutilizing some other distribution.
Oh yeah, the setup.py currently distributed with NumPy won't work with
the CVS version of Distutils; you'll have to replace it with
examples/numpy_setup.py. It was the renaming thing in command classes
that did it, y'know.
Greg Ward - Unix weenie gward(a)python.net
Never underestimate the power of human stupidity.
Hi all --
anyone keeping up with the Distutils CVS archive may have noticed these
changes, but I thought I should mention them just so everyone's up to
date.
First, the "install" command is in a state of disrepair. The
most-recent checkin was an incomplete, unfinished, aborted idea -- I
only checked it in for posterity. Don't even bother trying to use it.
Second, the "build" commands have been undergoing some upheaval. The
main changes:
* put compiler turds (object files, in particular, but also the stuff
that MSVC leaves behind on linking a DLL) in a temp directory --
eg. build/temp.linux-i586 on my machine.
* make the "is this a pure Python distribution or a distribution
with extensions?" decision at build time rather than install time
-- the effect of this is that *everything* will build to either
build/lib or build/lib.<plat>
* support building extensions right into the source tree, so that
developers hacking away on an extension don't have to screw around
with "building" Python modules or mucking with PYTHONPATH (or
the like).
I'm still working on the "build_lib" command, which I think I'm going to
rename to "build_clib", since that's what it does. (The terminology I'm
gravitating towards is that "lib" means "Python library directory",
"purelib" means "pure Python library directory", and "platlib" means
"platform-specific Python library directory". Thus it's misleading to
call the command that builds C/C++ libraries "build_lib", since it has
nothing to do with Python libraries.)
But the basic Python building stuff -- pure Python and extensions --
seems to be working. Check it out and give it a spin if you're feeling
brave. Maybe tomorrow at work I'll actually test it on a platform other
than Linux. ;-)
Greg Ward - Unix bigot gward(a)python.net
"When I hear the word `culture', I reach for my gun." --Goebbels
"When I hear the word `Microsoft', *I* reach for *my* gun." --me