Build bugs in Python 2.2.1?
jonathan at onegoodidea.com
Sat Aug 10 15:38:02 CEST 2002
On 10/8/2002 12:11, in article j4wuqzarzu.fsf at informatik.hu-berlin.de,
"Martin v. Löwis" <loewis at informatik.hu-berlin.de> wrote:
> Jonathan Hogg <jonathan at onegoodidea.com> writes:
>> Being intentional doesn't necessarily make it correct. Adding /usr/local to
>> the library/include path will only enable modules which happen to have
>> dependencies in that place, and in the worst case may result in linking to
>> entirely the wrong libraries.
> This is all correct. However, it so happens that this is a common
> case: people do have libraries in /usr/local, as this is the autoconf
> default prefix.
Which is why it should search the install prefix for libraries. This is
the standard procedure with most autoconf-style packages. If someone uses
the default install then everything will be found successfully, but you get
the added benefit that if you install under /opt instead, everything will
also be found.
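In concrete terms, the idea is just that the prefix handed to configure would
seed the module search paths as well. A minimal sketch (the prefix below is a
made-up example, not what Python's build actually does today):

```shell
# Derive the module search paths from the configure prefix rather than
# hard-coding /usr/local. PREFIX is a hypothetical install location.
PREFIX=/opt/python-2.2.1
CPPFLAGS="-I$PREFIX/include"
LDFLAGS="-L$PREFIX/lib"
echo "$CPPFLAGS $LDFLAGS"
```

With that in place, a default `./configure` and a `./configure --prefix=/opt/...`
behave identically as far as finding dependencies goes.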
>> The module build process should honour the same conventions that the core
>> build process does (and indeed all other autoconf-style packages do).
> Why is that?
Consistency, mainly. Pretty much all the packages I manage I can set up
entirely from configure. This means I can use the same procedure across them
all by just varying the configure options. If I have to modify a source file
in order to configure a package then I have to record that patch. When a new
version of Python comes out, the patch will likely no longer work and I will
have to reconstruct what I did last time and create a new patch.
Banging everything into /usr/local might be fine if all you're managing is
the machine you sit at, but when you manage a whole site with multiple
architectures, you need something a little more flexible.
I frequently have to deal with multiple versions of installed libraries,
such as SSL, zlib, etc. To do this, it is pretty important that I can
specify which version to use explicitly and have that recorded into the
executable via an appropriate rpath. With most of the packages I use, this
can be rather conveniently done with something like:
% ./configure --with-ssl=/path/to/ssl/install
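Under the hood, a `--with-ssl=/path` option typically just expands to header
and library search paths plus an rpath baked into the executable, along these
lines (the versioned deploy directory is a hypothetical example):

```shell
# What a --with-ssl=/path option typically expands to: include and
# library search paths, plus an rpath recorded in the built binary
# so the right version is found at run time. Path is hypothetical.
SSL_PREFIX=/opt/openssl-0.9.6
CPPFLAGS="-I$SSL_PREFIX/include"
LDFLAGS="-L$SSL_PREFIX/lib -Wl,-rpath,$SSL_PREFIX/lib"
echo "$LDFLAGS"
```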
All the libraries I install end up in different, versioned, deploy
directories. This means that the configure options become quite complex.
This isn't a problem though, because they are managed for me automagically
by my build manager (Arusha). However, generating a custom patch file for
Modules/Setup is significantly harder to do.
The Arusha Project <http://ark.sourceforge.net/>
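For comparison, pinning the same SSL build by hand means maintaining a patch
to Modules/Setup containing something like the following (the paths are
hypothetical, and the exact line varies between Python versions):

```
# A hand-edited Modules/Setup entry pinning a specific SSL build
SSL=/opt/openssl-0.9.6
_ssl _ssl.c -DUSE_SSL -I$(SSL)/include \
    -L$(SSL)/lib -lssl -lcrypto
```

It's exactly this sort of fragment that goes stale when a new Python release
reshuffles the file.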
>> [To be honest, I think the automagic setup.py is a mistake and the
>> functionality from that should be extracted out into configure with
>> configure then generating a correct Modules/Setup file.]
> I think similarly: setup.py supports most users to their
> satisfaction. It cannot support all use cases; people who have needs
> beyond what setup.py can do need to edit Modules/Setup.
> In changing these things, one has to be careful not to displease the
> many in order to satisfy the few.
The thing is, if the many simply type './configure && make install', then
pleasing them is fairly easy. autoconf already supplies all the necessary
machinery to search for installed libraries, check for the existence of
particular system calls or header files, etc. It makes more sense to utilise
that machinery than to poorly replicate it in setup.py.
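To illustrate, the standard macros for this kind of probing already exist; a
configure.in fragment covering a couple of typical module dependencies might
look like this (a generic autoconf sketch, not Python's actual configure.in):

```
dnl Generic autoconf sketch -- not Python's actual configure.in.
dnl Probe for a header, a library, and a couple of system calls.
AC_CHECK_HEADERS([zlib.h])
AC_CHECK_LIB([z], [inflateEnd])
AC_CHECK_FUNCS([poll getaddrinfo])
```

Each check ends up as a cached result and a preprocessor define, which is
precisely the information setup.py currently rediscovers on its own.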
It's not just that I want my life to be made easier at the expense of
others, it's that I genuinely believe that the current process is
over-complicated (requires editing a config file), inconsistent (doesn't
pick up flags or searchpaths from the core), and contains unnecessary
duplication (searching for libraries and per-architecture behaviour).
I'd be happy to help with using autoconf to manage the entire build process
if there was a good chance of it being accepted into the core.