Frank McIngvale <frankm(a)hiwaay.net> writes:
> Hi, I was wondering if there is a portable
> way to determine if a C compiler is available for
> compiling extensions. I'm working on a package
> that has some extensions that aren't strictly
> required, and I'd like to turn on/off building
> them based on whether a C compiler is available.
> (I know how to do the turn on/off thing, I just
> need to know how to check for a C compiler.)
> Any hints?
Try it and catch an exception? I don't think there's a mechanism for
asking (could be wrong though).
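For the archives, the try-it-and-catch approach can be sketched with distutils' own compiler abstraction. This is just a probe, not a supported API for answering the question, and the exact exception raised can vary by platform, so it catches broadly:

```python
# A minimal sketch of "try it and catch the failure": compile a
# trivial C file with distutils' compiler abstraction and report
# whether that worked.  Not an official API for this check.
import os
import tempfile

def have_c_compiler():
    try:
        from distutils.ccompiler import new_compiler
    except ImportError:
        return False              # no distutils available at all
    tmpdir = tempfile.mkdtemp()
    src = os.path.join(tmpdir, "probe.c")
    f = open(src, "w")
    f.write("int main(void) { return 0; }\n")
    f.close()
    try:
        cc = new_compiler()
        cc.compile([src], output_dir=tmpdir)
        return True
    except Exception:             # CompileError, missing cc, etc.
        return False
```

The package's setup.py could call this and drop the optional extensions from `ext_modules` when it returns false.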
Here's the short form of my question:
How do people generally keep their Python modules and any configuration data
for those Python modules separate?
I'm looking for both conventions (e.g., config goes in /etc/foo) and
mechanics (e.g., use distutils with options x, y, and z).
The longer form of the question is a bit more convoluted, so for those with
patience, read on. ;-)
I'm transitioning from Windows to Linux. So part of my question is about
the conventions for separating software from data on Linux. I assume it's
self-explanatory why you'd want to separate software from data. In case
it's not, here's why I think it's important to separate: You don't really
have to back up software (although you can if you want), but you do have to
back up data (it's not like you can re-download it). So, if push comes to
shove, and you have limited resources/time, it's nice to be able to focus on
the important stuff and only back that up.
There's always a gray area between software and data that I label
configuration data. In the Linux world, I guess that'd constitute anything
you do to install any piece of software beyond the equivalent of configure;
make; make install. If you specify any configuration options, create daemon
accounts, etc.--ideally you'd do that with a script and you'd store that
script somewhere along with your data, so that you not only knew what you
did, but you could easily re-do it.
I've looked at distutils, and it's not immediately obvious that it was
designed to solve the distribution problem in the particular way I want to
try to solve it, although I suppose I may be able to extend distutils with
special commands to do something like this?
I want a way to say something like this:
Install my module here:
By default, place configuration data here:
But if the installer specifies different configuration options when running
setup.py, then use those values when generating mypackage.conf from
Of course, the person installing the package should be able to override
where configuration data is stored:
$ python setup.py --configdir=/var/foo
If such an option existed, its effect might be to generate a configuration
file in the root of the package's module distribution folder:
containing something readable by ConfigParser:
So that at runtime, the module first looks at the configuration information
in its package distribution to determine where it needs to look for the
distribution-specific configuration information? Whew.
Has anybody tackled this problem or do most folks just punt on this issue
and assume they can plop configuration information alongside their
distributed modules without providing any easy way for the person who
installs it to override that?
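One way the runtime half of this could look, assuming an invented bootstrap file named mypackage.conf sitting next to the installed modules (the file name, section, and option names here are illustrations, not an established convention):

```python
# Sketch of the runtime lookup described above.  The file name
# "mypackage.conf", the section "paths", and the option "configdir"
# are invented for illustration.
import os
try:
    from configparser import ConfigParser    # Python 3
except ImportError:
    from ConfigParser import ConfigParser    # Python 2

def config_dir(default="/etc/mypackage"):
    # Locate the bootstrap file next to the installed module itself.
    try:
        here = os.path.dirname(os.path.abspath(__file__))
    except NameError:
        here = os.getcwd()
    parser = ConfigParser()
    found = parser.read(os.path.join(here, "mypackage.conf"))
    if found and parser.has_option("paths", "configdir"):
        return parser.get("paths", "configdir")
    return default                # no bootstrap file: use the default
```

A custom setup.py command could then write that bootstrap file using whatever --configdir the installer supplied.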
I have a problem that I'm not able to solve on my own. I'm using
python2.2 that comes with Mandrake Linux distribution 8.2.
I have a tree of python sources that I'd like to package in more or
less the same tree, which would be installed in site-packages under
subdirectories pyortal and pyortal-modules.
If I understand the Python documentation correctly, if I want those modules
to be easily accessible for everyone, I need to have a .pth file in
site-packages that points to all subdirectories (and indeed, if I create
it manually, it works). The problem I have is that if I run bdist_rpm,
I can see a message telling me that the .pth file is being built, but
the file is missing from the created rpm.
If I build with bdist only, it creates and includes pyortal.pth (but it
only lists the pyortal directory in it, without any subdirectory). So, what
am I doing wrong?
While I'm at it... is there an easy way to include plain text files in a
module distribution? Again, if I'm not mistaken, the current way to
achieve this is to list every one of them in MANIFEST.in.
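On that last point, MANIFEST.in takes glob and recursive patterns, so the files rarely need to be listed one by one. A sketch (the file names are invented):

```
include README.txt CHANGES.txt
recursive-include docs *.txt
recursive-include pyortal *.conf
```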
Thank you for your help. Kind regards,
From: Marko Samastur [mailto:firstname.lastname@example.org]
> > Why .pth files ? Why not simply *copy* the code to site-packages ?
> Mainly, but not only, because of aesthetic reasons. My
> project already has 26 .py files and this number will only rise.
> Dumping them all in one directory (and other authors doing
> the same), would just make that directory look like a trashcan
> and could also result in filename conflicts (which can be
> otherwise avoided if certain modules in subdirectories are
> never directly imported).
So why not make your stuff a proper package? That's the correct way of
handling this. If you don't want your users to have to cope with multi-level
names, you can always do some magic in __init__.py. But frankly, multi-level
names shouldn't be seen as a problem to be overcome - they are a
*convenience* to avoid name clashes, and should be used for that purpose.
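The "__init__.py magic" mentioned above can be as simple as re-exporting the interesting names at the package's top level. A self-contained demo (all names — pyortal, core, connect — are invented for illustration) that builds a tiny package on disk and then imports through the top level:

```python
# Demo of re-exporting from __init__.py: build a two-file package,
# then access a submodule's function through the package itself.
import os
import sys
import tempfile

root = tempfile.mkdtemp()
pkg = os.path.join(root, "pyortal")
os.mkdir(pkg)

f = open(os.path.join(pkg, "core.py"), "w")
f.write("def connect():\n    return 'connected'\n")
f.close()

# The re-export: users write "from pyortal import connect" and never
# see the multi-level name pyortal.core.connect.
f = open(os.path.join(pkg, "__init__.py"), "w")
f.write("from pyortal.core import connect\n")
f.close()

sys.path.insert(0, root)
import pyortal
print(pyortal.connect())        # -> connected
```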
Currently, scripts go somewhere in the "Scripts" directory (e.g. on
Windows, something like C:\Python22\Scripts, on Unix somewhere in the
If people specify scripts that live in subdirectories in their source
tree, using current distutils, those then get placed in subdirectories
of the [bin|Scripts] directory.
To my mind, this is somewhat wrong, since I expect scripts (those things
which have a #! line, an executable bit set, etc.) to all be placed into
a single directory. It means that I don't need to muck w/ my PATH if I
install a package that has a script.
Is my mind right or is my mind wrong? I'll help fix distutils if I'm
right, and I'll fix ActivePython's current PPM behavior if I'm wrong. =)
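If distutils were changed as suggested, the fix would amount to installing every script under its basename in the single scripts directory, regardless of where it sat in the source tree. A sketch of that path flattening (an illustration, not the actual distutils code):

```python
# Flatten script paths: whatever subdirectory a script lived in
# within the source tree, install it by basename into one bin dir.
import os

def flatten_scripts(scripts, bindir):
    return [os.path.join(bindir, os.path.basename(s)) for s in scripts]

# e.g. ["tools/db/migrate.py", "serve.py"] with bindir
# "/usr/local/bin" -> ["/usr/local/bin/migrate.py",
#                      "/usr/local/bin/serve.py"]
```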
On Wed, 27 Mar 2002, A.M. Kuchling wrote:
> A package database is a necessary prerequisite for managing the Python
> packages installed on a system. PEP 262 lists the requirements for
> such a database and specifies a storage format for it.
> I'd like to get this into Python 2.3, hopefully with a
> still-to-be-specified package management tool. Assuming no one points
> out some requirement or use case missing from this draft of the PEP,
> my next step will be to write a proposed interface, post that draft,
> and then implement the PEP and integrate it with the Distutils.
> Comments can be posted to comp.lang.python or to the Distutils SIG.
Here is an item for the wishlist:
Installing a new package and recording its name and file-list in a
database is one thing. RPM does that. Easily maintaining a package-based
system is another. APT+DPKG does that thanks to all the features it has
over RPM (downloading the list of available packages from a mirror of the
unified repository, checking integrity, calculating dependencies, etc.),
but also thanks to the work of hundreds of debian developers that take
care of all the dependencies between packages and upload their packages to
a central coherent repository.
It would be cool to have the equivalent for Python. We more or less already
have that with Debian, since Python extension packages are packaged as
Debian packages and can be upgraded on a Debian host with a single APT command.
To my knowledge, APT has nothing DPKG-specific. If what we are after is
letting people manage and upgrade their installed Python packages, what about
replacing DPKG with a Python counterpart and letting APT handle the rest of
the trouble? Is it really necessary to fully implement yet another packaging
system?
Or is the proposal just about a DPKG replacement for Python packages, and
did the mention of APT at the beginning of the PEP lead me to a
misunderstanding?
My last question: why should we have a package management system "internal"
to a Python installation? Isn't it the role of the distro to handle
packages, and shouldn't we focus on helping the existing distro tools
deal with Python extensions instead?
My 2 Eurocents.
http://www.logilab.com - "Mais où est donc Ornicar ?" - LOGILAB, Paris (France)
[CC to distutils-sig. The context is: distutils doesn't work
with source distributions on windows]
Here is a patch which allows distutils on windows to work
with the source distribution. I've tested it with Numeric,
but nothing else so far:
RCS file: /cvsroot/python/python/dist/src/Lib/distutils/command/build_ext.py,v
retrieving revision 1.79
diff -c -r1.79 build_ext.py
*** build_ext.py 31 Jan 2002 18:56:00 -0000 1.79
--- build_ext.py 8 Apr 2002 08:09:33 -0000
*** 167,172 ****
--- 167,176 ----
self.build_temp = os.path.join(self.build_temp, "Release")
+ # Append the source distribution include and library directories
+ self.include_dirs.append(os.path.join(sys.exec_prefix, 'PC'))
+ self.library_dirs.append(os.path.join(sys.exec_prefix, 'PCBuild'))
# OS/2 (EMX) doesn't support Debug vs Release builds, but has the
# import libraries in its "Config" subdirectory
if os.name == 'os2':
And here is one which is also very useful for debug builds: it feeds
absolute path names to MSVC, which has the advantage that the MSVC debugger
finds the debugged files automatically, without asking the user to find them:
RCS file: /cvsroot/python/python/dist/src/Lib/distutils/msvccompiler.py,v
retrieving revision 1.45
diff -c -r1.45 msvccompiler.py
*** msvccompiler.py 8 Feb 2002 14:41:31 -0000 1.45
--- msvccompiler.py 8 Apr 2002 08:12:47 -0000
*** 309,314 ****
--- 309,316 ----
self.mkpath (os.path.dirname (obj))
+ if debug:
+ src = os.path.abspath(src)
if ext in self._c_extensions:
input_opt = "/Tc" + src
elif ext in self._cpp_extensions:
IMO, both changes look innocent enough so that I could check
them into CVS.