This is to advise that I have already implemented Python modules able
to compile:

  * C or C++ dynamically loadable Python modules
  * C or C++ external applications

These are incorporated in interscript in the 'compilers' subdirectory.
The implementation is not important. What is important is the
_interface_ provided. Please examine it; I would appreciate any
comments on the generality of the interface.
In my opinion, these modules should be in the standard distribution,
and be hooked into 'import' so that C/C++ extension developers don't
have to provide _any_ makefiles or other installation crud: just copy
the C file into the right place and it gets compiled automatically,
just as a .py file gets compiled to a .pyc file.

Obviously, this technology cannot handle linking against arbitrary
libraries in a platform-independent way -- but it will handle a large
class of C modules of the kind 'could be done in Python, but C is
faster'.
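As a rough sketch of the compile-on-import idea -- this is not
interscript's actual interface; the function name, the 'cc' invocation
and the use of a modern Python's importlib are all assumptions made
for illustration -- the staleness check mirrors the .py -> .pyc rule:

    import os
    import subprocess
    import sysconfig
    import importlib.util

    def import_c_module(name, c_path, build_dir="."):
        # Rebuild the shared object only when it is missing or older
        # than its source -- the same staleness rule used for .pyc
        # files.
        so_path = os.path.join(build_dir, name + ".so")
        if (not os.path.exists(so_path)
                or os.path.getmtime(so_path) < os.path.getmtime(c_path)):
            subprocess.check_call([
                "cc", "-shared", "-fPIC",
                "-I" + sysconfig.get_paths()["include"],
                c_path, "-o", so_path,
            ])
        # Load the freshly built extension like any other module.
        spec = importlib.util.spec_from_file_location(name, so_path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        return module

A real hook would register itself with the import machinery instead of
being called explicitly, and would take its compiler command and flags
from per-platform configuration rather than hard-coding 'cc'.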
-------------------------------------------------------
John Skaller email: skaller@maxtal.com.au
http://www.maxtal.com.au/~skaller
phone: 61-2-96600850
snail: 10/1 Toxteth Rd, Glebe NSW 2037, Australia
distutils-sig: forum to discuss module distribution utilities for Python
The distutils sig exists to discuss the design and implementation
of a suite of module distribution utilities for Python. These utilities
will take the form of some standard modules, probably grouped in the
'distutils' package, for building, packaging, and installing third-party
modules. This includes both modules written purely in Python and
extension modules written in C/C++.
The main deliverable will be the following core modules (in the
distutils package):
  build    - build a module: this might include copying .py files into
             a staging area, compiling and linking .c files, processing
             documentation into an installable form, etc.
  dist     - create a source distribution
  bdist    - create a 'built distribution' (the equivalent of a 'binary
             distribution', except that binaries won't necessarily be
             present)
  install  - install a built library on the local machine
  gen_make - generate a Makefile to do some of the above tasks (mainly
             'build', for developer convenience and efficiency)
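To make the 'build' step concrete, here is a minimal sketch of its
pure-Python half (the function name and staging path are invented for
illustration, not a proposed interface):

    import os
    import shutil
    import py_compile

    def build_pure(py_files, staging="build/lib"):
        # Copy each .py file into the staging area and byte-compile
        # it there -- the 'copying .py files' duty described above.
        os.makedirs(staging, exist_ok=True)
        for src in py_files:
            dst = os.path.join(staging, os.path.basename(src))
            shutil.copy(src, dst)
            py_compile.compile(dst)

The C half of 'build' would drive the platform compiler, much as in
the compile-on-import sketch earlier in this message.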
I will tentatively put forward March 1999 as a target for a working
release to run on top of Python 1.5.2, which *might* require a
patch-and-rebuild-Python step (or might find ways to work with the
information available under 1.5). For a complete, tested, documented
suite ready to bundle with Python 1.6, let's shoot for June 1999.
Other topics of interest:
* encouraging module developers to write test suites by having
a standard place for them in module distributions
* ditto for documentation -- although this is probably the job of
the doc-sig, it would be nice to tie the two together at some point
* a standard for representing and comparing version numbers (see the
  sketch after this list)
* social engineering in general, i.e. convincing module developers to
  start using the system
* the possible need for tweaks to the configure/build process for
Python itself, and a place to hold configuration
information (possibly a new built-in module, 'sys.config')
* possibly rewriting the configure/build/install process for
Python 1.6 -- especially useful if the distutils are bundled
with 1.6!
* ties to Trove -- any module distribution scheme must include
a way to describe the package metadata, and there will need
to be hooks between any future Trove archive for Python and
this metadata
* recognizing SWIG-assisted extensions in addition to "ordinary"
C extensions
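On the version-number item above, one plausible convention -- a sketch
only; the split-on-dots format is an assumption, not a decided
standard -- is to parse versions into integer tuples, so that ordinary
tuple comparison orders them correctly:

    def parse_version(s):
        # '1.5.2' -> (1, 5, 2).  Tuples compare componentwise, so
        # (1, 10) > (1, 9) holds, whereas naive string comparison
        # would put '1.10' before '1.9'.
        return tuple(int(part) for part in s.split("."))

    assert parse_version("1.10") > parse_version("1.9")
    assert parse_version("1.5.2") < parse_version("1.6")

Suffixes like '1.5.2b2' would need extra rules; pinning down exactly
such details is part of what the sig is for.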
The starting point for discussion on the sig will be the summary of the
IPC7 Developer's Day session which got this whole thing rolling:
http://www.foretec.com/python/workshops/1998-11/dd-ward-sum.html