Sample setup.py for Numeric Python
Hi again -- [cc'd to Paul Dubois: you said you weren't following the distutils sig anymore, but this directly concerns NumPy and I'd like to get your input!]

Here's that sample setup.py for NumPy. See below for discussion (and questions!).

------------------------------------------------------------------------
#!/usr/bin/env python

# Setup script example for building the Numeric extension to Python.
# This does successfully compile all the .dlls.  Nothing happens
# with the .py files currently.

# Move this file to the Numerical directory of the LLNL numpy
# distribution and run as:
#     python numpysetup.py --verbose build_ext
#
# created 1999/08 Perry Stoll

__rcsid__ = "$Id: numpysetup.py,v 1.1 1999/09/12 20:42:48 gward Exp $"

from distutils.core import setup

setup(name = "numerical",
      version = "0.01",
      description = "Numerical Extension to Python",
      url = "http://www.python.org/sigs/matrix-sig/",

      ext_modules = [
          ('_numpy', {'sources': ['Src/_numpymodule.c',
                                  'Src/arrayobject.c',
                                  'Src/ufuncobject.c'],
                      'include_dirs': ['./Include'],
                      'def_file': 'Src/numpy.def'}),
          ('multiarray', {'sources': ['Src/multiarraymodule.c'],
                          'include_dirs': ['./Include'],
                          'def_file': 'Src/multiarray.def'}),
          ('umath', {'sources': ['Src/umathmodule.c'],
                     'include_dirs': ['./Include'],
                     'def_file': 'Src/umath.def'}),
          ('fftpack', {'sources': ['Src/fftpackmodule.c', 'Src/fftpack.c'],
                       'include_dirs': ['./Include'],
                       'def_file': 'Src/fftpack.def'}),
          ('lapack_lite', {'sources': ['Src/lapack_litemodule.c',
                                       'Src/dlapack_lite.c',
                                       'Src/zlapack_lite.c',
                                       'Src/blas_lite.c',
                                       'Src/f2c_lite.c'],
                           'include_dirs': ['./Include'],
                           'def_file': 'Src/lapack_lite.def'}),
          ('ranlib', {'sources': ['Src/ranlibmodule.c',
                                  'Src/ranlib.c',
                                  'Src/com.c',
                                  'Src/linpack.c'],
                      'include_dirs': ['./Include'],
                      'def_file': 'Src/ranlib.def'}),
      ])
------------------------------------------------------------------------

First, what d'you think? Too clunky and verbose? Too much information for each extension? I kind of think so, but I'm not sure how to reduce it elegantly. Right now, the internal data structures needed to compile a module are pretty obviously exposed: is this a good thing? Or should there be some more compact form for setup.py that will be expanded later into the full glory we see above?

I've already made one small step towards reducing the amount of cruft by factoring 'include_dirs' out and supplying it directly as a parameter to 'setup()'. (But that needs code not in the CVS archive yet, so I've left the sample setup.py the same for now.)
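A hedged sketch of what that factored-out form might look like -- the exact keyword handling in the not-yet-committed code may differ, and the compact entries below are illustrative only, not the actual Distutils interface:

    from distutils.core import setup

    # Illustrative compact form: 'include_dirs' is stated once for the whole
    # distribution instead of being repeated in every extension's dictionary.
    setup(name = "numerical",
          version = "0.01",
          description = "Numerical Extension to Python",
          url = "http://www.python.org/sigs/matrix-sig/",
          include_dirs = ['./Include'],
          ext_modules = [
              ('_numpy', {'sources': ['Src/_numpymodule.c',
                                      'Src/arrayobject.c',
                                      'Src/ufuncobject.c'],
                          'def_file': 'Src/numpy.def'}),
              # ... the other five extensions trimmed the same way ...
          ])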
The next thing I'd like to do is get that damn "def_file" out of there. To support it in MSVCCompiler, there's already an ugly hack that unnecessarily affects both the UnixCCompiler and CCompiler classes, and I want to get rid of that. (I refer to passing the 'build_info' dictionary into the compiler classes, if you're familiar with the code -- that dictionary is part of the Distutils extension-building system, and should not propagate into the more general compiler classes.) But I don't want to give these weird "def file" things standing on the order of source files, object files, libraries, etc., because they seem to me to be a bizarre artifact of one particular compiler, rather than something present in a wide range of C/C++ compilers.

Based on the NumPy model, it seems like there's a not-too-kludgy way to handle this problem. Namely:

    if building extension "foo":
        if file "foo.def" found in same directory as "foo.c":
            add "/def:foo.def" to MSVC command line

This will of course require some platform-specific code in the build_ext command class, but I figured that was coming eventually, so why put it off? ;-)

To make this hack work with NumPy, one change would be necessary: rename Src/numpy.def to Src/_numpy.def to match Src/_numpy.c, which implements the _numpy module. Would this be too much to ask of NumPy? (Paul?) What about other module distributions that support MSVC++ and thus ship with "def" files? Could they be made to accommodate this scheme?

Thanks for your feedback --

        Greg
--
Greg Ward - software developer                    gward@cnri.reston.va.us
Corporation for National Research Initiatives    1895 Preston White Drive
voice: +1-703-620-8990                 Reston, Virginia, USA  20191-5434
fax: +1-703-620-0913
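A hedged Python sketch of the auto-detection idea above; the helper name and the way it would be wired into the build_ext command are assumptions for illustration, not actual Distutils code:

    import os

    def msvc_def_file_args(ext_name, sources):
        # Hedged sketch: if "<ext_name>.def" sits next to the extension's
        # first source file, return the extra MSVC linker switch for it.
        # On other platforms (or with no .def file) return nothing.
        if os.name != 'nt' or not sources:
            return []
        src_dir = os.path.dirname(sources[0])
        def_file = os.path.join(src_dir, ext_name + ".def")
        if os.path.isfile(def_file):
            return ["/def:" + def_file]
        return []

    # Example: with the proposed rename in place, building '_numpy' on
    # Windows would pick up Src/_numpy.def automatically and yield
    # something like ['/def:Src\\_numpy.def'].
    extra_args = msvc_def_file_args('_numpy', ['Src/_numpymodule.c'])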
Greg Ward wrote:
[sample setup.py]

    ext_modules = [
        ('_numpy', {'sources': ['Src/_numpymodule.c',
                                'Src/arrayobject.c',
                                'Src/ufuncobject.c'],
                    'include_dirs': ['./Include'],
                    'def_file': 'Src/numpy.def'}),
[/]
First, what d'you think? Too clunky and verbose? Too much information for each extension? I kind of think so, but I'm not sure how to reduce it elegantly. Right now, the internal data structures needed to compile a module are pretty obviously exposed: is this a good thing? Or should there be some more compact form for setup.py that will be expanded later into the full glory we see above?
I'd suggest using a class based approach for the ext_modules data, e.g.:

    class ExtModule:
        def_file_template = 'Src/%s.def'

        def get_includedirs(self):
            return ['./Include']

        def get_def_file(self):
            return self.def_file_template % self.name

        # etc.

    class _numpy(ExtModule):
        name = 'numpy'
        sources = ('_numpymodule.c', 'arrayobject.c', ...)

setup() would then be passed a list of instances (or classes which it then instantiates):

    setup(...,
          ext_modules = [_numpy(), ...],
          ...)

Advantages are greater encapsulation and more flexibility, since naming schemes used in the extension module can be incorporated into the classes.
I've already made one small step towards reducing the amount of cruft by factoring 'include_dirs' out and supplying it directly as a parameter to 'setup()'. (But that needs code not in the CVS archive yet, so I've left the sample setup.py the same for now.)
The next thing I'd like to do is get that damn "def_file" out of there.
AFAIK, you don't need the def-files anymore. All you have to do is use a tag on every function you wish to export in the source code. Python already works this way and so do all my extensions (the needed macros are in the file mxh.h).

--
Marc-Andre Lemburg
______________________________________________________________________
Y2000: 109 days left
Business:      http://www.lemburg.com/
Python Pages:  http://www.lemburg.com/python/
On Mon, 13 Sep 1999, M.-A. Lemburg wrote:
AFAIK, you don't need the def-files anymore. All you have to do is use a tag on every function you wish to export in the source code. Python already works this way and so do all my extensions (the needed macros are in the file mxh.h).
But that requires modifying the source, which is IMHO unacceptable. There is an alternative which is to specify the exports w/ a command line switch.

--david
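A hedged Python sketch of that command-line alternative, using the MSVC linker's /export:symbol switch; the helper below is purely illustrative (only the module's init function actually needs exporting in the Python 1.x C API):

    def msvc_export_args(module_name, extra_symbols=()):
        # Hedged sketch: build the linker switches that export the module's
        # init function (init<module>) plus any extra symbols, so neither
        # the C source nor a .def file has to be touched.
        symbols = ["init" + module_name] + list(extra_symbols)
        return ["/export:" + sym for sym in symbols]

    # Example: for the _numpy extension this yields ['/export:init_numpy'].
    # With many extra symbols, though, these switches could run into the
    # command-line length limit mentioned later in the thread.
    link_args = msvc_export_args("_numpy")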
David Ascher wrote:
On Mon, 13 Sep 1999, M.-A. Lemburg wrote:
AFAIK, you don't need the def-files anymore. All you have to do is use a tag on every function you wish to export in the source code. Python already works this way and so do all my extensions (the needed macros are in the file mxh.h).
But that requires modifying the source, which is IMHO unacceptable.
Why is that unacceptable? After all, package authors do have access to the code and know their way around ;-) Better drop the old def-file stuff and move to more modern ways of integrating the information into the code...
There is an alternative which is to specify the exports w/ a command line switch.
...and perhaps add this as a backup solution. Note that the /EXPORT:symbol you need for every symbol may well fill up the available command line argument memory (not sure how much there is on WinXX).

--
Marc-Andre Lemburg
______________________________________________________________________
Y2000: 109 days left
Business:      http://www.lemburg.com/
Python Pages:  http://www.lemburg.com/python/
Why is that unacceptable? After all, package authors do have access to the code and know their way around ;-) Better drop the old def-file stuff and move to more modern ways of integrating the information into the code...
Well, what strikes me as unacceptable is:

1) it makes creating setup.py files for existing code impossible w/o modifying said code, which will make migration more difficult;

2) it introduces platform dependencies related to compilers into the code, instead of into compiler-specific metafiles, which seems wrong.
...and perhaps add this as a backup solution. Note that the /EXPORT:symbol you need for every symbol may well fill up the available command line argument memory (not sure how much there is on WinXX).
I've never run into it, if it exists.

--david
Hi,

Thanks for the snapshot.

RE the proposed script file format -- yes, I think it is too verbose. My suggestion is to allow importing and exporting various types of makefiles (i.e. Setup, dsw/dsp, Unix makefiles). This way, people can choose the way to build their extensions. Give me a couple of weeks and I'll port the code for this to the current setup format (assuming it doesn't change).

The other suggestion is to add *acquisition* to the system. This is a concept invented / named by the DigiCool Zope team. I have found this is a fantastic way to handle platform dependencies. I have used this in my Zope build system to abstract out the def problem [Greg, do you still want the Zope build system? I don't have full access to the net so I cannot use CVS, but I can email it to you]. Basically I use a makefile-style format where platform deps are handled by loadable makefiles which override or modify rules. For example:

    MyExtension:  SomeDirectory$/Setup  $(PLATFORM)$/make.dsw \
                  MyExtensionMetaRule  file.c  file1.c  MyExtension.pyd

Here, MyExtension imports from a Setup file and a MSVC workspace file, then builds the resulting MetaRule (which is modified by the previous 2 imports), the two C sources and finally generates the pyd. As each C file is compiled the resulting object files are added to the global variable OBJS. Then, if the generation of the pyd uses a compiler needing def files (or any others...) the def file is generated as a temp file, used and then deleted. The OBJS variable is then reset. This is what I mean by acquisition.

A few bits 'n pieces to add... We need to accommodate people using diverse platforms. This implies using verbose os.path.join for all paths OR supplying a platform-to-generic converter. This converter converts platform file specs to $/ format. The $/ is replaced with os.path.sep upon load, hence autoconverting to the right format. Other metachars include the $(..) format for variables (see the sketch after this message).

The above system has been tested on a subset of Mark Hammond's win32 extensions, the Python Imaging extensions (including Numeric Python, I think) and other misc stuff. The other big thing is that I have converted the entire Python build system into this format, so once I write an exporter you can build Python cross-platform using distutils. I have removed the need for SED 'n friends and merged freeze and Gordon's freezer so that if you add pyc/pyo's to the source line it will either freeze it into the exe or the currently active MPZ file.

The only thing missing now is a cross-platform point 'n click extension installer! I think wxPython should do the job ;)

Cheers,
Anthony Pfrunder
Computer Systems Engineering
University of Queensland
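A hedged Python sketch of the path-metachar conversion Anthony describes; the function name and the variable table are illustrative assumptions, not code from his build system:

    import os
    import re

    def expand_path_spec(spec, variables=None):
        # Hedged sketch of the '$/' convention described above: substitute
        # $(VAR) metavariables from a caller-supplied table, then replace the
        # generic '$/' separator with os.path.sep for the current platform.
        variables = variables or {}

        def lookup(match):
            # Unknown variables are left untouched in the spec.
            return variables.get(match.group(1), match.group(0))

        spec = re.sub(r"\$\(([A-Za-z_][A-Za-z_0-9]*)\)", lookup, spec)
        return spec.replace("$/", os.path.sep)

    # Example: expand_path_spec("$(PLATFORM)$/make.dsw", {"PLATFORM": "win32"})
    # gives 'win32\\make.dsw' on Windows and 'win32/make.dsw' elsewhere.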
participants (4)

- Anthony Pfrunder
- David Ascher
- Greg Ward
- M.-A. Lemburg