Bundling numpy/scipy with applications
I'm creating a Python application that is internal to my organization, but will be installed on both Linux and Mac OS X machines. The application depends heavily on a number of "non-pure" modules (those that have C/C++/FORTRAN components), like numpy, scipy, gdal, etc.

What is the most "pythonic" way to bundle these types of modules inside my application? I've investigated distutils and setuptools, and I don't see an easy way with those tools to include build instructions for packages built with autoconf tools. Is my only recourse to write a custom install script that calls "configure; make; make install" from a shell?

--Mike

------------------------------------------------------
Michael Hearne
mhearne@usgs.gov
(303) 273-8620
USGS National Earthquake Information Center
1711 Illinois St. Golden CO 80401
Senior Software Engineer, Synergetics, Inc.
------------------------------------------------------
Michael Hearne wrote:
I'm creating a Python application that is internal to my organization, but will be installed on both Linux and Mac OS X machines. The application depends heavily on a number of "non-pure" modules (those that have C/C++/FORTRAN components), like numpy, scipy, gdal, etc.
What is the most "pythonic" way to bundle these types of modules inside my application?
I've investigated distutils and setuptools, and I don't see an easy way with those tools to include build instructions for packages built with autoconf tools.
Is my only recourse to write a custom install script that calls the "configure;make;make install" from a shell?
You have two problems: building and packaging. For packaging, autoconf will not help you much beyond providing coherence between packages, I think; distutils/setuptools have packaging tools, but I don't know how good they are.

On Linux, the packaging system depends on the distribution: since it is only internal, the problem is severely simplified, since you are likely to target only one format (rpm, deb, etc.). On Mac OS X, you also have tools to package .pkg and .mpkg (which are a set of .pkg), available freely on the Apple developer website; I have not used them much, but they look simple and easy to use for simple command-line packages.

You could hack something in distutils to call configure and so on, but I don't think it will be pleasant. Building packages using distutils/setuptools is easy, and does not need much configuration. So I think it is more natural to use a top-level tool calling distutils/setuptools than to use distutils as the top-level tool. As a pythonic way, you may consider build tools such as scons or waf, both written in Python. But frankly, I would not bother too much about the pythonic way: since some tools will be shell-based anyway (autotools, packaging tools), and you don't have to care about Windows, using make and shell may actually be easier.

For the build part, you may take a look at my garnumpy package: it is essentially a set of rules for GNU Makefiles, and it can build numpy + scipy with a configurable set of dependencies (ATLAS, NETLIB BLAS/LAPACK, fftw, etc.). It can build both distutils- and autotools-based packages.

http://www.ar.media.kyoto-u.ac.jp/members/david/archives/garnumpy/garnumpy-0...

I used it successfully on Linux and Cygwin, so I would say it should work on Mac OS X without too much trouble. The only thing which is likely to be a pain is fat (Universal) binaries. I use it to build a totally self-contained numpy/scipy installation, which is a first step toward packaging.
If you think it can be useful to you, don't hesitate to ask questions; there is also a bzr archive if you want access to the dev history of the tool.

cheers,

David
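The "top level tool calling distutils/setuptools" idea David describes (and the "configure; make; make install" script Mike asks about) could be sketched as a small Python driver. This is a minimal illustration, not garnumpy itself; the package directories and install prefix are hypothetical:

```python
import os
import subprocess

# Hypothetical prefix where all bundled dependencies get installed.
PREFIX = os.path.expanduser("~/myapp-deps")

def build_autotools(srcdir):
    """Build an autoconf-based package: configure; make; make install."""
    subprocess.check_call(["./configure", "--prefix=" + PREFIX], cwd=srcdir)
    subprocess.check_call(["make"], cwd=srcdir)
    subprocess.check_call(["make", "install"], cwd=srcdir)

def build_distutils(srcdir):
    """Build a distutils/setuptools-based package into the same prefix."""
    subprocess.check_call(
        ["python", "setup.py", "install", "--prefix=" + PREFIX], cwd=srcdir)
```

A top-level script would then call these in dependency order (e.g. lapack before numpy before scipy), which is essentially what garnumpy's Makefile rules encode.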
David Cournapeau wrote:
Hi David,

I am having a hell of a time with lapack/atlas. I already posted the issue I currently have with scipy after following the building doc on the scipy web site (see http://projects.scipy.org/pipermail/scipy-user/2007-November/014506.html). Besides, the octave build also seems to have serious issues with my lapack/atlas build. So I immediately downloaded your package, but I seem to have a problem with the numpy build:

make[3]: Leaving directory `/data1/sources/python/garnumpy-0.4/platform/numpy'
# Change default path when looking for libs to fake dir,
# so we can set everything by env variables
cd work/main.d/numpy-1.0.3.1 && PYTHONPATH=/home/cohen/garnumpyinstall/lib/python2.5/site-packages:/home/cohen/garnumpyinstall/lib/python2.5/site-packages/gtk-2.0 /usr/bin/python \
setup.py config_fc --fcompiler=gnu config
Traceback (most recent call last):
  File "setup.py", line 90, in <module>
    setup_package()
  File "setup.py", line 60, in setup_package
    from numpy.distutils.core import setup
  File "/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/__init__.py", line 39, in <module>
    import core
  File "/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/core/__init__.py", line 8, in <module>
    import numerictypes as nt
  File "/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/core/numerictypes.py", line 83, in <module>
    from numpy.core.multiarray import typeinfo, ndarray, array, empty, dtype
ImportError: No module named multiarray
make[2]: *** [configure-custom] Error 1
make[2]: Leaving directory `/data1/sources/python/garnumpy-0.4/platform/numpy'
make[1]: *** [../../platform/numpy/cookies/main.d/install] Error 2
make[1]: Leaving directory `/data1/sources/python/garnumpy-0.4/platform/scipy'
make: *** [imgdep-main] Error 2

Any idea? Your package is anyway a great idea.
I would actually love to see it work, of course, but it would also be nice to allow for the possibility to point to already existing source directories for numpy and scipy (for instance from a previous svn checkout). I did not read the doc of your package, so it might actually be there.... Anyway, thanks in advance for your help.

Johann
Johann Cohen-Tanugi wrote:
Hi David, I am having a hell of a time with lapack/atlas.

That's because they are far from trivial to build correctly. As always with build problems, it is never complicated, but problems are often difficult to track down. To make things worse, some Linux distributions (including Fedora and SuSE) have shipped bogus packages for blas and lapack. That's why I did this garnumpy thing in the first place: to quickly build blas/lapack correctly, so that I can test things more easily.

I already posted the issue I currently have with scipy after following the building doc on the scipy web site (see http://projects.scipy.org/pipermail/scipy-user/2007-November/014506.html). Besides, the octave build also seems to have serious issues with my lapack/atlas build.

Which distribution are you using?
This is really strange; I have never seen this error. This is what I would do:

- remove the garnumpyinstall directory
- go into bootstrap/lapack, and do make install
- go into platform/numpy, and do make install

At this point, you should have an installed numpy in garnumpyinstall: test it (import numpy; numpy.test(level = 9999, verbosity = 9999)). If this works, then go into platform/scipy. The errors should be clearer.
Any idea? Your package is anyway a great idea. I would actually love to see it work, of course, but also allow for the possibility to point to already existing source directories for numpy and scipy (for instance from a previous svn checkout). I did not read the doc of your package so it might actually be there....

You cannot use svn, because this prevents patching from working, and there is also a checksum when downloading: this is a major limitation (but not so major if you realize that the only things you really want to use from svn are numpy and scipy; you can reuse blas/lapack/fftw/atlas, and that's my main use when testing on different platforms).
You can reuse downloaded archives, though:

make garchive

It will download all the tarballs (you cannot choose a subset, unfortunately) and put them into a directory which will be reused automatically afterwards. This means that even if you do make clean anywhere in the source tree, you won't need to download the same things over and over.

cheers,

David
David Cournapeau wrote:

This is really strange; I have never seen this error. This is what I would do: remove the garnumpyinstall directory; go into bootstrap/lapack, and do make install; go into platform/numpy, and do make install.
Doing make install in platform/numpy fails in exactly the same way:

[cohen@localhost numpy]$ make install
[===== NOW BUILDING: numpy-1.0.3.1 =====]
[fetch] complete for numpy.
[checksum] complete for numpy.
[extract] complete for numpy.
[patch] complete for numpy.
# Change default path when looking for libs to fake dir,
# so we can set everything by env variables
cd work/main.d/numpy-1.0.3.1 && PYTHONPATH=/home/cohen/garnumpyinstall/lib/python2.5/site-packages:/home/cohen/garnumpyinstall/lib/python2.5/site-packages/gtk-2.0 /usr/bin/python \
setup.py config_fc --fcompiler=gnu config
Traceback (most recent call last):
  File "setup.py", line 90, in <module>
    setup_package()
  File "setup.py", line 60, in setup_package
    from numpy.distutils.core import setup
  File "/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/__init__.py", line 39, in <module>
    import core
  File "/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/core/__init__.py", line 8, in <module>
    import numerictypes as nt
  File "/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/core/numerictypes.py", line 83, in <module>
    from numpy.core.multiarray import typeinfo, ndarray, array, empty, dtype
ImportError: No module named multiarray
Johann Cohen-Tanugi wrote:
do make install in platform/numpy fails exactly in the same way :
I forgot to tell you to do make clean before; otherwise, it indeed has no way to fail in a different way :)
I really don't understand this error: you are configuring numpy for compilation, so it should not try to import numpy.core (which has no way to succeed, since it is not built yet). IOW, it is a bootstrap error (numpy's build system tries to import numpy).
If after make clean it still does not work, you may try:

make clean
make patch

Then go into work/main.d/numpy-1.0.3.1/, just execute python setup.py config there, and give me the errors (you can also send them to me in private).

David
On Nov 13, 2007 12:35 AM, David Cournapeau <david@ar.media.kyoto-u.ac.jp> wrote:
Johann Cohen-Tanugi wrote:
do make install in platform/numpy fails exactly in the same way :
I forgot to tell you to do make clean before, otherwise, it indeed has no way to fail in a different way :)
I really don't understand this error: you are configuring numpy for compilation, so it should not try to import numpy.core (which has no way to succeed since it is not built yet). IOW, it is a bootstrap error (numpy build system tries to import numpy).
We finally tracked this down to a system-wide numpy confusing the brittle hack that __init__ was using to detect being run from the source directory. This commit:

http://projects.scipy.org/scipy/numpy/changeset/4663

fixes it and allows you to do an in-place build of numpy via

python setup.py build_src --inplace
python setup.py build_ext --inplace

for development. It's a bit of an ugly hack, but at least it works robustly and avoids the problem above (which bit me today). Please let me know if the approach in the patch causes problems for any other types of usage.

Cheers,

f
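The general shape of such a source-tree guard is to check for a marker that only exists in a checkout, e.g. a setup.py sitting next to the package directory. This is a simplified illustration of the idea, not the actual numpy patch:

```python
import os

def running_from_source_tree(package_dir):
    """Heuristic source-tree check: a source checkout has setup.py one
    level above the package directory; an installed copy does not."""
    parent = os.path.dirname(os.path.abspath(package_dir))
    return os.path.isfile(os.path.join(parent, "setup.py"))
```

The package's __init__ can then refuse the import (or take the in-place path) when this returns True, instead of guessing from whether submodules import cleanly, which is exactly what broke when a system-wide numpy shadowed the unbuilt checkout.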
Hi,

This might be a heretical idea (in this context): but how "bad" is it really to just build all components, put them in one place, and distribute them as a zip/tar archive? PYTHONPATH would need to be adjusted as needed, either in a place like .bashrc or in a 3-line script that would then substitute the "python" call. This is of course the least fancy approach, but it might be the most effective, especially if it is only for a relatively small group of people.

Cheers,
Sebastian Haase

On Nov 12, 2007 9:26 PM, Michael Hearne <mhearne@usgs.gov> wrote:
I'm creating a Python application that is internal to my organization, but will be installed on both Linux and Mac OS X machines. The application depends heavily on a number of "non-pure" modules (those that have C/C++/FORTRAN components), like numpy, scipy, gdal, etc.
What is the most "pythonic" way to bundle these types of modules inside my application?
I've investigated distutils and setuptools, and I don't see an easy way with those tools to include build instructions for packages built with autoconf tools.
Is my only recourse to write a custom install script that calls the "configure;make;make install" from a shell?
--Mike
------------------------------------------------------
Michael Hearne
mhearne@usgs.gov
(303) 273-8620
USGS National Earthquake Information Center
1711 Illinois St. Golden CO 80401
Senior Software Engineer, Synergetics, Inc.
------------------------------------------------------
_______________________________________________ SciPy-user mailing list SciPy-user@scipy.org http://projects.scipy.org/mailman/listinfo/scipy-user
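Sebastian's "3-line script substituting the python call" might look like the following sketch. The bundle location is hypothetical, and a real deployment would end the script with a call to main():

```python
#!/usr/bin/env python
# Wrapper around the real interpreter: prepend the bundled packages to
# PYTHONPATH, then replace this process with python running the user's
# arguments.
import os
import sys

BUNDLE = "/opt/myapp/site-packages"  # hypothetical unpacked archive location

def main():
    # Prepend the bundle so it wins over any system-wide installs.
    os.environ["PYTHONPATH"] = (
        BUNDLE + os.pathsep + os.environ.get("PYTHONPATH", ""))
    # Replace this process with the real interpreter.
    os.execvp(sys.executable, [sys.executable] + sys.argv[1:])
```

Installed as, say, /opt/myapp/bin/mypython, this lets users run the application without touching their .bashrc, at the cost of remembering to use the wrapper instead of plain python.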
participants (5)

- David Cournapeau
- Fernando Perez
- Johann Cohen-Tanugi
- Michael Hearne
- Sebastian Haase