How to build a Python package that depends on the build output of another project

Hi,

I'm writing a Python binding for project A. My binding depends on the compiled output of project A (a .so file), and project A is not installed on the system, so the .so cannot be found on the system library paths.

What is an elegant way to package my binding so that everything can be installed by running `python setup.py`?

Any suggestions and comments will be appreciated :)

-- Best wishes, Young

Hi,

My current solution is like this: I get the source code of project A and use `cmdclass={"install": my_install}` in the setup() call in my setup.py. `my_install` is a subclass of `setuptools.command.install.install`:

```
from setuptools.command.install import install

class my_install(install):
    def run(self):
        # Do what I need here, such as compiling the code of project A
        # and copying its output (the .so file) into my binding folder.
        install.run(self)
```

Finally, I add these options to the setup() call to include the shared library in the installed package:

```
package_dir={'my_binding_package': 'my_binding_folder'},
package_data={
    'my_binding_package': ['Shared_lib.so'],
},
include_package_data=True,
```

But I think there should be a better way to achieve this. Could anyone give me a more elegant example that achieves the same goal?

Thanks in advance.

-- Best wishes, Young Yang
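
For context, here is a minimal sketch of how those pieces could fit together in one setup.py; the `make` invocation, the `project_a` directory, and the build output path are assumptions for illustration, not details from the original message:

```
import shutil
import subprocess

from setuptools import setup
from setuptools.command.install import install


class my_install(install):
    def run(self):
        # Assumption: project A sits in ./project_a and `make` leaves
        # Shared_lib.so in project_a/build/.
        subprocess.check_call(['make', '-C', 'project_a'])
        shutil.copy2('project_a/build/Shared_lib.so', 'my_binding_folder/')
        install.run(self)


setup(
    name='my_binding_package',
    version='0.1',
    cmdclass={'install': my_install},
    packages=['my_binding_package'],
    package_dir={'my_binding_package': 'my_binding_folder'},
    package_data={'my_binding_package': ['Shared_lib.so']},
    include_package_data=True,
)
```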

Overriding only `install` will make bdist_wheel produce the wrong result. There's also the `develop` command. Some ideas about which commands you might need to override: https://github.com/pytest-dev/pytest-cov/blob/master/setup.py#L30-L63

An alternative approach would be to create a custom Extension class; check https://github.com/cython/cython/tree/master/Cython/Distutils for ideas.

Unfortunately the internals of distutils/setuptools aren't really well documented, so you'll have to rely on examples, reading the distutils code, coffee, or even painkillers :-)

Thanks,
-- Ionel Cristian Mărieș, http://blog.ionelmc.ro
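
As a rough illustration of that point (hooking the native build into the commands that wheels and develop installs actually run, rather than only `install`), something along these lines could work; the `build_project_a()` helper and the make-based build are assumptions, not code from the project linked above:

```
import subprocess

from setuptools import setup
from setuptools.command.build_py import build_py
from setuptools.command.develop import develop


def build_project_a():
    # Hypothetical helper: build project A and leave Shared_lib.so
    # inside the package directory before files get collected.
    subprocess.check_call(['make', '-C', 'project_a'])


class BuildPy(build_py):
    def run(self):
        build_project_a()
        build_py.run(self)


class Develop(develop):
    def run(self):
        build_project_a()
        develop.run(self)


setup(
    name='my_binding_package',
    packages=['my_binding_package'],
    package_data={'my_binding_package': ['Shared_lib.so']},
    cmdclass={'build_py': BuildPy, 'develop': Develop},
)
```

Because bdist_wheel runs the build (and therefore `build_py`), the shared library gets copied into the wheel as package data, which is exactly what an install-only override misses; the `develop` override covers `pip install -e .`.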

I have some less elegant suggestions. In my ed25519ll package I abuse the Extension class to compile some source code that is not a Python extension. This works if the C package you are compiling is simple enough that it can be built with the limited Extension interface: https://bitbucket.org/dholth/ed25519ll/src/37719c56b7b621a98dc694b109ccfca1c...

For example:

    setup(
        ...
        ext_modules=[
            Extension('ed25519ll._ed25519_%s' % plat_name,
                      sources=['ed25519-supercop-ref10/ge_frombytes.c',
                               # many more
                               'ed25519-supercop-ref10/py.c'],
                      include_dirs=['ed25519-supercop-ref10', ],
                      export_symbols=["crypto_sign",
                                      "crypto_sign_open",
                                      "crypto_sign_keypair"])
        ]
    )

I added the file `py.c` with an empty function `void init_ed25519_win32() {}` to make the linker happy, and I list the symbols that need to be exported; otherwise those symbols would not be visible on Windows. Then I open the shared module with cffi or ctypes. Not very pretty, but it works well enough.

Another thing you can do without extending distutils, which may not be immediately obvious, is to run as much code as you want before calling setup(). It is even possible to install other Python modules by calling pip in a subprocess, then importing them, then calling setup(), all in the same file.
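
That last trick is just ordinary module-level code in setup.py; a tiny sketch of the pip-then-import pattern (cffi is used here only because it is the module the package above loads the library with):

```
import subprocess
import sys

# Anything at module level in setup.py runs before setup() is called,
# so a helper package can be installed on the fly and then imported.
subprocess.check_call([sys.executable, '-m', 'pip', 'install', 'cffi'])

import cffi  # noqa: E402  (import after code, deliberately)

from setuptools import setup

setup(
    name='my_binding_package',
    packages=['my_binding_package'],
)
```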

First, what you have is not all that inelegant -- it is the way to do it :-)

But there are a few options when you are wrapping a C/C++ lib for Python. Do you need to access that lib from other extensions, or only from the one extension? If others, then you pretty much need to build a shared lib and make sure that all your extensions link to it. But if you only need to get to it from one extension, then there are three options:

1) Don't compile a lib -- rather, build all of your C/C++ code into the extension itself. You can simply add the files to the "sources" list; for a straightforward lib, this is the easiest way to go.

2) Statically link -- build the lib as a static lib, and then link it into your extension. Then there is no extra .so to keep track of and ship. At least on *nix you can bypass the linker by passing the static lib in as "extra_objects" -- I think. Something like that.

3) What you did -- build the .so and ship it with the extension.

HTH,
-Chris
--
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA 98115        (206) 526-6317   main reception

Chris.Barker@noaa.gov
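
For option (2), the static library can be handed straight to the linker through `extra_objects`; a minimal sketch, assuming project A has already been built as `libprojecta.a` (the names and paths here are made up):

```
from setuptools import setup, Extension

binding = Extension(
    'my_binding_package._binding',
    sources=['my_binding_folder/_binding.c'],
    include_dirs=['project_a/include'],
    # Pass the prebuilt static library directly instead of -l/-L flags,
    # so no separate .so has to be shipped or found at runtime.
    extra_objects=['project_a/build/libprojecta.a'],
)

setup(
    name='my_binding_package',
    packages=['my_binding_package'],
    ext_modules=[binding],
)
```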

Thanks for your detailed replies.

@Ionel: thanks for reminding me about extending the other command classes!

@Daniel @Chris: thanks for telling me about the Extension solution; I didn't know about that before. But the project A I rely on is a complex one with its own CMakeLists.txt, and I don't want to re-implement its compile process with a setuptools Extension. The .so produced by project A will also be used by other programs.

I think what makes this awkward is that project A only compiles the .so; it doesn't install it into the system (such as /usr/local/lib/) where programs can normally find it.

I plan to change my solution to this: I put a `config.ini` file in my python-binding package, and `config.ini` contains the path to the .so produced by project A. When the user runs setup.py, it first tries to find the .so at the path given in `config.ini`:

- If the .so is found, the search result is reported and `config.ini` is updated accordingly.
- If it is not found, setup.py fails and tells the user that `config.ini` must be configured before installation.

Would this solution be more elegant in this case?
-- Best wishes, 杨骁
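
A sketch of what that check could look like at the top of setup.py; the `config.ini` section and key names are invented here for illustration:

```
import configparser
import os
import shutil
import sys

config = configparser.ConfigParser()
config.read('config.ini')

so_path = config.get('project_a', 'shared_lib', fallback='')

if not os.path.isfile(so_path):
    sys.exit('Shared_lib.so not found at %r -- please edit config.ini '
             'before running setup.py' % so_path)

print('Found shared library at %s' % so_path)

# Copy it next to the binding so package_data picks it up.
shutil.copy2(so_path, os.path.join('my_binding_folder', 'Shared_lib.so'))

from setuptools import setup

setup(
    name='my_binding_package',
    packages=['my_binding_package'],
    package_dir={'my_binding_package': 'my_binding_folder'},
    package_data={'my_binding_package': ['Shared_lib.so']},
    include_package_data=True,
)
```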

Participants (4): Chris Barker, Daniel Holth, Ionel Cristian Mărieș, Young Yang