pip and missing shared system library
pip install package often results in compiling (using gcc, g++, whatever) to produce a binary. Usually that proceeds without issue. However, there seems to be no checking that the libraries required to link that binary are already on the system. Or at least the message which results when they are not is not at all clear about what is missing.

I discovered that today by wasting several hours figuring out why scanpy-scripts was failing trying to build the dependency "louvain", which would not install into a venv with pip. It had something to do with "igraph", but pip had downloaded python-igraph before it got to louvain. When louvain tried to build there was a mysterious message about pkgconfig and igraph:

    Cannot find the C core of igraph on this system using pkg-config.

(Note that when python-igraph installs it places an igraph directory in site-packages, so which igraph it is referring to is fairly ambiguous.) Then it tried to install a different version number of igraph, failed, and the install failed. This was very confusing because the second igraph install was not (it turned out) a different version of python-igraph but a system-level igraph library, which it could not install either, because the process was not privileged and could not write to the target directories. Yet it tried to install anyway. This is discussed in the louvain documentation here (it turns out):

https://github.com/vtraag/louvain-igraph

but since I was actually trying to install a different package, of course I had not read the louvain documentation.

In short form the problem was "cannot build a binary because the required library libigraph.so is not present in the operating system", but that was less than obvious in the barrage of warnings and error messages.

Is it possible to tell pip or setup.py to fail immediately when a required system library like this is not found, here presumably after that "C core" message, rather than confusing the matter further with a failed partial build and install of the same component?

More generally, is there anything in the python installation methods which could list system libraries as dependencies and give a more informative error message when they are missing?

Thanks,

David Mathog
I like the general idea, but I feel it's not going to be doable in practice. Many of the C libraries are not necessarily installed in the usual places like `/usr/shared/lib` (drivers, for instance), or you can't be 100% sure about it. And that doesn't even account for Windows, which might behave quite differently. What about Python packages that come with their own C libraries (compiled on install): numpy / tensorflow / torch / cupy / etc.?

I'm not against the idea, I just don't see a good way of doing it. For example, do you want to check only the system libraries, or also `LD_LIBRARY_PATH` and `LIBRARY_PATH`? Do you want to check inside the user's .bashrc for modifications of env vars (and what if they don't use bash)?

It honestly sounds difficult to design a feature like this,

Jonathan
Exactly. Python actually specifies metadata around this (Requires-External: https://packaging.python.org/specifications/core-metadata/#requires-external...), but I don't believe pip implements it at all, since there are almost no sensible rules available on how the external libraries can be located in a cross-platform way.

Conda is probably the best bet when you need to deal with tight cross-language package integration like this, by punting the whole idea of system libraries and installing a separate copy of everything you need.

--
Tzu-ping Chung (@uranusjr)
uranusjr@gmail.com
https://uranusjr.com
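For reference, Requires-External is a multiple-use field in a package's core metadata (the PKG-INFO / METADATA file). Entries look like the following; the first two are the examples given in the core-metadata specification, and the libpng line is illustrative:

    Requires-External: C
    Requires-External: libjpeg (>6b)
    Requires-External: libpng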
On Wed, Aug 5, 2020 at 5:05 PM Tzu-ping Chung <uranusjr@gmail.com> wrote:
Exactly. Python actually specifies metadata around this (Requires-External), but I don't believe pip implements it at all, since there are almost no sensible rules available on how the external libraries can be located in a cross-platform way.
Locating the libraries would have to be platform specific, but pip could easily know to try pkg-config on Linux, and if that fails, run a tiny test which does nothing but attempt to link. If any of that fails then the package in question will likely fail too.

Neither louvain nor python-igraph contains a Requires-External in their dist-info files. Looking at the setup.py for louvain here:

https://github.com/vtraag/louvain-igraph/blob/master/setup.py

around line 491 is the code for pkg-config and the "core" message. It looks like it should exit when pkg-config fails, but that is not what happened. That is 0.8.0; installed is 0.6.1. Pulled the latter down with:

    pip3 download louvain==0.6.1

and unpacked it, and found starting at line 416:

    def detect_from_pkgconfig(self):
        """Detects the igraph include directory, library directory and the
        list of libraries to link to using ``pkg-config``."""
        if not buildcfg.has_pkgconfig:
            print("Cannot find the C core of igraph on this system using pkg-config.")
            return False

So as observed, it would not immediately abort when it could not find the installed library. This shows the problem with leaving Requires-External to each package's setup.py. Doing that means the warnings will differ from package to package, or possibly even version to version of the same package.
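To make the fail-fast idea concrete, here is a minimal sketch of what such a check in a setup.py could look like: probe pkg-config up front and abort with one clear message before any compile step runs. The helper name is invented, and it assumes pkg-config is on PATH:

    import shutil
    import subprocess
    import sys

    def require_system_lib(pkg_name):
        """Abort immediately if pkg-config cannot find the named C library."""
        if shutil.which("pkg-config") is None:
            sys.exit("error: pkg-config is needed to locate %s but was not found" % pkg_name)
        # `pkg-config --exists` returns a nonzero status when no .pc file
        # for the library is installed on this system.
        if subprocess.run(["pkg-config", "--exists", pkg_name]).returncode != 0:
            sys.exit("error: the %s C library is required but not installed "
                     "on this operating system" % pkg_name)

    require_system_lib("igraph")  # fail here, before any build is attempted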
Conda is probably the best bet when you need to deal with tight cross-language package integration like this, by punting the whole idea of system libraries and installing a separate copy of everything you need.
I have been trying very hard NOT to have multiple copies of everything, hence my prior work on python_devirtualizer, which takes venv installs, unpacks them, reduces the common pieces to a single copy, and wraps the "programs" so that they will start and run properly when they are found on PATH:

https://sourceforge.net/projects/python-devirtualizer/

I suppose an equivalent set of scripts for "conda" would be possible, but I think much more difficult since it does more.

Anyway, why is Requires-External (apparently) so little used? Is this a chicken/egg problem, where nobody specifies it because pip ignores it, and pip ignores it because nobody uses it?

One can see how the Requires-External entries could be automatically generated. For instance, louvain has only one .so, which might be processed starting something like this:

    ldd _c_louvain.cpython-36m-x86_64-linux-gnu.so | grep -v linux-vdso.so | grep -v ld-linux | grep -v libpython
        libigraph.so.0 => /lib64/libigraph.so.0 (0x00007f42bb622000)
        libstdc++.so.6 => /lib64/libstdc++.so.6 (0x00007f42bad42000)
        libm.so.6 => /lib64/libm.so.6 (0x00007f42ba9c0000)
        libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x00007f42ba7a8000)
        libpthread.so.0 => /lib64/libpthread.so.0 (0x00007f42ba588000)
        libc.so.6 => /lib64/libc.so.6 (0x00007f42ba1c6000)
        libxml2.so.2 => /lib64/libxml2.so.2 (0x00007f42b9e5e000)
        libz.so.1 => /lib64/libz.so.1 (0x00007f42b9c47000)
        liblzma.so.5 => /lib64/liblzma.so.5 (0x00007f42b9a20000)
        libdl.so.2 => /lib64/libdl.so.2 (0x00007f42b981c000)
        libgmp.so.10 => /lib64/libgmp.so.10 (0x00007f42b9584000)
        libcrypto.so.1.1 => /lib64/libcrypto.so.1.1 (0x00007f42b90a1000)
        libutil.so.1 => /lib64/libutil.so.1 (0x00007f42b8e9d000)

which is processed to become:

    Requires-External: libigraph
    Requires-External: libstdc++
    (etc)
    Requires-External: libutil

For a more complicated package, run the same method on all dynamic binaries and libraries and reduce the result to one copy of each. Determining versions would be harder though, perhaps impossible to do automatically. igraph on my system is 0.8.2, so that is sufficient, but there would be no way of knowing if 0.8.1 would also work, or if 0.9.0 would break things.

Regards,

David Mathog
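A rough sketch of that generation step, assuming a Linux system with ldd on PATH; the filter list and output format follow the example above, and the script itself is hypothetical:

    import re
    import subprocess
    import sys

    # Loader and interpreter entries that say nothing about the package's
    # own external dependencies.
    IGNORE = ("linux-vdso.so", "ld-linux", "libpython")

    def requires_external_from_ldd(so_path):
        """Run ldd on a compiled extension and emit Requires-External lines."""
        out = subprocess.run(["ldd", so_path], capture_output=True,
                             text=True, check=True).stdout
        names = set()
        for line in out.splitlines():
            match = re.match(r"\s*(\S+)\.so[.\d]*\s+=>", line)
            if match and not any(skip in line for skip in IGNORE):
                names.add(match.group(1))
        return ["Requires-External: %s" % name for name in sorted(names)]

    if __name__ == "__main__":
        print("\n".join(requires_external_from_ldd(sys.argv[1])))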
On Thu, Aug 6, 2020 at 11:39 AM David Mathog <dmathog@gmail.com> wrote:
Looking at the setup.py for louvain here:
https://github.com/vtraag/louvain-igraph/blob/master/setup.py
around line 491 is the code for pkg-config and the "core" message. It looks like it should exit when pkg-config fails, but that is not what happened.
If the code that failed to give a good error message is in louvain-igraph, then you should probably talk to them about that :-). There's no way for the core packaging libraries to guess what this kind of arbitrary package-specific code is going to do.

-n

--
Nathaniel J. Smith -- https://vorpus.org
On Thu, Aug 6, 2020 at 11:54 AM Nathaniel Smith <njs@pobox.com> wrote:
If the code that failed to give a good error message is in louvain-igraph, then you should probably talk to them about that :-). There's no way for the core packaging libraries to guess what this kind of arbitrary package-specific code is going to do.
That was the point I was trying to make, albeit not very well I guess. Because Requires-External was not supplied, and pip would not have done anything with it even if it had been, the package had to roll its own check.

The documentation for Requires-External says what it requires, but it does not indicate that anything else happens besides (I assume) the installation halting if the condition is not met. That is, if there is:

    Requires-External: libpng

and pip acts on it, that means it found libpng.so, but there does not seem to be any requirement that it communicate any further information about libpng to setup.py in any standard way. Which is why the setup.py for louvain rolled its own.

For posixy OSes it would be sufficient to know that if the "Requires-External" test passed, then "pkg-config --cflags libpng" and the like will work. But again, that pushes the work into setup.py, where it will not be standardized nor platform agnostic. So for better portability, passing one of these tests should also set some standard variables like:

    RE_libpng_cflags="-lpng16 -lz"
    RE_libpng_includedir="/usr/include"
    RE_libpng_libdir="/usr/lib64"
    (and so forth)

which are then seen in setup.py. Yes, these are just the various values already in the libpng.pc file; there is no reason to reinvent that wheel. The result should be simpler setup.py's which are portable, without requiring all the conditional "if it is this OS then look here" logic that they must currently contain.

Regards,

David Mathog
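One way those values can be surfaced today, without waiting for pip, is the `pkgconfig` package on PyPI, which wraps the pkg-config executable for use inside setup.py. A minimal sketch; the keyword keys shown are the ones that library documents for its parse() helper, but verify them against its README, and "libpng" is just the running example:

    # setup.py fragment; assumes `pkgconfig` from PyPI and a system pkg-config.
    import pkgconfig
    from setuptools import Extension, setup

    if not pkgconfig.exists("libpng"):  # fail fast with one clear message
        raise SystemExit("error: the libpng C library was not found via pkg-config")

    # parse() returns the .pc file's values as Extension keyword arguments:
    # include_dirs, library_dirs, libraries, define_macros.
    flags = pkgconfig.parse("libpng")

    setup(
        name="example",
        ext_modules=[Extension("example._png", ["src/_png.c"], **flags)],
    )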
On Thu, Aug 6, 2020 at 3:06 PM David Mathog <dmathog@gmail.com> wrote:
That was the point I was trying to make, albeit not very well I guess. Because Requires-External was not supplied, and pip would not have done anything with it even if it had been, the package had to roll its own check. [...]
Unfortunately, successfully building C libraries is way, way more complicated than that. There are nearly as many ways to detect and configure C libraries as there are C libraries; tools like pkg-config help a bit but they're far from universal. There can be multiple versions of libpng on the same system, with different ABIs. pip doesn't even know what compiler the package will want to use (which also affects which libraries are available). And at the end of the day, the only thing pip could do with this information is print a slightly nicer error message than you would get otherwise.

What pip *has* done in the last few years is made it possible to pull in packages from PyPI when building packages from source, so you can make your own pkg-config-handling library and put it on PyPI and encourage everyone to use it instead of reinventing the wheel. Or use more powerful build systems that have already solved these problems, e.g. scikit-build lets you use CMake to build python packages.

-n

--
Nathaniel J. Smith -- https://vorpus.org
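The mechanism referred to here is the PEP 518 [build-system] table in pyproject.toml, which lets a source build declare PyPI packages that pip installs into an isolated environment before setup.py runs. A minimal sketch, with "pkgconfig" standing in for whatever pkg-config-handling helper a project actually chooses:

    # pyproject.toml -- build requirements fetched from PyPI at build time
    [build-system]
    requires = ["setuptools", "wheel", "pkgconfig"]
    build-backend = "setuptools.build_meta"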
Unfortunately, successfully building C libraries is way, way more complicated than that. There are nearly as many ways to detect and configure C libraries as there are C libraries; tools like pkg-config help a bit but they're far from universal.
Agreed that building a library is more complicated. (Building a library, or anything for that matter, which depends on boost is even worse.) Nevertheless, to do so the information provided by pkg-config will always be required. It might not be sufficient, of course. As for looking up this information, I am only aware of pkg-config and pkgconf, and on many systems one is just a soft link to the other. That is also what is used on Windows within Mingw. So it would not be unreasonable to specify that this is the source of the information in all Posix environments.
There can be multiple versions of libpng on the same system, with different ABIs.
Requires-External supports version ranges, and pkg-config will show the version which is installed. If Requires-External is ever to have real usage, presumably it would have to be compatible with pkg-config in Posix environments; that is, how would it ever work otherwise? Users who have placed .pc files in odd locations would have to modify PKG_CONFIG_PATH before running pip or these would not be found. They would also have to specify "libname" or "libname2", as appropriate, in some cases.
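pkg-config can already answer the version-range question directly. A small sketch of how an installer might test a bound like libpng (>=1.5); the package name and bound are illustrative:

    import subprocess

    def satisfies(pkg_name, min_version):
        """True if pkg-config finds pkg_name at min_version or newer."""
        # --atleast-version exits 0 on success and prints nothing;
        # `pkg-config --modversion <name>` prints the installed version,
        # useful for a clear error message on failure.
        result = subprocess.run(
            ["pkg-config", "--atleast-version=" + min_version, pkg_name])
        return result.returncode == 0

    # e.g. satisfies("libpng", "1.5") -> True or False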
pip doesn't even know what compiler the package will want to use (which also affects which libraries are available).
I had wondered about that. In the spec there is an example:

    Requires-External: C

which seems to be a requirement for a C compiler, but since it does not specify which one, the test could pass if it finds the Intel compiler even though setup.py only knows how to build with gcc. Or vice versa.
And at the end of the day, the only thing pip could do with this information is print a slightly nicer error message than you would get otherwise.
In the case that started this thread, a simple "The igraph library is required but not installed on this operating system" message followed by an exit would have saved a considerable amount of time. So while it isn't much, it is more than we have currently.
What pip *has* done in the last few years is made it possible to pull in packages from PyPI when building packages from source, so you can make your own pkg-config-handling library and put it on PyPI and encourage everyone to use it instead of reinventing the wheel. Or use more powerful build systems that have already solved these problems, e.g. scikit-build lets you use CMake to build python packages.
I think that is what happened this time, but there was no test to see if the package it built could be installed where it wanted to put it, so it failed. At least I think that is what happened. In any case, it did pull igraph down from PyPI but the installation failed.

One other point about "Requires-External": as described, it lacks a special case "none". (None really means "just the python version which is running pip".) That is, there is currently no way to distinguish between "this package has no external requirements" and "the external requirement specification is incomplete". This information really should be mandatory, even if it is just to tell a person what must be installed in the OS before running pip.

One can imagine a utility analogous to "johnnydep" which would traverse a proposed package install and verify that all the Requires-External entries are in fact satisfied, or minimally, just list them. Pip should warn when no "Requires-External" entries are present, and "Requires-External: none" would always suppress that warning.

Regards,

David Mathog
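A starting point for such a checker fits in a few lines, since installed distributions expose their core metadata through the standard library's importlib.metadata (Python 3.8+). The pkg-config fallback mirrors the sketches above, and the function name is made up:

    import subprocess
    from importlib.metadata import distributions

    def check_requires_external():
        """List each installed distribution's Requires-External entries and
        whether pkg-config can satisfy them on this system."""
        for dist in distributions():
            for entry in dist.metadata.get_all("Requires-External") or []:
                name = entry.split()[0]  # drop any "(>=x.y)" version suffix
                found = subprocess.run(
                    ["pkg-config", "--exists", name]).returncode == 0
                print("%s: %s [%s]" % (dist.metadata["Name"], entry,
                                       "ok" if found else "MISSING"))

    check_requires_external()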
If you want to push a little further: often, when you build an extension which statically or dynamically links another library, you will need to use the same compiler; otherwise you might bump into ABI compatibility issues. So do you plan on "managing" which version of GCC or g++ people have, and issuing a warning if they don't have the right one? How are you even supposed to find out? Many libraries expose a flag that tells you which compiler version or range of versions can be used, but there is no standard on this matter... Don't get me wrong, it would be awesome if it worked. I just don't see a way to handle all these constraints...

Jonathan
On Sat, Aug 8, 2020 at 8:15 PM Jonathan DEKHTIAR <contact@jonathandekhtiar.eu> wrote:
So do you plan on "managing" which version of GCC or g++ people have, and issuing a warning if they don't have the right one?
A setup.py will always be written for a particular compiler, or maybe it will handle a couple, but they never handle a "general compiler". That was why the example in the spec,

    Requires-External: C

never made sense. It always should have been something like:

    Requires-External: gcc (>4.0)

There is no logic available at that level, as far as I can tell. So if a package needed gcc on Posix or an MS compiler on Windows, how would one specify that? For that matter, if it could use either gcc or Intel's compiler on Posix, how would that be indicated? Maybe there is some specification-level logic which can be used to wrap these statements?
How are you even supposed to find out?
pkg-config, in any Posix environment. Within a pure Windows environment or on some obscure OS, I have no idea. Just skip this test if it is not supported in a given environment? Better that it works in some environments than in none.
Don't get me wrong, it would be awesome if it worked. I just don't see a way to handle all these constraints...
I would be happy if it handled _any_ of these constraints. At the moment adding these lines does nothing. Regards, David Mathog
On Aug 9, 2020, at 12:59, David Mathog <dmathog@gmail.com> wrote:
How are you even supposed to find out?
pkg-config, in any Posix environment. Within a pure Windows environment or on some obscure OS, I have no idea. Just skip this test if it is not supported in a given environment? Better that it works in some environments than in none.
Just to be clear, pkg-config is not part of any Posix standard, AFAIK, so you cannot depend on it being available. For example, Apple does not include pkg-config in macOS releases (although third-party package distributors like Homebrew and MacPorts do optionally provide it for their packages). And use of pkg-config doesn't solve finding dependencies at run time, which may be a different environment than that at build time.

--
Ned Deily
nad@python.org
On Sun, Aug 9, 2020 at 10:21 AM Ned Deily <nad@python.org> wrote:
Just to be clear, pkg-config is not part of any Posix standard, AFAIK, so you cannot depend on it being available.
Understood. However, if that is not employed, what reasonable method remains for implementing "Requires-External"? The only thing I can think of is to specify exact library or program names, like:

    Requires-External: gcc
    Requires-External: libpng.so

and those could be found by searching the whole directory tree. That might even be efficient if updatedb/locate are available. However, going that way, how would one determine version compatibility on a library? Doing it through the package manager may be possible, but it is a multistep process:

1. look up libpng.so -> PATHPNG
2. rpm -q --whatprovides $PATHPNG -> name of package
3. analyze "name of package" for version information

Much easier, one suspects, to install pkg-config on systems which do not yet have it than to completely reimplement it.

Does OS X have something which is equivalent to pkg-config, or is there just no way to look up this sort of information on that OS?

Regards,

David Mathog
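For what it's worth, step 1 already has a standard-library answer that also covers OS X: ctypes.util.find_library searches each platform's usual locations. The rpm steps below apply only to RPM-based systems, and the query may need a full file path (or a capability string like "libpng16.so.16()(64bit)" on 64-bit systems), so treat this purely as a sketch of the multistep lookup:

    import subprocess
    from ctypes.util import find_library

    # Step 1: resolve a short name to the soname/path the loader would use,
    # e.g. "png" -> "libpng16.so.16" on Linux, a full .dylib path on macOS.
    soname = find_library("png")
    if soname is None:
        raise SystemExit("libpng not found on this system")

    # Steps 2-3 (RPM-based systems only): map the library back to its
    # owning package, then parse that name for version information.
    pkg = subprocess.run(["rpm", "-q", "--whatprovides", soname],
                         capture_output=True, text=True).stdout.strip()
    print(soname, "->", pkg)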
Are you requesting an implementation of autotools / autoconf / pkg-config / libtool in Python, in setuptools?

Existing workarounds for building and distributing portable binaries:

W/ shared library dependencies:
- auditwheel & manylinux
- package managers which support arbitrary binary packages in languages other than python:
  - conda
  - RPM / DEB / ...
  - bdist_rpm
  - bdist_deb
  - FPM

W/ static dependencies:
- zipapp
- bazel / buck build / pants build (BUILD files)
- py2app, py2exe, pyinstaller: https://github.com/vinta/awesome-python#distribution
On Sun, Aug 9, 2020, 3:28 PM Wes Turner <wes.turner@gmail.com> wrote:

W/ shared library dependencies:
- auditwheel & manylinux

"""
`auditwheel show`: shows external shared libraries that the wheel depends on (beyond the libraries included in the manylinux policies), and checks the extension modules for the use of versioned symbols that exceed the manylinux ABI.

`auditwheel repair`: copies these external shared libraries into the wheel itself, and automatically modifies the appropriate RPATH entries such that these libraries will be picked up at runtime. This accomplishes a similar result as if the libraries had been statically linked without requiring changes to the build system. Packagers are advised that bundling, like static linking, may implicate copyright concerns.
"""
https://github.com/pypa/auditwheel#overview
On 9 Aug 2020, at 18:59, David Mathog <dmathog@gmail.com> wrote:
On Sat, Aug 8, 2020 at 8:15 PM Jonathan DEKHTIAR <contact@jonathandekhtiar.eu> wrote:
So do you plan on "managing" which version of GCC or g++ people have, and issuing a warning if they don't have the right one?
A setup.py will always be written for a particular compiler, or maybe it will handle a couple, but they never handle a "general compiler".
Except that almost all extensions written in C require a “general C compiler”, not some version of GCC.
That was why the example in the spec,

    Requires-External: C

never made sense. It always should have been something like:

    Requires-External: gcc (>4.0)
Not unless you write code that uses features specific to GCC, and even then it is questionable, as there are several other compilers that implement a large subset of GCC language extensions (at least icc and clang).

Ronald

--
Twitter / micro.blog: @ronaldoussoren
Blog: https://blog.ronaldoussoren.net/
I hope it won't be obscure. But sometimes, you **need** to compile with the same compiler even if you don't exploit specific compiler features, namely for "Bug Compatibility" reasons: https://en.wikipedia.org/wiki/Bug_compatibility

When you work with low-level libraries and tend to access hardware or driver libraries (think GPUs, FPGAs, etc.), in many cases compiling with GCC or CLANG or whatever compiler was used to compile the libraries you depend on is essential. This is a very common issue when working on specific hardware or embedded software. Compilers are not "perfect"; they have bugs like any other piece of software (I know we tend to forget it). Sometimes we don't care, sometimes we do. And now that deep learning and machine learning are really a big thing, "Bug Compatibility" is an important issue. Try compiling CUDA code with the wrong compiler: I wish you good luck in your debugging :D

Jonathan
On 10 Aug 2020, at 09:31, Jonathan DEKHTIAR <contact@jonathandekhtiar.eu> wrote:
I hope it won't be obscure. But sometimes, you **need** to compile with the same compiler even if you don't exploit specific compiler features, namely for "Bug Compatibility" reasons: https://en.wikipedia.org/wiki/Bug_compatibility
CUDA and embedded systems are not most extensions. For a lot of software the specific compiler used is not an issue, except for compatibility (C++ ABI, runtime ABI for the C runtime). Note that I reacted to a statement that a setup.py is always written for a specific compiler, which is untrue. Most setup.py files with extensions don’t specify a compiler at all, but rely on the compiler detection from distutils/setuptools. Ronald