Efficiently supporting multiple platforms in distribution packages

Hello, I just saw another topic posted, https://mail.python.org/pipermail/distutils-sig/2014-October/025140.html , which raises a question similar to one I've had in mind for some time. To avoid hijacking that thread, I'm opening a new one, but commenting on a message from it, https://mail.python.org/pipermail/distutils-sig/2014-October/025142.html

So, my case: supporting different platforms in one distribution package. To give a more concrete example, consider a module implemented using ctypes, with completely different implementations for Linux, Mac OS X, and Windows. I'd also like to avoid installing files unneeded on a particular platform (a use case that applies to MicroPython, http://micropython.org/ , where there may simply not be enough storage space to install cruft).

On Mon, 27 Oct 2014 14:04:38 +0000 Paul Moore <p.f.moore@...> wrote:
For a source distribution, you could play clever games in setup.py to put the right file in place, with the right name. But that's messy and it means that if you distribute wheels (not that there's much point in doing so) you need separate wheels for 2.6-, 2.7 and 3.3+.
Ok, so are there guidelines, best practices or at least example(s) how to do that? I pretty much would like to avoid inventing my own "clever games" to achieve that.
Alternatively, you could distribute all 3 files, as
dbf/
    __init__.py
    dbf_26.py
    dbf_27.py
    dbf_3.py
Then in __init__.py do
import sys

if sys.version_info[0] == 3:
    from .dbf_3 import *
elif sys.version_info[:2] == (2, 7):
    from .dbf_27 import *
else:
    from .dbf_26 import *
For our MicroPython case, we would like to avoid this for the aforementioned reasons. We could probably post-process the installation dir after pip runs to remove unneeded files, but that sounds like a hack - we'd rather do it fully at the distribution-package level, and aim to install just a single source file per module, avoiding a dispatcher like the above (again, for efficiency reasons).

There's also another issue - that of supporting "cross-installs". Not all MicroPython targets can run pip natively, so it instead runs on a host computer, but would need to be instructed to select source variants for a particular target platform.

Any hints/pointers on how to achieve this - preferably in a "standard" way - are appreciated!

-- Best regards, Paul mailto:pmiscml@gmail.com
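[Editor's note: the version dispatcher quoted above translates directly to a per-platform dispatcher. The following is a minimal sketch of the selection logic such a package `__init__.py` could apply; the module names (`dbf_linux` etc.) are illustrative placeholders, not files from the thread.]

```python
import sys

# Hypothetical mapping from sys.platform prefixes to per-platform modules.
# A real package __init__.py would follow this with `from . import <name>`.
_VARIANTS = [
    ("linux", "dbf_linux"),    # sys.platform is "linux2" on Python 2
    ("darwin", "dbf_macosx"),
    ("win32", "dbf_windows"),
]

def select_variant(platform=None):
    """Return the per-platform module name for a given sys.platform value."""
    platform = sys.platform if platform is None else platform
    for prefix, module in _VARIANTS:
        if platform.startswith(prefix):
            return module
    raise ImportError("no variant for platform %r" % platform)
```

This has exactly the drawback the message above objects to: every variant is shipped and installed, even though only one is ever imported.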

On 27 October 2014 15:15, Paul Sokolovsky <pmiscml@gmail.com> wrote:
For a source distribution, you could play clever games in setup.py to put the right file in place, with the right name. But that's messy and it means that if you distribute wheels (not that there's much point in doing so) you need separate wheels for 2.6-, 2.7 and 3.3+.
Ok, so are there guidelines, best practices or at least example(s) how to do that? I pretty much would like to avoid inventing my own "clever games" to achieve that.
Personally, my guideline would be "don't do that". We're trying to move away from having complex code in setup.py, to a more nearly declarative solution. The above approach pretty much directly goes against that. I posted code later in the same thread, but I still don't think people should be doing it.

I see the issue you and Ethan have, though. It should be considered in the context of Metadata 2.0 and things like that, so we make sure the use case is covered.

Paul

The closest thing we have right now is 2to3. It produces different installed code depending on how setup.py was run, and if you produce wheels they have a tag to distinguish them from each other based on the Python version.

It's not wrong to have a complicated build process in setup.py. The confusion is that setup.py both builds the package and generates its metadata. It's common for the list of dependencies in the metadata to change based on the version of Python. Instead, we would prefer that a single package has the same metadata independent of how it is built. The common different-dependencies-per-Python-version-or-OS case is supported with "environment markers". They can be used to turn dependencies on or off with simple expressions. The wheel project's own setup.py uses them.

So my strategy would be to use a single source package to generate several wheels tagged per Python implementation or version. Later the installer would be able to pick the correct one based on its tags.

As ever, beware of trying to extend distutils. It's not very good at that.
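[Editor's note: a hedged sketch of the environment-marker style referred to above, using the `extras_require` spelling that setuptools accepted at the time; the package and dependency names are placeholders, not from the thread.]

```python
# Conditional dependencies keyed by environment markers. The marker after
# the ':' in an extras_require key is evaluated by the *installer*, so the
# built metadata is identical no matter which Python ran the build.
conditional_deps = {
    ':python_version=="2.6"': ["argparse"],  # stdlib backport needed on 2.6 only
    ':sys_platform=="win32"': ["pywin32"],   # Windows-only dependency
}

# In a setup.py this would be passed along as:
#   setup(name="example", version="1.0", extras_require=conditional_deps)
```

The point Daniel makes is that this keeps the metadata declarative: the conditions travel with the package instead of being computed by code at build time.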

Hello,

On Mon, 27 Oct 2014 15:49:21 +0000 Paul Moore <p.f.moore@gmail.com> wrote:
On 27 October 2014 15:15, Paul Sokolovsky <pmiscml@gmail.com> wrote:
For a source distribution, you could play clever games in setup.py to put the right file in place, with the right name. But that's messy and it means that if you distribute wheels (not that there's much point in doing so) you need separate wheels for 2.6-, 2.7 and 3.3+.
Ok, so are there guidelines, best practices or at least example(s) how to do that? I pretty much would like to avoid inventing my own "clever games" to achieve that.
Personally, my guideline would be "don't do that". We're trying to move away from having complex code in setup.py, to a more nearly declarative solution.
It's great to hear there's a move in the declarative direction. And all the above is actually the reason I wrote "I don't want to invent my own way to do it". What I'd like to have is such a declarative/data-driven way to select which files to install, based on some predefined (alternatively, user-defined) criteria. As current setuptools works in an imperative manner, I essentially asked: 1) how to code up conditional file installation with setuptools; 2) how to set up my "custom" metadata/structure in such a way that it would be compatible with the current community direction/best practices towards a future fully declarative solution.

Thanks for the code in the other message, it gives good hints for question 1. I could still use some guidelines on question 2 (or alternatively, as you write below, maybe you/other distutils-sig developers could use it when designing a future solution). Let me give one example, of possibly many, of how these could be structured. In my src dir, I could have:

os.py
PLATFORM-macosx/os.py
PLATFORM-pyboard/os.py

The installer would recognize special directory names starting with a set of predefined property names (uppercased to avoid easily clashing with user dirs). In this case, PLATFORM means sys.platform. If the substring after the hyphen matches its value, the corresponding file is installed; otherwise the file in the main dir is used as a fallback.

So, would the scheme above be along the lines of your thinking of how that should be done, or do you immediately see a bunch of issues with it (I sure do - it's easy on the packager's side, but not so easy on the tool's side to figure out what the packager wanted, so more disambiguation would surely be required)? Or would you have a suggestion for a better structure?

If the intent of the above is still not clear enough, let me put it this way: with MicroPython, we won't be able to use pip in self-hosting mode (for various reasons), so we'd need to develop our own lightweight package manager. It would for sure use a declarative approach.
So, we're looking for guidance on what kind of metadata and approach we should take to stay aligned with the general Python community's outlook on this. (And we want our packages to still be installable with current and future pip, of course.) Thanks!
The above approach pretty much directly goes against that. I posted code later in the same thread, but I still don't think people should be doing it.
I see the issue you and Ethan have, though. It should be considered in the context of Metadata 2.0 and things like that, so we make sure the use case is covered.
Paul
-- Best regards, Paul mailto:pmiscml@gmail.com
participants (3)
-
Daniel Holth
-
Paul Moore
-
Paul Sokolovsky