Building wheels - project metadata, and specifying compatibility
When building wheels, it is necessary to know the compatibility requirements of the code. The most common case is pure Python code, which could in theory be valid for a single Python version, but in practice is more likely to be valid either for all Pythons, or for just Python 2 or Python 3 (where separate code bases or 2to3 are involved). The wheel project supports a "universal" flag in setup.cfg, which sets the compatibility tag to 'py2.py3', but that covers only one case. Ultimately, we need a means (probably in metadata) for pure Python projects to specify any of the following:

1. The built code works on any version of Python (that the project supports).
2. The built code is specific to the major version of Python it was built with.
3. The built code is only usable with the precise Python version it was built with.

The default is currently (3), but this is arguably the least common case. Nearly all code supports at least (2), and more and more supports (1).

Note that this is separate from the question of which versions the project supports; it's about how the code is written. Specifically, there's no point in marking code that uses new features from Python 3.3 as .py33 - it's still .py3, as it will work with Python 3.4. The fact that it won't work on Python 3.2 is just because the project doesn't support Python 3.2. Installing a .py3 wheel into Python 3.2 is no different from installing an sdist there, so overspecifying the wheel compatibility just so that an sdist gets picked up for earlier versions isn't helpful.

In addition to a means for projects to specify this themselves, tools (bdist_wheel, pip wheel) should probably have a way to override the default at the command line, as it will be some time before projects specify this information, even once it is standard. There's always the option to rename the generated file, but that feels like a hack...

Where C extensions are involved, there are other questions.
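To make the tag mechanics concrete: the compatibility tags end up encoded in the wheel filename, and a compressed tag set like 'py2.py3' expands into multiple (python, abi, platform) triples. A small sketch following the wheel filename convention from PEP 427 (`parse_wheel_tags` here is a hypothetical helper, not part of any tool):

```python
import itertools

def parse_wheel_tags(filename):
    """Split a wheel filename into its tag triples (PEP 427).

    Filenames look like name-version(-build)?-python-abi-platform.whl,
    and a compressed tag set such as 'py2.py3' expands to one
    (python, abi, platform) triple per combination.
    """
    stem = filename[:-len(".whl")]
    python, abi, platform = stem.split("-")[-3:]
    return list(itertools.product(
        python.split("."), abi.split("."), platform.split(".")))

# A "universal" wheel advertises both major versions (case 1 above):
print(parse_wheel_tags("example-1.0-py2.py3-none-any.whl"))
# -> [('py2', 'none', 'any'), ('py3', 'none', 'any')]

# A default bdist_wheel build on CPython 3.3 is version-specific (case 3):
print(parse_wheel_tags("example-1.0-py33-none-any.whl"))
# -> [('py33', 'none', 'any')]
```

This is why the "rename the file" workaround works at all: installers only consult the filename tags, so rewriting `py33` to `py2.py3` changes what the wheel claims to support without touching its contents.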
Mostly, compiled code is implementation-, architecture-, and minor-version-specific, so there's little to do here. The stable ABI is relevant, but I have no real experience of using it to know how that would work.

There is also the case of projects with C accelerators - it would be good to be able to easily build both the accelerated version and a fallback pure-Python wheel. I don't believe this is easy as things stand: distutils uses a compiler if one is present, so forcing a pure-Python build when you have a compiler is harder work than it needs to be when building binary distributions.

Comments? Should the default in bdist_wheel and pip wheel be changed, or should it remain "as safe as possible" (practicality vs purity)? If the latter, should override flags be added, or is renaming the wheel in the absence of project metadata the recommended approach? And does anyone have any experience of how this might all work with C extensions?

Paul
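For context on the accelerator case: the workaround projects typically use is a `build_ext` subclass that swallows compiler failures so the build degrades to the pure-Python code path instead of aborting. A sketch against the distutils/setuptools `build_ext` API (the class name and warning messages are illustrative, not from any particular project):

```python
try:
    # distutils was removed from the stdlib in Python 3.12;
    # setuptools provides equivalent classes under its own names.
    from distutils.command.build_ext import build_ext
    from distutils.errors import (CCompilerError, DistutilsExecError,
                                  DistutilsPlatformError)
except ImportError:
    from setuptools.command.build_ext import build_ext
    from setuptools.errors import CCompilerError
    from setuptools.errors import ExecError as DistutilsExecError
    from setuptools.errors import PlatformError as DistutilsPlatformError


class optional_build_ext(build_ext):
    """build_ext variant that falls back to pure Python on failure."""

    def run(self):
        try:
            build_ext.run(self)
        except DistutilsPlatformError:
            print("WARNING: no C compiler found; using pure-Python fallback")

    def build_extension(self, ext):
        try:
            build_ext.build_extension(self, ext)
        except (CCompilerError, DistutilsExecError, DistutilsPlatformError):
            print("WARNING: could not build %r; using pure-Python fallback"
                  % ext.name)
```

A project would wire this in via `setup(cmdclass={"build_ext": optional_build_ext}, ...)`. Note that this only stops the build from failing; it still produces a single wheel per build, so it doesn't solve the problem of deliberately producing both an accelerated wheel and a pure-Python one from the same tree.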
participants (2)
- Daniel Holth
- Paul Moore