> My one technical issue is with going beyond zipimport behaviour to the point of extracting DLLs to the filesystem. I remain -1 on that feature, and I believe I have explained why I think there are issues (and why I think that any solution should be part of zipimport and not added on in library or user code). But I'm happy to go through the details again, if you like - or just to accept that I don't need to use the feature.
Yes please, let's get into some details. Of course I understand that you might not want to use the feature, but I don't understand the -1 on the feature per se - whether it lives in distlib or in zipimport is a secondary consideration. I agree that zipimport is the logical place for it, but ISTM the reason why it can't go in there just yet is also the reason why one might have some reservations about the feature: binary compatibility. I accept that this is not yet a fully resolved issue in general (cf. the parallel discussion about numpy), but if we can isolate these issues, we can perhaps tackle them. For me, that's the main reason why this part of the distlib API is experimental.

Since zipimport makes no prescription about the contents of a zip (beyond describing how importing works), there is no agreed place to put metadata such as binary compatibility information. Nor is binary compatibility between third-party packages of core concern to python-dev, so I'm not sure discussions there will be fruitful. However, such compatibility is a valid concern here, so I would expect useful input to come from interested parties.

Also, the wheel format already caters for a limited set of binary compatibility info to be communicated, but this information is incomplete, which results in reservations about the dangers of extracting C extensions. In a sense, the contentiousness of extracting these from a wheel is a red herring, because the exact same issues would arise if you installed from the wheel and then tried to use software which purported to be binary compatible with your system, but wasn't. That's why we don't put Linux wheels on PyPI, right? It's to avoid binary wheels compiled on e.g. CentOS ending up on e.g. an Ubuntu system which is not quite binary compatible. But that doesn't solve the problem at source, so much as act as a prophylactic.

Much of the Python community works in a POSIX environment where build-from-source is the norm and binary compatibility is a non-issue. In a corporate environment with a homogeneous infrastructure, the same applies; but in a heterogeneous environment, not so much. I think some benefit would accrue to packaging as a whole if we had better definitions of binary compatibility, the ability to express it in more detail, how to test for it at installation time, and so on. I think this is one of the areas where we can and should improve WHEEL metadata. Perhaps the NumPy/SciPy readers would care to comment, as we're talking beyond the realms of Python version compatibility.

If you have other reasons for your -1, I'd like to hear them.
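For context, the only check the current metadata really supports is tag-based. A minimal sketch, assuming distlib's is_compatible helper (the wheel filename below is made up): it tells you whether the (python, abi, platform) tags match the running interpreter, but nothing about the libc, distro or toolchain the extensions were built against - which is exactly the gap I'm talking about.

# Sketch of today's tag-based compatibility check via distlib.
# The wheel filename is invented for illustration. The check only
# compares (python, abi, platform) tags against the running
# interpreter; it knows nothing about the libc/distro/compiler the
# C extensions were actually built against.
from distlib.wheel import Wheel, is_compatible

w = Wheel('SomePackage-1.0-cp27-cp27mu-linux_x86_64.whl')
if is_compatible(w):
    print('tags match this interpreter (not the same thing as ABI-safe)')
else:
    print('rejected on tags alone')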
> (Would you be willing to add some sort of "never extract C extensions regardless of what the metadata might say" option to wheel mount? It's not critical, as mounting random wheels without knowing how they work is clearly bad, but it does add a level of assurance that might be helpful.)
Currently, extraction only happens if there is metadata in the wheel to indicate that extraction should occur, and it's up to the wheel builder to put it there. It doesn't make sense in general to prohibit just the C extensions while still allowing the pure-Python modules in the wheel to be imported: the pure-Python bits mightn't work properly if the C extensions aren't available. So it seems safer in general to have all-or-nothing, or else to have an additional flag in the metadata which indicates that the C extensions have to be extracted for the wheel to be useful. End-user control of mounting should be at the discretion of the developer of the software which invokes the mount method.

Regards,

Vinay Sajip
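P.S. To illustrate the point about the calling application being in charge: a rough sketch of how an application could gate mounting behind its own setting, which is where I'd expect a "never mount wheels that might extract extensions" policy to live. The flag and helper names are made up; only Wheel and its mount()/unmount() methods are distlib's (experimental) API.

# Made-up names: ALLOW_WHEEL_MOUNTING, mount_if_permitted. Only Wheel
# and its mount()/unmount() methods come from distlib's experimental API.
from distlib.wheel import Wheel

ALLOW_WHEEL_MOUNTING = False   # e.g. read from the application's own config

def mount_if_permitted(wheel_path):
    """Mount the given wheel for direct import, but only if the
    application's policy allows it (all-or-nothing, as described above)."""
    if not ALLOW_WHEEL_MOUNTING:
        raise RuntimeError('mounting %s is disabled by application policy'
                           % wheel_path)
    w = Wheel(wheel_path)
    w.mount()      # adds the wheel to sys.path; may extract C extensions
    return w       # caller is responsible for calling w.unmount() later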