Re: [Python-Dev] Status of C compilers for Python on Windows

I'm several weeks late to this discussion, but I'm glad to see that it happened. I'm not a Python developer, and barely a user, but I have several years of daily experience compiling complicated scientific software cross-platform, particularly with MinGW-w64 for Windows. The Python community, both core language and scientific package developers and users, needs to act here. The problem is bad and getting worse. Luckily, much of the work to start solving it has already been done in bits and pieces; it needs coordination and participation to come to a conclusion.
Cross compilation is a valid issue, but I hope that build services like Appveyor also help out here. There is regular talk about the PSF/PyPI providing something similar
AppVeyor is better than nothing (I've been using it since beta), but it's a far cry from build services and package management as the Linux world knows them. Obtaining and setting up build dependencies quickly and repeatably, and finishing the build of a complicated piece of software such as CPython, or NumPy, SciPy, Julia (where most of my recent expertise lies), etc. on a small single-core VM with limited memory and a restrictive time limit is often not possible. These problems are solved within Linux infrastructure like Koji, Open Build Service, buildd, etc.

MinGW-w64 is a mature, well-tested toolchain that is very capable of cross-compiling a wide variety of libraries from Linux to Windows, in addition to building conventionally on Windows for Windows. The MSYS2 collection of MinGW-w64-compiled packages (https://github.com/Alexpux/MINGW-packages) has been mentioned. Linux distributions including

- Fedora https://admin.fedoraproject.org/pkgdb/packages/mingw%2A/
- openSUSE https://build.opensuse.org/project/show/windows:mingw:win32
- Arch https://aur.archlinux.org/packages/?K=mingw

and others have projects for providing many hundreds of open-source packages compiled for Windows. Debian has the cross-compilers available but not many packages yet (https://packages.debian.org/search?keywords=mingw).

As a developer of a (compiled) open-source library or application, wouldn't you love to be able to build binaries on Linux for Windows? With some work and build system patches, you can. For many projects it's a simple matter of ./configure --host=x86_64-w64-mingw32. Not with CPython though. CPython is only included in 2 of the above MinGW-w64 distribution projects, MSYS2 and Arch. This is possible with a very, very long set of patches, many of which have been submitted by Roumen Petrov to the Python bug tracker - see http://bugs.python.org/issue17605 and other issues linked therein. Roumen has done a huge amount of work, and he and others who consider the MinGW-w64 compiler important will continue to do so. (Thanks a million Roumen!)
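(To make the cross-compilation workflow described above concrete: a minimal sketch, assuming a Debian/Ubuntu build host with the MinGW-w64 cross-compilers installed and a hypothetical autotools-based project - the package names and file names are illustrative only.)

    # Install the 64-bit Windows cross toolchain (package names vary by distro)
    sudo apt-get install gcc-mingw-w64-x86-64 binutils-mingw-w64-x86-64

    # Cross-compile an autotools-based project for 64-bit Windows
    ./configure --host=x86_64-w64-mingw32 --prefix=/opt/win64
    make && make install

    # Or compile a single C file straight into a Windows DLL
    x86_64-w64-mingw32-gcc -shared -o example.dll example.c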
I could step in as maintainer for Cygwin and builds based on GCC using mingw* APIs.
Regards, Roumen Petrov
A maintainer has volunteered. Others will help. Can any core developers please begin reviewing some of his patches? Even if just to say they need to be rebased. The lack of responses on the bug tracker is disheartening from an outside perspective. The pile of patches accumulating in external MinGW packaging projects is tantamount to a fork of CPython. It won't go away since there are dedicated packagers working to keep their MinGW-w64 builds functional, even in the ad-hoc current state. The patches continue piling up, making it more difficult for everyone - except for the core Python developers if they continue to ignore the problem. Bring the people working on these patches into the fold as contributors. Review the patches. It would make Python as a language and a community even more diverse and welcoming.
Deprecate/remove support for compiling CPython itself with compilers other than MSVC on Windows
I'm not alone in thinking that this would be a bad idea. MSVC can continue to be the default compiler used for Python on Windows, none of Roumen's patches change that. They would merely open up the choice for packagers and users to build CPython (and extension modules, thanks to separate patches) with alternate compilers, in cross-compilation or otherwise. Sincerely, Tony

(Apologies for the short reply, posting from my phone.)

"MSVC can continue to be the default compiler used for Python on Windows, none of Roumen's patches change that. They would merely open up the choice for packagers and users to build CPython (and extension modules, thanks to separate patches) with alternate compilers, in cross-compilation or otherwise."

Building CPython for Windows is not something that needs solving. The culture on Windows is to redistribute binaries, not source, and both the core team and a number of redistributors have this figured out (and it will only become easier with VC14 and Python 3.5).

I'd rather see this effort thrown behind compiling extensions, including cross compilation. The ABI is well defined enough that any compiler should be usable, especially once the new CRT is in use. However, there is work needed to update the various tool chains to link to VC14's CRT and we need to figure out the inconsistencies between tools so we can document and work through them.

Having different builds of CPython out there will only fragment the community and hurt extension authors far more than it may seem to help.

Cheers, Steve
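(For context on what building an extension with an alternate compiler already looks like: a minimal sketch, assuming MinGW-w64's gcc is on PATH and a hypothetical extension project with a setup.py; distutils spells the option "mingw32" regardless of target bitness.)

    # One-off build of the project's extension modules with the MinGW/GCC toolchain
    python setup.py build_ext --compiler=mingw32

    # Or make it the default for every build via the project's setup.cfg
    # (or the per-user pydistutils.cfg):
    #   [build]
    #   compiler = mingw32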

On Sat, Oct 25, 2014 at 6:13 PM, Steve Dower <Steve.Dower@microsoft.com> wrote:
(Apologies for the short reply, posting from my phone.)
"MSVC can continue to be the default compiler used for Python on Windows, none of Roumen's patches change that. They would merely open up the choice for packagers and users to build CPython (and extension modules, thanks to separate patches) with alternate compilers, in cross-compilation or otherwise."
Building CPython for Windows is not something that needs solving. The culture on Windows is to redistribute binaries, not source, and both the core team and a number of redistributors have this figured out (and it will only become easier with VC14 and Python 3.5).
This is the second time you've used the vacuous "culture on Windows" argument, now with an added appeal to (vague) authority. That may be your opinion and that of some others, but there's a large number of people who don't care for using non-Free tools. IMHO, building CPython on Windows using Open Source toolchains is very much something that needs to be merged upstream and supported by default. What is it that you are afraid of if CPython can be compiled out of the box using mingw/MinGW-w64? Why are you fighting so hard against having this option? If CPython wants to truly call itself an Open Source project, then I consider being able to compile and cross-compile it with capable Open Source toolchains on all major platforms a requirement.
I'd rather see this effort thrown behind compiling extensions, including cross compilation. The ABI is well defined enough that any compiler should be usable, especially once the new CRT is in use. However, there is work needed to update the various tool chains to link to VC14's CRT and we need to figure out the inconsistencies between tools so we can document and work through them.
Having different builds of CPython out there will only fragment the community and hurt extension authors far more than it may seem to help.
Cheers, Steve

Ray Donnelly wrote:
What is it that you are afraid of if CPython can be compiled out of the box using mingw/MinGW-w64? Why are you fighting so hard against having this option?
I'm afraid of users having numpy crash because they're using an MSVC CPython instead of a mingw CPython. I'm afraid of users not being able to use library A and library B at the same time because A requires MSVC CPython and B requires mingw CPython. (I can produce more examples if you like, but the general concern is having a fragmented community, as I said in my previous post.)

I'm fighting against "having options" because it will suck up the precious volunteer time we have and direct it away from where it would be more useful, which is making it easier to build extensions with other compilers.

I would love to see extensions for Windows built on all platforms. I see no value in building Python itself for Windows from different platforms.

If other core developers agree with you that a more "pure" build of Python is worthwhile, then they can go ahead and merge the patches (though I suspect most would like to see some traction achieved on a fork first). I think it's important that I (as Windows build manager) make my own position clear, that's all.

(The rest of your email is purely unsubstantiated opinion, which is okay to have, but it doesn't demand any reply so I've omitted it.)

Cheers, Steve
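(A small aside on telling such builds apart: a minimal sketch using the stdlib platform module; the exact strings vary by compiler version and are shown here only as plausible examples, not actual output from any particular build.)

    # Prints something like "MSC v.1600 64 bit (AMD64)" for an MSVC build,
    # or "GCC 4.9.1" for a MinGW-w64 build
    python -c "import platform; print(platform.python_compiler())"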

On Sun, Oct 26, 2014 at 7:50 AM, Steve Dower <Steve.Dower@microsoft.com> wrote:
Ray Donnelly wrote:
What is it that you are afraid of if CPython can be compiled out of the box using mingw/MinGW-w64? Why are you fighting so hard against having this option?
I'm afraid of users having numpy crash because they're using an MSVC CPython instead of a mingw CPython. I'm afraid of users not being able to use library A and library B at the same time because A requires MSVC CPython and B requires mingw CPython. (I can produce more examples if you like, but the general concern is having a fragmented community, as I said in my previous post.)
It might fragment the community to have multiple different binary distributions. But it ought to be possible for any person/organization to say "We're going to make our own build of Python, with these extension modules, built with this compiler, targeting this platform", and do everything from source. That might mean they can no longer take the short-cut of "download someone's MSVC-built extension and use it as-is", but they should be able to take anyone's extension and build it on their chosen compiler. Having MinGW as a formally supported platform would make life a lot easier for people who want to test CPython patches, for instance - my building and testing of PEP 463-enhanced Python was Linux-only, because I didn't want to try to set up an entire new buildchain just to try to get a Windows binary going. There's absolutely no need for that to be binary-compatible with anything else; as long as it'll run the standard library, it'll do. ChrisA

On Sun, 26 Oct 2014 08:11:39 +1100 Chris Angelico <rosuav@gmail.com> wrote:
It might fragment the community to have multiple different binary distributions. But it ought to be possible for any person/organization to say "We're going to make our own build of Python, with these extension modules, built with this compiler, targeting this platform", and do everything from source. That might mean they can no longer take the short-cut of "download someone's MSVC-built extension and use it as-is", but they should be able to take anyone's extension and build it on their chosen compiler. Having MinGW as a formally supported platform would make life a lot easier for people who want to test CPython patches, for instance - my building and testing of PEP 463-enhanced Python was Linux-only,
And how do you know that it would have worked with MSVC if you only use MinGW? If you want to ensure compatibility with MSVC, you must build with MSVC. There's no working around that. Regards Antoine.

On Sun, Oct 26, 2014 at 8:47 AM, Antoine Pitrou <solipsis@pitrou.net> wrote:
And how do you know that it would have worked with MSVC if you only use MinGW? If you want to ensure compatibility with MSVC, you must build with MSVC. There's no working around that.
Precisely. If you build with MinGW, you can't ensure compatibility with MSVC. Reread my post: I gave two examples of situations where that isn't a problem. ChrisA

On Sun, 26 Oct 2014 08:53:29 +1100 Chris Angelico <rosuav@gmail.com> wrote:
On Sun, Oct 26, 2014 at 8:47 AM, Antoine Pitrou <solipsis@pitrou.net> wrote:
And how do you know that it would have worked with MSVC if you only use MinGW? If you want to ensure compatibility with MSVC, you must build with MSVC. There's no working around that.
Precisely. If you build with MinGW, you can't ensure compatibility with MSVC. Reread my post: I gave two examples of situations where that isn't a problem.
How do you know this isn't a problem, since you haven't *tested* with MSVC? Why on Earth would you want to test your PEP work with an unsupported Windows compiler and runtime, rather than with the officially supported compiler and runtime? Regards Antoine.

On Sun, Oct 26, 2014 at 8:59 AM, Antoine Pitrou <solipsis@pitrou.net> wrote:
How do you know this isn't a problem, since you haven't *tested* with MSVC? Why on Earth would you want to test your PEP work with an unsupported Windows compiler and runtime, rather than with the officially supported compiler and runtime?
This discussion revolved around supporting MinGW in addition to MSVC. If it had been supported when I was doing that, I could have spun myself up a Windows build and tested it. Since it was (and so far still is) not, the hassle of hunting down a valid MSVC that could build for Win XP (as that's what my test box runs) was simply not worthwhile. My point is that there is no community fragmentation happening here; the only fragmentation is of binary distribution of extension modules, and there are several ways in which this needn't be a problem. ChrisA

On Sun, 26 Oct 2014 09:06:36 +1100 Chris Angelico <rosuav@gmail.com> wrote:
On Sun, Oct 26, 2014 at 8:59 AM, Antoine Pitrou <solipsis@pitrou.net> wrote:
How do you know this isn't a problem, since you haven't *tested* with MSVC? Why on Earth would you want to test your PEP work with an unsupported Windows compiler and runtime, rather than with the officially supported compiler and runtime?
This discussion revolved around supporting MinGW in addition to MSVC. If it had been supported when I was doing that, I could have spun myself up a Windows build and tested it.
My point is that your "Windows build" would not have the same behaviour as a MSVC-produced Windows build, and so testing it with it would not certify that your code would actually be compatible with genuine MSVC builds of CPython, which we will not stop supporting. Therefore, what you and the OP are proposing would not make it *easier* to ensure cross-platform compatibility but rather *harder*, by adding another incompatible build configuration to the mix of supported configurations. The only remaining question is whether it is worthwhile adding support for such an additional platform, and given that MinGW is extremely marginal amongst Windows developers, the answer is IMHO no. Regards Antoine.

On Sun, Oct 26, 2014 at 9:19 AM, Antoine Pitrou <solipsis@pitrou.net> wrote:
My point is that your "Windows build" would not have the same behaviour as a MSVC-produced Windows build, and so testing it with it would not certify that your code would actually be compatible with genuine MSVC builds of CPython, which we will not stop supporting.
So you're saying it's impossible to support two compilers? ChrisA

On Sun, 26 Oct 2014 09:22:18 +1100 Chris Angelico <rosuav@gmail.com> wrote:
On Sun, Oct 26, 2014 at 9:19 AM, Antoine Pitrou <solipsis@pitrou.net> wrote:
My point is that your "Windows build" would not have the same behaviour as a MSVC-produced Windows build, and so testing it with it would not certify that your code would actually be compatible with genuine MSVC builds of CPython, which we will not stop supporting.
So you're saying it's impossible to support two compilers?
???

On 25 October 2014 23:22, Chris Angelico <rosuav@gmail.com> wrote:
On Sun, Oct 26, 2014 at 9:19 AM, Antoine Pitrou <solipsis@pitrou.net> wrote:
My point is that your "Windows build" would not have the same behaviour as a MSVC-produced Windows build, and so testing it with it would not certify that your code would actually be compatible with genuine MSVC builds of CPython, which we will not stop supporting.
So you're saying it's impossible to support two compilers?
No, rather that Windows currently only has a single supported compiler (excluding cygwin, which is essentially a different OS). Adding a second compiler doesn't just involve adding support for it - which is all that the people offering mingw patches are doing - but also involves going through the whole CPython ecosystem locating the places where there is an implicit assumption that "all Windows builds use the same compiler" and fixing them. I've already pointed out where this is a question for pip and wheel. Whoever wants to add support for a second compiler needs to be willing to do this part of the job as well. Handwaving arguments that "it's binary compatible" aren't enough. Prove it. Paul
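(For reference, these are the compatibility tags a wheel carries today, per PEP 427 and PEP 425; the filename below is an illustrative example, not a real upload. Nothing in the tags records which compiler built either the interpreter or the extension.)

    {distribution}-{version}(-{build tag})?-{python tag}-{abi tag}-{platform tag}.whl
    numpy-1.9.1-cp34-none-win_amd64.whl    # identical tags whether built with MSVC or MinGW-w64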

On Sat, Oct 25, 2014 at 11:44 PM, Paul Moore <p.f.moore@gmail.com> wrote:
On 25 October 2014 23:22, Chris Angelico <rosuav@gmail.com> wrote:
On Sun, Oct 26, 2014 at 9:19 AM, Antoine Pitrou <solipsis@pitrou.net> wrote:
My point is that your "Windows build" would not have the same behaviour as a MSVC-produced Windows build, and so testing it with it would not certify that your code would actually be compatible with genuine MSVC builds of CPython, which we will not stop supporting.
So you're saying it's impossible to support two compilers?
No, rather that Windows currently only has a single supported compiler (excluding cygwin, which is essentially a different OS). Adding a second compiler doesn't just involve adding support for it - which is all that the people offering mingw patches are doing - but also involves going through the whole CPython ecosystem locating the places where there is an implicit assumption that "all Windows builds use the same compiler" and fixing them. I've already pointed out where this is a question for pip and wheel. Whoever wants to add support for a second compiler needs to be willing to do this part of the job as well.
Handwaving arguments that "it's binary compatible" aren't enough. Prove it.
The msvcrt.dlls that MinGW-w64 depends on are those dating back to Windows XP SP3 / XP64. Ironically, the official Windows CPython doesn't come with any such CRT guarantees and you must ensure that the same msvcr??.dll is used for *all* extensions. This puts considerable strain on extension developers to use the correct (or any) version of Visual Studio to build their extensions for CPython on Windows.

Also, where are the publicly accessible specifications and other technical descriptions that MinGW-w64 would need to implement strong binary compatibility with MSVC? As a random example, those for C++ name mangling and the PDB file format would be very helpful.
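(A quick way to see the CRT dependency in practice: a minimal sketch using binutils' objdump, which ships with MinGW-w64, or MSVC's dumpbin; _example.pyd is a placeholder name, not a real module.)

    # List the DLLs an extension links against, including its msvcr??.dll
    objdump -p _example.pyd | grep "DLL Name"

    # Equivalent check with the MSVC tools, if Visual Studio is installed
    dumpbin /dependents _example.pyd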

Ray Donnelly wrote:
On Sat, Oct 25, 2014 at 11:44 PM, Paul Moore <p.f.moore@gmail.com> wrote:
On 25 October 2014 23:22, Chris Angelico <rosuav@gmail.com> wrote:
On Sun, Oct 26, 2014 at 9:19 AM, Antoine Pitrou <solipsis@pitrou.net> wrote:
My point is that your "Windows build" would not have the same behaviour as a MSVC-produced Windows build, and so testing it with it would not certify that your code would actually be compatible with genuine MSVC builds of CPython, which we will not stop supporting.
So you're saying it's impossible to support two compilers?
No, rather that Windows currently only has a single supported compiler (excluding cygwin, which is essentially a different OS). Adding a second compiler doesn't just involve adding support for it - which is all that the people offering mingw patches are doing - but also involves going through the whole CPython ecosystem locating the places where there is an implicit assumption that "all Windows builds use the same compiler" and fixing them. I've already pointed out where this is a question for pip and wheel. Whoever wants to add support for a second compiler needs to be willing to do this part of the job as well.
Handwaving arguments that "it's binary compatible" aren't enough. Prove it.
The msvcrt.dlls that MinGW-w64 depends on are those dating back to Windows XP SP3 / XP64. Ironically, the official Windows CPython doesn't come with any such crt guarantees and you must ensure that the same msvcr??.dll is used for *all* extensions. This puts considerable strain on extension developers to use the correct (or any) version of Visual Studio to build their extensions for CPython on Windows.
We're well aware of this, and it's a big part of why I'm currently migrating CPython to build with VC14, which will not have the same binary compatibility issues. For VC14, the entire CRT has been cleaned up and mostly hidden behind calls into DLLs, so provided the calling conventions match (which they must or everything would crash very quickly), it should be relatively easy to build compatible extensions with MinGW-w64.
Also, where are the publicly accessible specifications and other technical descriptions that MinGW-w64 would need to implement strong binary compatibility with MSVC? As a random example, those for C++ name mangling and the PDB file format would be very helpful.
C++ name mangling is always an implementation detail and it changes from version to version. Luckily, CPython is entirely in C, so that doesn't matter. PDBs are another red herring - you can build a loadable PE file without PDBs. The full source code for the MSVCRT is available with any version of Visual Studio (including the free editions, last time I checked), so feel free to check whatever you need to ensure compatibility. I've suggested to the VC team that they could get in touch with the MinGW projects and offer to help them improve compatibility with MSVC, but unfortunately I don't think anyone will take me up on that. I'm happy to research what I can to answer specific questions, but there's very little that isn't already publicly available other than direct access to the devs. Cheers, Steve

On Sun, Oct 26, 2014 at 1:45 AM, Steve Dower <Steve.Dower@microsoft.com> wrote:
Ray Donnelly wrote:
On Sat, Oct 25, 2014 at 11:44 PM, Paul Moore <p.f.moore@gmail.com> wrote:
On 25 October 2014 23:22, Chris Angelico <rosuav@gmail.com> wrote:
On Sun, Oct 26, 2014 at 9:19 AM, Antoine Pitrou <solipsis@pitrou.net> wrote:
My point is that your "Windows build" would not have the same behaviour as a MSVC-produced Windows build, and so testing it with it would not certify that your code would actually be compatible with genuine MSVC builds of CPython, which we will not stop supporting.
So you're saying it's impossible to support two compilers?
No, rather that Windows currently only has a single supported compiler (excluding cygwin, which is essentially a different OS). Adding a second compiler doesn't just involve adding support for it - which is all that the people offering mingw patches are doing - but also involves going through the whole CPython ecosystem locating the places where there is an implicit assumption that "all Windows builds use the same compiler" and fixing them. I've already pointed out where this is a question for pip and wheel. Whoever wants to add support for a second compiler needs to be willing to do this part of the job as well.
Handwaving arguments that "it's binary compatible" aren't enough. Prove it.
The msvcrt.dlls that MinGW-w64 depends on are those dating back to Windows XP SP3 / XP64. Ironically, the official Windows CPython doesn't come with any such crt guarantees and you must ensure that the same msvcr??.dll is used for *all* extensions. This puts considerable strain on extension developers to use the correct (or any) version of Visual Studio to build their extensions for CPython on Windows.
We're well aware of this, and it's a big part of why I'm currently migrating CPython to build with VC14, which will not have the same binary compatibility issues. For VC14, the entire CRT has been cleaned up and mostly hidden behind calls into DLLs, so provided the calling conventions match (which they must or everything would crash very quickly), it should be relatively easy to build compatible extensions with MinGW-w64.
Compatibility going forwards though, right? Still it's great to see positive steps being made for the future of the Windows platform.
Also, where are the publicly accessible specifications and other technical descriptions that MinGW-w64 would need to implement strong binary compatibility with MSVC? As a random example, those for C++ name mangling and the PDB file format would be very helpful.
C++ name mangling is always an implementation detail and it changes from version to version. Luckily, CPython is entirely in C, so that doesn't matter. PDBs are another red herring - you can build a loadable PE file without PDBs.
Of course C++ can be called from C and that is done in some CPython extensions, so it's not a red herring. If we want to talk about strong binary compatibility I'd expect the aim would be to intermix freely between compilers. We'd like people to be able to debug MinGW-w64 code using CDB in Visual Studio if they want to, and on the flipside, to have GDB able to read PDB files built by MSVC (actually there's a long standing problem when debugging MinGW-w64 code in GDB that stack unwinding out of MS built dlls is flaky at best) - so again this is not really a red herring. I'm also led to believe that MSVC has a very good optimizer, so if some project wanted to build certain libraries or objects with that for their performance-critical paths, then I can see that as being useful to those projects and their users.
The full source code for the MSVCRT is available with any version of Visual Studio (including the free editions, last time I checked), so feel free to check whatever you need to ensure compatibility. I've suggested to the VC team that they could get in touch with the MinGW projects and offer to help them improve compatibility with MSVC, but unfortunately I don't think anyone will take me up on that. I'm happy to research what I can to answer specific questions, but there's very little that isn't already publicly available other than direct access to the devs.
Under what license? We'd rather have open specifications than copyrighted, strictly licensed code that we can't look at for various tainting reasons.

Ray Donnelly wrote:
On Sun, Oct 26, 2014 at 1:45 AM, Steve Dower <Steve.Dower@microsoft.com> wrote:
Ray Donnelly wrote:
Also, where are the publicly accessible specifications and other technical descriptions that MinGW-w64 would need to implement strong binary compatibility with MSVC? As a random example, those for C++ name mangling and the PDB file format would be very helpful.
C++ name mangling is always an implementation detail and it changes from version to version. Luckily, CPython is entirely in C, so that doesn't matter. PDBs are another red herring - you can build a loadable PE file without PDBs.
Of course C++ can be called from C and that is done in some CPython extensions, so it's not a red herring. If we want to talk about strong binary compatibility I'd expect the aim would be to intermix freely between compilers. We'd like people to be able to debug MinGW-w64 code using CDB in Visual Studio if they want to, and on the flipside, to have GDB able to read PDB files built by MSVC (actually there's a long standing problem when debugging MinGW-w64 code in GDB that stack unwinding out of MS built dlls is flaky at best) - so again this is not really a red herring. I'm also led to believe that MSVC has a very good optimizer so if some project wanted to build certain libraries or objects with that for their performance critical paths then I can see that as being useful to those projects and their users'.
Binary compatibility that strong is very unlikely to ever happen, and certainly not with versions of compilers that are being actively developed. It would be far too restrictive to both development teams. The weaker compatibility of C DLL boundaries is far more achievable - we already mostly have it, as evidenced by some Python packages working correctly with mismatched compilers. Soon the CRT will be isolated along the same boundaries, which is short-term pain for long-term gain.
The full source code for the MSVCRT is available with any version of Visual Studio (including the free editions, last time I checked), so feel free to check whatever you need to ensure compatibility. I've suggested to the VC team that they could get in touch with the MinGW projects and offer to help them improve compatibility with MSVC, but unfortunately I don't think anyone will take me up on that. I'm happy to research what I can to answer specific questions, but there's very little that isn't already publicly available other than direct access to the devs.
Under what license? We'd rather have open specifications than copyrighted, strictly licensed code that we can't look at for various tainting reasons.
As far as I can tell, it's covered by the Visual Studio license, which basically means you can't redistribute the files (I'm not a lawyer, but I've spent plenty of time talking about licenses to lawyers... not sure how much that counts for :) ). Most closed-source Microsoft code is not released under open-source-like licenses, so there's no concept of derivative work, attribution or reciprocation, and that's what appears to cover the CRT sources. "Using" the sources probably counts as using VS, which may trigger some non-commercial clauses if you've got the free version (but probably not the 30 day trial of the paid version... licenses are weird), but reading them is well within the granted permissions. The intention of including the sources is to help people with debugging... I don't think it's even possible to rebuild the CRT from them. I do understand the taint concerns though - until recently, I was operating under rules that made even some documentation "unsafe"...
Cheers, Steve
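(As a sketch of the C-DLL-boundary compatibility Steve describes above - assuming MinGW-w64 on Windows, an MSVC-built CPython 3.4 in C:\Python34, and a hypothetical one-file extension _demo.c; details such as the import library vary by toolchain, and some setups first need to generate libpython34.a with gendef and dlltool.)

    # Build the extension with MinGW-w64 against the MSVC-built CPython headers and libs
    x86_64-w64-mingw32-gcc -shared -I C:/Python34/include _demo.c -L C:/Python34/libs -lpython34 -o _demo.pyd

    # Import it with the MSVC-built interpreter; only the C ABI (PyObject*, PyMethodDef, ...)
    # is crossed at the module boundary, which is why this can work at all
    C:/Python34/python.exe -c "import _demo"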

On Sun, 26 Oct 2014 00:19:44 +0200, Antoine Pitrou <solipsis@pitrou.net> wrote:
On Sun, 26 Oct 2014 09:06:36 +1100 Chris Angelico <rosuav@gmail.com> wrote:
On Sun, Oct 26, 2014 at 8:59 AM, Antoine Pitrou <solipsis@pitrou.net> wrote:
How do you know this isn't a problem, since you haven't *tested* with MSVC? Why on Earth would you want to test your PEP work with an unsupported Windows compiler and runtime, rather than with the officially supported compiler and runtime?
This discussion revolved around supporting MinGW in addition to MSVC. If it had been supported when I was doing that, I could have spun myself up a Windows build and tested it.
My point is that your "Windows build" would not have the same behaviour as a MSVC-produced Windows build, and so testing it with it would not certify that your code would actually be compatible with genuine MSVC builds of CPython, which we will not stop supporting.
While true, I don't think that matters for Chris' point. Given only the ability to build with the MSVC toolchain, his code (which might even be pure python for the purposes of this discussion) would not get tested on Windows until committed and run by the buildbots. If he could build CPython using MinGW, he would, and would test his code on Windows. Even if there are C components and MSVC/MinGW compatibility issues are revealed when the buildbots eventually run the code, still the number of bugs present would probably be lower if he had tested it on Windows than if he hadn't.

I know I for one do not generally test patches on Windows because I haven't taken the time to learn how to build CPython on it. Sure, I could test pure python changes by applying patches to an installed Python, but that's an ongoing pain and I'd rather learn to build CPython on Windows and get to use the normal hg tools. If I could use a more linux-like toolchain to build CPython on windows, I would doubtless do much more testing on windows for stuff where I think windows might behave differently (and I might look at more Windows bugs...though frankly there are plenty of bugs for me to look at without looking at Windows bugs).

This is not necessarily a compelling argument for MinGW support. However, it *is* a valid argument, IMO.

Note: it can be made even less compelling by making it a lot easier to build CPython on Windows without having an MSVC license (which I think means not using the GUI, for which I say *yay* :). I think Zach Ware has been working on improving the Windows build process, and I keep meaning to give it a try...

--David

On Sat, 25 Oct 2014 19:24:38 -0400 "R. David Murray" <rdmurray@bitdance.com> wrote:
I know I for one do not generally test patches on Windows because I haven't taken the time to learn how to build CPython on it. Sure, I could test pure python changes by applying patches to an installed Python, but that's an ongoing pain and I'd rather learn to build CPython on Windows and get to use the normal hg tools.
If I could use a more linux-like toolchain to build CPython on windows,
Well, I don't know how "linux-like" you want your toolchain, but FTR you should be able to build from the command line by running "Tools\buildbot\build.bat". I doubt the MinGW case can be much simpler :-) Regards Antoine.

On Sun, Oct 26, 2014 at 12:30 AM, Antoine Pitrou <solipsis@pitrou.net> wrote:
On Sat, 25 Oct 2014 19:24:38 -0400 "R. David Murray" <rdmurray@bitdance.com> wrote:
I know I for one do not generally test patches on Windows because I haven't taken the time to learn how to build CPython on it. Sure, I could test pure python changes by applying patches to an installed Python, but that's an ongoing pain and I'd rather learn to build CPython on Windows and get to use the normal hg tools.
If I could use a more linux-like toolchain to build CPython on windows,
Well, I don't know how "linux-like" you want your toolchain, but FTR you should be able to build from the command line by running "Tools\buildbot\build.bat".
I doubt the MinGW case can be much simpler :-)
I beg to differ: "Tools\buildbot\build.bat" contains:

    @rem Used by the buildbot "compile" step.
    cmd /c Tools\buildbot\external.bat
    call "%VS100COMNTOOLS%vsvars32.bat"
    cmd /c Tools\buildbot\clean.bat
    msbuild PCbuild\pcbuild.sln /p:Configuration=Debug /p:Platform=Win32

^ this involves purchasing and installing MS Visual Studio (I'm not sure if the Express Edition can be used).

Without explanations for what each step is doing (I posted those last week), on MSYS2:

Download and run:
http://sourceforge.net/projects/msys2/files/Base/x86_64/msys2-x86_64-2014100...

From the MSYS2 shell:

    pacman -S git base-devel mingw-w64-x86_64-toolchain mingw-w64-i686-toolchain
    git clone https://github.com/Alexpux/MINGW-packages
    cd MINGW-packages/mingw-w64-python3
    makepkg-mingw -sL --nocheck

(remove --nocheck to run the testsuite. Also you could put this into a batch file if you were so inclined.)

To install the newly built packages, from the MSYS2 shell again:

    pacman -U mingw-w64-*.xz

To run them, you should add /mingw64/bin or /mingw32/bin to your PATH (or launch a new shell via mingw32_shell.bat or mingw64_shell.bat). Of course, if you don't want to build it from source you can simply issue:

    pacman -S mingw-w64-python3

.. all of the above applies equally to mingw-w64-python2.

On Sat, Oct 25, 2014 at 7:05 PM, Ray Donnelly <mingw.android@gmail.com> wrote:
On Sun, Oct 26, 2014 at 12:30 AM, Antoine Pitrou <solipsis@pitrou.net> wrote:
On Sat, 25 Oct 2014 19:24:38 -0400 "R. David Murray" <rdmurray@bitdance.com> wrote:
I know I for one do not generally test patches on Windows because I haven't taken the time to learn how to build CPython on it. Sure, I could test pure python changes by applying patches to an installed Python, but that's an ongoing pain and I'd rather learn to build CPython on Windows and get to use the normal hg tools.
If I could use a more linux-like toolchain to build CPython on windows,
Well, I don't know how "linux-like" you want your toolchain, but FTR you should be able to build from the command line by running "Tools\buildbot\build.bat".
I doubt the MinGW case can be much simpler :-)
I beg to differ:
"Tools\buildbot\build.bat" contains: @rem Used by the buildbot "compile" step. cmd /c Tools\buildbot\external.bat call "%VS100COMNTOOLS%vsvars32.bat" cmd /c Tools\buildbot\clean.bat msbuild PCbuild\pcbuild.sln /p:Configuration=Debug /p:Platform=Win32
^ this involves purchasing and installing MS Visual Studio (I'm not sure if the Express Edition can be used).
The Express Edition is fine for 32-bit builds. PCbuild\readme.txt has full details on which editions are needed for what, and in 3.5 also has a "quick start guide" (absent from older versions due to a rewriting of the batch scripts that I did a while back):

1. Install Microsoft Visual C++ 2010 SP1, any edition.
2. Install Subversion, and make sure 'svn.exe' is on your PATH.
3. Install NASM, and make sure 'nasm.exe' is on your PATH.
4. Run "build.bat -e" to build Python in 32-bit Release configuration.
5. (Optional, but recommended) Run the test suite with "rt.bat -q".

And really, you can skip step 5 if you don't want to run the tests; you can skip step 3 if you don't want/need ssl; and you can skip step 2 if you don't want/need bz2, lzma, sqlite3, ssl, or tkinter. Skipping steps 2 or 3 will cause a lot of angry red text in your build output, but I hope to solve that problem eventually.
Without explanations for what each step is doing (I posted those last week), on MSYS2:

Download and run:
http://sourceforge.net/projects/msys2/files/Base/x86_64/msys2-x86_64-2014100...

From the MSYS2 shell:

    pacman -S git base-devel mingw-w64-x86_64-toolchain mingw-w64-i686-toolchain
    git clone https://github.com/Alexpux/MINGW-packages
    cd MINGW-packages/mingw-w64-python3
    makepkg-mingw -sL --nocheck

(remove --nocheck to run the testsuite. Also you could put this into a batch file if you were so inclined.)
To install the newly built packages, from the MSYS2 shell again: pacman -U mingw-w64-*.xz
To run them, you should add /mingw64/bin or /mingw32/bin to your PATH (or launch a new shell via mingw32_shell.bat or mingw64_shell.bat) Of course, if you don't want to build it from source you can simply issue: pacman -S mingw-w64-python3
.. all of the above applies equally to mingw-w64-python2.
I'm failing to see where that's simpler :) For the record, on the issue at hand, I agree that any effort should go toward making it possible to build extensions without MSVC. Once that problem has been completely solved, then we can consider (long and hard) building Python itself with something else. I personally have not been convinced that there's any good reason to do so, but I'm not unconvincible :) -- Zach


On 26/10/2014 00:24, R. David Murray wrote:
On Sun, 26 Oct 2014 00:19:44 +0200, Antoine Pitrou <solipsis@pitrou.net> wrote:
On Sun, 26 Oct 2014 09:06:36 +1100 Chris Angelico <rosuav@gmail.com> wrote:
On Sun, Oct 26, 2014 at 8:59 AM, Antoine Pitrou <solipsis@pitrou.net> wrote:
How do you know this isn't a problem, since you haven't *tested* with MSVC? Why on Earth would you want to test your PEP work with an unsupported Windows compiler and runtime, rather than with the officially supported compiler and runtime?
This discussion revolved around supporting MinGW in addition to MSVC. If it had been supported when I was doing that, I could have spun myself up a Windows build and tested it.
My point is that your "Windows build" would not have the same behaviour as a MSVC-produced Windows build, and so testing it with it would not certify that your code would actually be compatible with genuine MSVC builds of CPython, which we will not stop supporting.
While true, I don't think that matters for Chris' point. Given only the ability to build with the MSVC toolchain, his code (which might even be pure python for the purposes of this discussion) would not get tested on Windows until committed and run by the buildbots. If he could build CPython using MinGW, he would, and would test his code on Windows. Even if there are C components and MSVC/MinGW compatibility issues are revealed when the buildbots eventually run the code, still the number of bugs present would probably be lower if he had tested it on Windows than if he hadn't.
I know I for one do not generally test patches on Windows because I haven't taken the time to learn how to build CPython on it. Sure, I could test pure python changes by applying patches to an installed Python, but that's an ongoing pain and I'd rather learn to build CPython on Windows and get to use the normal hg tools.
If I could use a more linux-like toolchain to build CPython on windows, I would doubtless do much more testing on windows for stuff where I think windows might behave differently (and I might look at more Windows bugs...though frankly there are plenty of bugs for me to look at without looking at Windows bugs).
This is not necessarily a compelling argument for MinGW support. However, it *is* a valid argument, IMO.
Note: it can be made even less compelling by making it a lot easier to build CPython on Windows without having an MSVC license (which I think means not using the GUI, for which I say *yay* :). I think Zach Ware has been working on improving the Windows build process, and I keep meaning to give it a try...
--David
MSVC Express Edition 2010 works perfectly for building 3.5 so no license needed. Links to older versions have been pointed out on other threads, either here or python-ideas, maybe both? Or use the command line as Antoine pointed out elsewhere. -- My fellow Pythonistas, ask not what our language can do for you, ask what you can do for our language. Mark Lawrence

On Sat, Oct 25, 2014 at 6:24 PM, R. David Murray <rdmurray@bitdance.com> wrote:
Note: it can be made even less compelling by making it a lot easier to build CPython on Windows without having an MSVC license (which I think means not using the GUI, for which I say *yay* :). I think Zach Ware has been working on improving the Windows build process, and I keep meaning to give it a try...
So far my improvements have been limited to what it takes to build after installing prerequisites (and documenting what exactly those prerequisites are), but on that front things are significantly better in 3.5, I think. I will note that it's been possible to build Python entirely without using the VS GUI (though it still has to be installed, I think) for quite some time, but hasn't been especially easy to remember the incantations to do so. 3.5 now has a fairly nice set of batch scripts (I think; but I (re)wrote them :) that work well together and are even documented in the PCbuild readme. I've had dreams of a set of configure.bat/make.bat scripts (issue16895) to make things even simpler, but I've put that on hold until after Steve Dower's major overhaul for VC14 has happened. One thing I'd like to look into more eventually is seeing what it would take to build with only the Windows SDK installed (which is free and has no GUI at all), but I think Steve has mentioned something about that in connection with the work he's doing -- either way, things will be different when we switch to VC14 so I'm also putting that off. -- Zach

R. David Murray writes:
On Sun, 26 Oct 2014 00:19:44 +0200, Antoine Pitrou <solipsis@pitrou.net> wrote:
My point is that your "Windows build" would not have the same behaviour as a MSVC-produced Windows build, and so testing it with it would not certify that your code would actually be compatible with genuine MSVC builds of CPython, which we will not stop supporting.
While true, I don't think that matters for Chris' point.
[...]
If I could use a more linux-like toolchain to build CPython on windows, I would doubtless do much more testing on windows for stuff where I think windows might behave differently (and I might look at more Windows bugs...though frankly there are plenty of bugs for me to look at without looking at Windows bugs).
This is not necessarily a compelling argument for MinGW support. However, it *is* a valid argument, IMO.
Nobody claims that there are not arguments, even compelling arguments, for MinGW support (more generally, support for alternative toolchains). But there are *also* compelling arguments for *supporting* *both* those "no need to worry about mixed ABIs" situations and *mixed* situations. And that becomes Python Dev's problem if the patches are added to core Python. Currently, they're somebody else's problem, and that's as it should be at this stage.

Python is open source. Nobody is objecting to "somebody else" doing this.[1] The problem here is simply that some "somebody elses" are trying to throw future work over the wall into python-dev space. There is nothing wrong with that, either -- that's why there is a stdlib, for example -- but the python-dev concerns about platform fragmentation are genuine (even if not applicable to all potential users of the alternative toolchains), and substantial resources will be needed to do the testing required to meet python-dev's requirement that such code be *binary* compatible with other binaries downloaded for Windows, as well as for maintenance of the code itself.

Footnotes:
[1] Some *do* question whether there's a need for anybody to do this, and that's bogus. "I just wanna" is good enough reason to do it. The issue here is that it's not good enough reason for python-dev to do the support and maintenance going forward.

On 10/25/2014 5:11 PM, Chris Angelico wrote:
It might fragment the community to have multiple different binary distributions. But it ought to be possible for any person/organization to say "We're going to make our own build of Python, with these extension modules, built with this compiler, targeting this platform", and do everything from source. That might mean they can no longer take the short-cut of "download someone's MSVC-built extension and use it as-is", but they should be able to take anyone's extension and build it on their chosen compiler. Having MinGW as a formally supported platform would make life a lot easier for people who want to test CPython patches, for instance - my building and testing of PEP 463-enhanced Python was Linux-only, because I didn't want to try to set up an entire new buildchain just to try to get a Windows binary going. There's absolutely no need for that to be binary-compatible with anything else; as long as it'll run the standard library, it'll do.
David Murray's unanswered post laid out the path to move in the direction you want. Either take it yourself or try to persuade other MinGW fans to do so. -- Terry Jan Reedy

On 25 October 2014 21:50, Steve Dower <Steve.Dower@microsoft.com> wrote:
Ray Donnelly wrote:
What is it that you are afraid of if CPython can be compiled out of the box using mingw/MinGW-w64? Why are you fighting so hard against having options?
I'm afraid of users having numpy crash because they're using an MSVC CPython instead of a mingw CPython. I'm afraid of users not being able to use library A and library B at the same time because A requires MSVC CPython and B requires mingw CPython. (I can produce more examples if you like, but the general concern is having a fragmented community, as I said in my previous post.)
Precisely. Either developers test with *all* supported compilers, or there is a risk of incompatibilities. Advocates of supporting mingw as a build system for Python generally do *not* suggest that they are willing to test for, and deal with, cross-version compatibility issues. Typically mingw is seen as "another platform" in some sense, by such advocates, having its own set of supporters and maintainers. The possibility of extensions built with a mingw-compiled Python failing when used under a MSVC-built Python is real. It's difficult to say how big that risk is, but it's certainly there. And I see no-one offering to be responsible for such compatibility issues (the mingw supporters generally don't want to set up a MSVC build chain, so it's difficult to see how they could credibly offer).
I'm fighting against "having options" because it will suck up the precious volunteer time we have and direct it away from where it would be more useful, which is making it easier to build extensions with other compilers.
And claiming that it doesn't suck up developer time ignores the point I made above - *someone* has to deal with any compatibility issues that come up. As a starter, does the wheel format need to include tags to distinguish whether the target Python is MSVC-built or mingw-built? Who will check that? If there is a need, who will work on the code needed to incorporate that change into wheel, pip, and the relevant PEPs?

As Steve says, the Python community has a genuine, strong need for people with mingw expertise working on making it easier to build *extensions* using mingw that work with a MSVC-built CPython. If it were possible to cross-compile compatible extensions on Linux, projects developed on Linux could supply Windows binaries much more easily, which would be a huge benefit to the whole Windows Python community. But the mingw experts don't want to work on that, preferring to develop patches allowing CPython to be built with mingw. No objection from me, it's your free time, use it as you wish, but as a Windows user of Python I can confirm that it's not what I'd like you to be doing as your contribution to Python.
I would love to see extensions for Windows built on all platforms. I see no value in building Python itself for Windows from different platforms.
Exactly.
If other core developers agree with you that a more "pure" build of Python is worthwhile, then they can go ahead and merge the patches (though I suspect most would like to see some traction achieved on a fork first). I think it's important that I (as Windows build manager) make my own position clear, that's all.
Personally, I'm not a core developer, just a long-time member of this list and occasional contributor to discussions. But I am also a core pip developer and a Windows user, and from that perspective I am acutely aware of the common pain points for Python users on Windows. And they are all about binary extensions, and none at all about building Python itself. So in my view, being able to build CPython using mingw is somewhat interesting from a theoretical perspective, but of little or no practical value[1] and disruptive in a number of ways, as mentioned above, to improving the overall experience of Python users on Windows.

Paul

[1] I note, without trying to make a judgement, that many of the benefits cited for building with mingw are around "being able to use free tools" or similar essentially ideological issues.

On Sat, 25 Oct 2014 21:10:23 +0100 Ray Donnelly <mingw.android@gmail.com> wrote:
This is the second time you've used the vacuous "culture on Windows" argument, now with an added appeal to (vague) authority.
[...]
Why are you fighting so hard against having options? If CPython wants to truly call itself an Open Source project then I consider being able to compile and cross-compile it with capable Open Source toolchains on all major platforms a requirement.
Now *that* sounds vacuous. Regarding open source, there's a clear and official definition of it, which Python satisfies: http://opensource.org/osd-annotated

Regards
Antoine.

On Sat, Oct 25, 2014 at 10:52 PM, Antoine Pitrou <solipsis@pitrou.net> wrote:
On Sat, 25 Oct 2014 21:10:23 +0100 Ray Donnelly <mingw.android@gmail.com> wrote:
This is the second time you've used the vacuous "culture on Windows" argument, now with an added appeal to (vague) authority.
[...]
Why are you fighting so hard against having options? If CPython wants to truly call itself an Open Source project then I consider being able to compile and cross-compile it with capable Open Source toolchains on all major platforms a requirement.
Now *that* sounds vacuous.
Maybe you missed where I said "I consider".
Regarding open source, there's a clear and official definition of it, which Python satisfies: http://opensource.org/osd-annotated
Regards
Antoine.

On Sat, 25 Oct 2014 21:10:23 +0100, Ray Donnelly <mingw.android@gmail.com> wrote:
On Sat, Oct 25, 2014 at 6:13 PM, Steve Dower <Steve.Dower@microsoft.com> wrote:
(Apologies for the short reply, posting from my phone.)
"MSVC can continue to be the default compiler used for Python on Windows, none of Roumen's patches change that. They would merely open up the choice for packagers and users to build CPython (and extension modules, thanks to separate patches) with alternate compilers, in cross-compilation or otherwise."
Building CPython for Windows is not something that needs solving. The culture on Windows is to redistribute binaries, not source, and both the core team and a number of redistributors have this figured out (and it will only become easier with VC14 and Python 3.5).
This is the second time you've used the vacuous "culture on Windows" argument, now with an added appeal to (vague) authority. That may be your opinion and that of some others, but there's a large number of people who don't care for using non-Free tools. IMHO building CPython on Windows using Open Source toolchains is very much something that needs merging upstream and supporting by default. What is it that you are afraid of if CPython can be compiled out of the box using mingw/MinGW-w64? Why are you fighting so hard against having options? If CPython wants to truly call itself an Open Source project then I consider being able to compile and cross-compile it with capable Open Source toolchains on all major platforms a requirement.
You are doing yourself a disservice by this last statement. There really can't be any question that Python is an open source project, so insinuating that the CPython community is "doing something wrong" is not going to win you friends and helpers. A better approach would be to acknowledge that what we are currently doing works well for supporting Windows (especially since we actually have some engagement from Microsoft that is *getting some problems fixed* in ways that help make things more open). And then say, "wouldn't it be *really cool* if we could also build CPython using an open source toolchain on Windows out of the box?". You might not get instant agreement on that (well, clearly you won't), but it'd be much more likely you'd start garnering support. Assume that people are well intentioned, and convince them your suggestions will make things *even better* using positive arguments. You might not succeed, but you'll have a much better chance. --David

On Sat, Oct 25, 2014 at 1:10 PM, Ray Donnelly <mingw.android@gmail.com> wrote:
On Sat, Oct 25, 2014 at 6:13 PM, Steve Dower <Steve.Dower@microsoft.com> wrote:
Building CPython for Windows is not something that needs solving. The culture on Windows is to redistribute binaries, not source, and both the core team and a number of redistributors have this figured out (and it will only become easier with VC14 and Python 3.5).
This is the second time you've used the vacuous "culture on Windows" argument, now with an added appeal to (vague) authority. That may be your opinion and that of some others, but there's a large number of people who don't care for using non-Free tools. IMHO building CPython on Windows using Open Source toolchains is very much something that needs merging upstream and supporting by default. What is it that you are afraid of if CPython can be compiled out of the box using mingw/MinGW-w64? Why are you fighting so hard against having options? If CPython wants to truly call itself an Open Source project then I consider being able to compile and cross-compile it with capable Open Source toolchains on all major platforms a requirement.
Please stop this ridiculous argument. There's no definition of "truly open source project" that has such a requirement, and if you took it to the extreme you should not be using Windows at all. I appreciate your concern that building Python for your favorite platform using your favorite toolchain doesn't work, and if you have patches (or even bug reports) those are appreciated. But please take your rhetoric about open source elsewhere.
I'd rather see this effort thrown behind compiling extensions, including cross compilation. The ABI is well defined enough that any compiler should be usable, especially once the new CRT is in use. However, there is work needed to update the various tool chains to link to VC14's CRT and we need to figure out the inconsistencies between tools so we can document and work through them.
Having different builds of CPython out there will only fragment the community and hurt extension authors far more than it may seem to help.
Here's the crux of the matter. We want compiled extension modules distributed via PyPI to work with the binaries distributed from python.org.

--Guido van Rossum (python.org/~guido)

Thanks all for the responses. Clearly this is a subject about which people feel strongly, so that's good at least. David Murray's guidance in particular points to the most likely path to get improvements to really happen. Steve Dower:
Building CPython for Windows is not something that needs solving.
Not in your opinion, but numerous packagers of MinGW-based native or cross-compiled package sets would love to include Python. The fact that they currently can't, without many patches, is a problem.
The culture on Windows is to redistribute binaries, not source,
There are many cultures using Windows. Including open-source ones.
and both the core team and a number of redistributors have this figured out (and it will only become easier with VC14 and Python 3.5).
With MSVC. It doesn't work with MinGW, it likely doesn't work with Clang. MSVC is not the only compiler on Windows. There are many use cases for preferring other compilers. Have you read this wiki page for example? https://github.com/numpy/numpy/wiki/Numerical-software-on-Windows

In my personal experience, having recently gotten Julia to compile using MSVC for the first time, MSVC as a compiler is highly deficient for many needs especially in the scientific software community:
- C99 (getting better recently, but still not done)
- AT&T syntax assembly
- C++11 features (also mostly okay now, but not if you're using an older MSVC version with Python 2.7, which many people still have to do)
- 128-bit integer intrinsics
- cannot cross-compile from anything that isn't Windows
- build systems foreign relative to shell/makefile systems used by most open-source projects, few projects have time to maintain 2 separate build systems (cmake helps but takes a lot of initial effort to convert to)
- no free-as-in-beer Fortran compiler available

I have none of these problems when I use MinGW-w64. Hence the desire to be able to curate an all-MinGW software stack. It's not a matter of open-source ideology for me, it's brass tacks "can I do the work I need to do." With MSVC I can't, with MinGW-w64 I can. Not being able to include CPython in an all-MinGW stack hurts, a lot.

Only cross-compilation and the build system in the above list are relevant to CPython, but I hope I have convinced you, Paul Moore, etc. that there are real reasons for some groups of users and developers to prefer MinGW-w64 over MSVC.
I'd rather see this effort thrown behind compiling extensions, including cross compilation.
There are patches awaiting review that improve this as well. Efforts to improve CPython's build system and the handling of extensions are not completely independent, in many cases the patches are written by the same set of MinGW users. One of these sets of patches is not inherently evil, you understandably have less interest in them but it's still disappointing to see so little movement on either.
Having different builds of CPython out there will only fragment the community and hurt extension authors far more than it may seem to help.
The community of people developing and using open-source projects, either CPython or otherwise, is already highly fragmented. Ignoring it makes it worse. python.org does not have to distribute or endorse MinGW-compiled builds of CPython. If the build option never gets incorporated, then it will continue to be reverse-engineered. Guido van Rossum:
Here's the crux of the matter. We want compiled extension modules distributed via PyPI to work with the binaries distributed from python.org.
Absolutely. I don't think additional options in the build system would change this. R. David Murray:
And, at this point, we would NEED A BUILDBOT. That is, a machine that has whatever tools are required installed such that tests added to the test suite to test MinGW support in distutils would run, so we can be sure we don't break anything when making other changes.
That's not too hard. I've done this for other projects. AppVeyor works if your build is short enough, and I've done cross-compilation from Travis CI for other projects. Or Jenkins, or a Vagrant VM. I don't know PSF's infrastructure, but I can offer guidance if it would help. Steve Dower:
I'm afraid of users having numpy crash because they're using an MSVC CPython instead of a mingw CPython. I'm afraid of users not being able to use library A and library B at the same time because A requires MSVC CPython and B requires mingw CPython. (I can produce more examples if you like, but the general concern is having a fragmented community, as I said in my previous post.)
A valid fear. Mixing C runtimes can cause problems, I've seen this myself. Correct me if I'm wrong, but this is nearly as much of an issue if someone wants to use a different version of MSVC to compile CPython than the version used to build the official binaries. It requires care, but you can't deny that there are use cases where people will want and need to do such things. Is possible fragmentation a good enough reason to resist making it possible in the build system?
though I suspect most would like to see some traction achieved on a fork first
Those of us who consider this important should probably just do this. Ray, Roumen, the maintainer of the Arch MinGW packages, myself and others could look into making an actual fork on Github or Bitbucket where we merge the various patches and come up with an out-of-the-box MinGW-[cross-]compilable version of CPython. I'll happily write the spec files to get this building from Fedora or openSUSE. That would help us test the feasibility from a centralized repository. Ray, what do you think? Do you know xantares' email address to ask if he'd be interested in helping or using the result? Zachary Ware:
I'm failing to see where that's simpler :)
If it were hypothetically merged instead of out in an external fork, it could be ./configure --host=x86_64-w64-mingw32 && make to cross-compile from Linux or Cygwin, or just ./configure && make after installing MSYS2 (which is just about as easy as installing MSVC) on Windows. Paul Moore:
If it were possible to cross-compile compatible extensions on Linux, projects developed on Linux could supply Windows binaries much more easily, which would be a huge benefit to the whole Windows Python community.
I want to do exactly this in an automated repeatable way, preferably on a build service. This seems harder to do when CPython cannot itself be built and handled as a dependency by that same automated, repeatable build service. Unless it becomes possible to cross-compile extensions using the build machine's own version of Python, which might be the right approach.
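To make that concrete: assuming the MinGW patches were merged (today this only works on the patched trees mentioned earlier), the automated cross-build on a Debian/Ubuntu worker would look roughly like the sketch below. The package name and target triplet are the standard Debian/MinGW-w64 ones:

    # install the MinGW-w64 cross-compilers
    sudo apt-get install mingw-w64

    # cross-compile CPython for 64-bit Windows; a native build of the same
    # Python version is generally needed on the build machine to run the
    # code-generation steps during the cross build
    ./configure --host=x86_64-w64-mingw32 --build=x86_64-linux-gnu
    make

The idea would then be to cross-compile extensions against that tree in the same job.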
acutely aware of the common pain points for Python users on Windows. And they are all about binary extensions, and none at all about building Python itself.
I've done a lot of recent work keeping Julia working well on Windows, and the interoperability we have with Python packages has propagated most of these pain points to us as well. We have to rely on Conda in order to have a reliable way of installing, as an example, IPython with the notebook interface, in order for IJulia to work. This is not an ideal solution as it requires a great deal of user intervention and manual steps to get up and running (and it would be far worse without Conda). We are, so far, built around MinGW-w64 on Windows, for the reasons I listed above. Having cross-compiled builds of CPython and binary extensions available from the same build services we already use to install other binary packages (Gtk, Cairo, Tk, Nettle, HDF5, etc) on Windows would be enormously helpful for us.

There's a real use case. Its size and importance can be debated. For now I'll take David Murray's post to heart and see where I have time or ability to help things along.

Sincerely, Tony

On Sun, Oct 26, 2014 at 1:12 PM, Tony Kelman <kelman@berkeley.edu> wrote:
[...]
though I suspect most would like to see some traction achieved on a fork first
Those of us who consider this important should probably just do this. Ray, Roumen, the maintainer of the Arch MinGW packages, myself and others could look into making an actual fork on Github or Bitbucket where we merge the various patches and come up with an out-of-the-box MinGW-[cross-]compilable version of CPython. I'll happily write the spec files to get this building from Fedora or openSUSE. That would help us test the feasibility from a centralized repository. Ray, what do you think? Do you know xantares' email address to ask if he'd be interested in helping or using the result?
I like this idea. To reduce the workload, we should probably pick Python3 (at least initially)?

I have collaborated with xantares on the ArchLinux AUR mingw-w64-python2 package (whom I've bcc'ed). In fact, as of now, our patches are exactly the same except ArchLinux is missing one new patch. Looking at https://aur.archlinux.org/packages/mingw-w64-python2/ it seems xantares handed over maintainer-ship to Dr Shadow. I've left a comment asking for the new maintainer to email me.

If we pick Python3 instead of 2 then bringing up an ArchLinux AUR package for that would be my next course of action. Cross-compilation of mingw-w64-python3 will no doubt need some fixes as I've not done it for a while.

Ideally, we'd hook this repository up to as complete a CI system as possible and introduce each patch one at a time so that any and every breakage or regression on the currently supported systems gets fixed immediately. Also having reviews from some core Python developers (if we can get motivated supporters from that group) would be immensely helpful. My fear is that without such core involvement, the attempt to upstream the final patch-set would be overwhelming.

On Sun, Oct 26, 2014 at 2:28 PM, Ray Donnelly <mingw.android@gmail.com> wrote:
[...]
I like this idea. To reduce the workload, we should probably pick Python3 (at least initially)?
I have collaborated with xantares on the ArchLinux AUR mingw-w64-python2 package (whom I've bcc'ed). In fact, as of now, our patches are exactly the same except ArchLinux is missing one new patch. Looking at https://aur.archlinux.org/packages/mingw-w64-python2/ it seems xantares handed over maintainer-ship to Dr Shadow. I've left a comment asking for the new maintainer to email me.
If we pick Python3 instead of 2 then bringing up an ArchLinux AUR package for that would be my next course of action. Cross-compilation of mingw-w64-python3 will no doubt need some fixes as I've not done it for a while.
It seems that Dr Shadow has already done this: https://aur.archlinux.org/packages/mingw-w64-python/ and, happily, the used patches are the exact same ones as used on MSYS2.

If this includes (or would likely include) a significant portion of the Scientific Computing community, I would think that would be a compelling use case.
I can't speak for any of the scientific computing community besides myself, but my thoughts: much of the development, as you know, happens on Linux with GCC (or OSX with clang). But it's important for users across all platforms to be able to install binaries with a minimum of fuss. Limitations of MSVC have already led the numpy/scipy community to investigate building with MinGW-w64. (See several related threads from April on the numpy-discussion list.) Ensuring compatibility with CPython's chosen msvcrt has made that work even more difficult for them. And Julia is not yet a significant portion of anything, but our community is growing rapidly. See https://github.com/JuliaLang/IJulia.jl/pull/211 - with respect to IJulia, "Python is just an implementation detail." Even such a silly thing as automating the execution of the Python installer, to set up a private only-used-by-IJulia copy, is needlessly difficult to do. The work on Jupyter will hopefully help this specific situation sooner or later, but there are other cases where CPython needs to serve as part of the infrastructure, and the status quo makes that harder to automate.
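(For reference, the scriptable piece is roughly the following; the target path is just an example and the property names are the standard Windows Installer ones. It's everything around it - picking the right installer, wiring up pip and the notebook dependencies, keeping the private copy updated - that gets messy.)

    rem unattended install of the python.org MSI into a private directory
    msiexec /i python-2.7.8.amd64.msi /qn TARGETDIR=C:\IJulia\python-2.7.8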
We'd need to be educated more about the reasons why this approach works better than remaining compatible with MSVC CPython so we could evaluate the risks and reward intelligently.
Ideally, we can pursue an approach that would be able to remain compatible with MSVC CPython. Even if this needs involvement from MinGW-w64 to make happen, I don't think it's intractable. But I know less about the inner details of CPython than you do so I could be wrong.
But as has been discussed, it seems better to focus first on fixing the issues on which we are all in agreement (building extensions with MinGW).
Yes. We'll look into how much of the work has already been done on this.
there *are* people on the core-mentorship list who have expressed interest in helping out with our automated testing infrastructure, including (if I understand correctly) adding some level of integration to other CI systems (which might just be messages to the IRC channel)[*]. So that could be a fruitful avenue to explore.
If we pursue a fork (which not everyone will like but might happen anyway) then we likely would do this type of CI integration along the way as Ray suggested. So even if it turns out to fail as an endeavor, some good may come of it. Sincerely, Tony

On 26 October 2014 17:59, Tony Kelman <kelman@berkeley.edu> wrote:
Ensuring compatibility with CPython's chosen msvcrt has made that work even more difficult for them.
Ensuring compatibility with CPython's msvcrt is mandatory unless you want to create a split in the community over which extensions work with which builds. That's precisely the scenario Steve Dower and myself (among others) fear, and want to avoid at all cost. Paul

On 26 October 2014 14:28, Ray Donnelly <mingw.android@gmail.com> wrote:
I like this idea. To reduce the workload, we should probably pick Python3 (at least initially)?
Aren't the existing patches on the tracker already for Python 3.5+? They should be, as that's the only version that's likely to be a possible target (unless you get someone to agree to allow a change like this as in scope for Python 2.7, which I've seen no indication of). Maybe I'm confused here. Paul

On Sun, 26 Oct 2014 06:12:45 -0700, "Tony Kelman" <kelman@berkeley.edu> wrote:
Steve Dower:
Building CPython for Windows is not something that needs solving.
Not in your opinion, but numerous packagers of MinGW-based native or cross-compiled package sets would love to include Python. The fact that they currently can't, without many patches, is a problem.
If this includes (or would likely include) a significant portion of the Scientific Computing community, I would think that would be a compelling use case. We'd need to be educated more about the reasons why this approach works better than remaining compatible with MSVC CPython so we could evaluate the risks and rewards intelligently. (I wonder... "things are going to fragment anyway if you (cpython) don't do anything" might be an argument, if true... but it wouldn't make the consequences any easier to deal with :( ) But as has been discussed, it seems better to focus first on fixing the issues on which we are all in agreement (building extensions with MinGW).
R. David Murray:
And, at this point, we would NEED A BUILDBOT. That is, a machine that has whatever tools are required installed such that tests added to the test suite to test MinGW support in distutils would run, so we can be sure we don't break anything when making other changes.
That's not too hard. I've done this for other projects. AppVeyor works if your build is short enough, and I've done cross-compilation from Travis CI for other projects. Or Jenkins, or a Vagrant VM. I don't know PSF's infrastructure, but I can offer guidance if it would help.
When I say "we need a buildbot", what I mean is that we need someone willing to donate the resources and the *time and expertise* to setting up and maintaining something that integrates with our existing buildbot setup. You set up a buildbot slave, request an id and password from Antoine, keep the thing running, and respond in a timely fashion to requests for help resolving issues that arise on the buildbot (both buildbot issues and help-me-diagnose-this-failure issues). After the initial setup the load isn't generally heavy (I haven't had to do anything with the OSX buildbot running on the machine in my dining room for months and months now, for example).

So your guidance would have to go to someone who was volunteering to take on this task...there isn't anyone on the existing core team who would have time to do it (if I'm wrong, I'm sure someone will speak up). On the other hand, you don't have to be a committer to run a buildbot, and there *are* people on the core-mentorship list who have expressed interest in helping out with our automated testing infrastructure, including (if I understand correctly) adding some level of integration to other CI systems (which might just be messages to the IRC channel)[*]. So that could be a fruitful avenue to explore.

--David

[*] This is an area in which I have an interest, but it hasn't gotten high enough on my todo list yet for me to figure out exactly what the current state of things is so I can help it along.
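(For the record, the slave-side mechanics are small; with the current 0.8-era buildbot they amount to something like the sketch below. The base directory is arbitrary, and the master host/port, slave name and password are supplied by the buildbot admin when the slave is registered. The ongoing commitment is keeping it running and responding to failures, as described above.)

    # install the slave-side package and create the worker directory
    pip install buildbot-slave
    buildslave create-slave ~/buildarea MASTERHOST:PORT SLAVENAME PASSWORD

    # start the slave; it connects out to the master and waits for builds
    buildslave start ~/buildarea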

On 26 October 2014 13:12, Tony Kelman <kelman@berkeley.edu> wrote:
Only cross-compilation and the build system in the above list are relevant to CPython, but I hope I have convinced you, Paul Moore, etc. that there are real reasons for some groups of users and developers to prefer MinGW-w64 over MSVC.
Not really, to be honest. I still don't understand why anyone not directly involved in CPython development would need to build their own Python executable on Windows. Can you explain a single specific situation where installing and using the python.org executable is not possible (on the assumption that the mingw build is functionally identical and ABI compatible with the CPython build, the claim being made here)? Note that "not possible" is different from "I don't want to" or "it doesn't fit my views about free software" or similar. Also note that building extensions is different - you have to assume that building extensions using mingw with a mingw-built CPython is just as hard as building them with a MSVC-built CPython (otherwise you've made changes to extension building and you should contribute them independently so that everyone can benefit, not just those who build their own Python with mingw!)
Paul Moore:
If it were possible to cross-compile compatible extensions on Linux, projects developed on Linux could supply Windows binaries much more easily, which would be a huge benefit to the whole Windows Python community.
I want to do exactly this in an automated repeatable way, preferably on a build service. This seems harder to do when CPython cannot itself be built and handled as a dependency by that same automated, repeatable build service.
I cannot see why you would need to build Python in order to build extensions. After all, if your build service is on Linux, it couldn't run a mingw-built Python anyway. If your build service is a Windows machine, just install the python.org binaries (which is a simple download and install, that can be fully automated, but which is a one-off process anyway).
Unless it becomes possible to cross-compile extensions using the build machine's own version of Python, which might be the right approach.
This may be where we are getting confused. I see this as the only practical way of cross-compiling Windows extensions on Linux, by using the Linux Python. So being able to cross-compile Python is not relevant. On a tangential note, any work on supporting mingw builds and cross-compilation should probably be done using setuptools, so that it is external to the CPython code. That way (a) it isn't constrained by the CPython release schedules and backward compatibility constraints, and (b) it can be used in older versions of Python (which is pretty much essential if it's to be useful, TBH). Paul

On Sun, Oct 26, 2014 at 10:41 PM, Paul Moore <p.f.moore@gmail.com> wrote:
On 26 October 2014 13:12, Tony Kelman <kelman@berkeley.edu> wrote:
Only cross-compilation and the build system in the above list are relevant to CPython, but I hope I have convinced you, Paul Moore, etc. that there are real reasons for some groups of users and developers to prefer MinGW-w64 over MSVC.
Not really, to be honest. I still don't understand why anyone not directly involved in CPython development would need to build their own Python executable on Windows. Can you explain a single specific situation where installing and using the python.org executable is not possible (on the assumption that the mingw build is functionally identical and ABI compatible with the CPython build, the claim being made here)?
I don't know where this "ABI compatible" thing came into being; I think Steve Dower alluded to it by stating that we should focus on enabling MinGW-w64 as an extension-building compiler, using a core interpreter built with MSVC, and that by limiting the interfaces to the Windows C calling conventions everything would be OK. Unfortunately this is not possible. MinGW-w64-built extensions need to link to msvcrt.dll to do anything useful and you cannot mix two different msvcr??.dlls in one application. Please see http://msdn.microsoft.com/en-us/library/abx4dbyh%28v=VS.100%29.aspx and http://msdn.microsoft.com/en-us/library/ms235460%28v=VS.100%29.aspx for the details. MinGW-w64 assumes the very old msvcrt.dll files from Windows XP SP3 and XP64 specifically to avoid this mess. The rest of your reply assumes that this ABI compatibility is a given so I'll stop at this point.
Note that "not possible" is different from "I don't want to" or "it doesn't fit my views about free software" or similar. Also note that building extensions is different - you have to assume that building extensions using mingw with a mingw-built CPython is just as hard as building them with a MSVC-built CPython (otherwise you've made changes to extension building and you should contribute them independently so that everyone can benefit, not just those who build their own Python with mingw!)
Paul Moore:
If it were possible to cross-compile compatible extensions on Linux, projects developed on Linux could supply Windows binaries much more easily, which would be a huge benefit to the whole Windows Python community.
I want to do exactly this in an automated repeatable way, preferably on a build service. This seems harder to do when CPython cannot itself be built and handled as a dependency by that same automated, repeatable build service.
I cannot see why you would need to build Python in order to build extensions. After all, if your build service is on Linux, it couldn't run a mingw-built Python anyway. If your build service is a Windows machine, just install the python.org binaries (which is a simple download and install, that can be fully automated, but which is a one-off process anyway).
Unless it becomes possible to cross-compile extensions using the build machine's own version of Python, which might be the right approach.
This may be where we are getting confused. I see this as the only practical way of cross-compiling Windows extensions on Linux, by using the Linux Python. So being able to cross-compile Python is not relevant.
On a tangential note, any work on supporting mingw builds and cross-compilation should probably be done using setuptools, so that it is external to the CPython code. That way (a) it isn't constrained by the CPython release schedules and backward compatibility constraints, and (b) it can be used in older versions of Python (which is pretty much essential if it's to be useful, TBH).
Paul

On 26 October 2014 23:11, Ray Donnelly <mingw.android@gmail.com> wrote:
I don't know where this "ABI compatible" thing came into being;
Simple. If a mingw-built CPython doesn't work with the same extensions as a MSVC-built CPython, then the community gets fragmented (because you can only use the extensions built for your stack). Assuming numpy needs mingw and ultimately only gets built for a mingw-compiled Python (because the issues building for MSVC-built Python are too hard) and assuming that nobody wants to make the effort to build pywin32 under mingw, then what does someone who needs both numpy and pywin32 do? Avoiding that issue is what I mean by ABI-compatible. (And that's all I mean by it, nothing more subtle or controversial). I view it as critical (because availability of binaries is *already* enough of a problem in the Windows world, without making it worse) that we avoid this sort of fragmentation. I'm not seeing an acknowledgement from the mingw side that they agree. That's my concern. If we both agree, there's nothing to argue about. Paul

On 27 October 2014 09:44, Paul Moore <p.f.moore@gmail.com> wrote:
I view it as critical (because availability of binaries is *already* enough of a problem in the Windows world, without making it worse) that we avoid this sort of fragmentation. I'm not seeing an acknowledgement from the mingw side that they agree. That's my concern. If we both agree, there's nothing to argue about.
I think there's consensus on this front. From Ray: "MinGW-w64 assumes the very old msvcrt.dll files from Windows XP SP3 and XP64 specifically to avoid this mess." That assumption will allow MinGW-w64 to link with the appropriate MSVCRT versions for extension building without anything breaking. Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia

Nick Coghlan wrote:
That assumption will allow MinGW-w64 to link with the appropriate MSVCRT versions for extension building without anything breaking.
If that works, then the same technique should allow CPython itself to be built in a VS-compatible way with mingw, shouldn't it? Those objecting to a mingw-built python seem to be assuming that such a thing will necessarily be incompatible with VS builds, but I don't see why that has to be the case. -- Greg

Greg Ewing wrote:
Nick Coghlan wrote:
That assumption will allow MinGW-w64 to link with the appropriate MSVCRT versions for extension building without anything breaking.
If that works, then the same technique should allow CPython itself to be built in a VS-compatible way with mingw, shouldn't it?
Those objecting to a mingw-built python seem to be assuming that such a thing will necessarily be incompatible with VS builds, but I don't see why that has to be the case.
That's true, and a good point that I missed. However, the main (practical) desire for building CPython with something other than VS seems to be to avoid having to be compatible with VS. It's entirely possible that having two alternative builds of CPython would force everyone to be more compatible, but I think it's more likely to simply end up being two different worlds. Maybe I'm being unnecessarily cynical :) Cheers, Steve

On Mon, Oct 27, 2014 at 8:54 PM, Steve Dower <Steve.Dower@microsoft.com> wrote:
Greg Ewing wrote:
Nick Coghlan wrote:
That assumption will allow MinGW-w64 to link with the appropriate MSVCRT versions for extension building without anything breaking.
If that works, then the same technique should allow CPython itself to be built in a VS-compatible way with mingw, shouldn't it?
Those objecting to a mingw-built python seem to be assuming that such a thing will necessarily be incompatible with VS builds, but I don't see why that has to be the case.
That's true, and a good point that I missed. However, the main (practical) desire for building CPython with something other than VS seems to be to avoid having to be compatible with VS.
I've no idea where you get that impression from, no one has expressed anything even approximating that. For me it's to avoid using closed source software for my hobbyist programming and to help to create a vibrant Open Source distribution for Windows, because I quite like Windows; it's got a lot going for it. For others it's to ensure that everything they care about (CPython with Fortran for example) works together properly and reliably. I expect that avoiding compatibility couldn't be further from any of our wishes.

On 27 October 2014 20:45, Greg Ewing <greg.ewing@canterbury.ac.nz> wrote:
Nick Coghlan wrote:
That assumption will allow MinGW-w64 to link with the appropriate MSVCRT versions for extension building without anything breaking.
If that works, then the same technique should allow CPython itself to be built in a VS-compatible way with mingw, shouldn't it?
Yes.
Those objecting to a mingw-built python seem to be assuming that such a thing will necessarily be incompatible with VS builds, but I don't see why that has to be the case.
No, we've been trying to establish whether the patches to build with mingw were intended to produce such a compatible build. It's not clear, but so far it seems that apparently that is *not* the intent (and worse, mingw-w64 may not even be able to build viable executables that link with msvcr100 without some heavy hacking, although that's still somewhat unclear). Paul

Paul Moore wrote:
On 27 October 2014 20:45, Greg Ewing <greg.ewing@canterbury.ac.nz> wrote:
Nick Coghlan wrote:
That assumption will allow MinGW-w64 to link with the appropriate MSVCRT versions for extension building without anything breaking.
If that works, then the same technique should allow CPython itself to be built in a VS-compatible way with mingw, shouldn't it?
Yes.
Those objecting to a mingw-built python seem to be assuming that such a thing will necessarily be incompatible with VS builds, but I don't see why that has to be the case.
No, we've been trying to establish whether the patches to build with mingw were intended to produce such a compatible build. It's not clear, but so far it seems that apparently that is *not* the intent (and worse, mingw-w64 may not even be able to build viable executables that link with msvcr100 without some heavy hacking, although that's still somewhat unclear).
Unless there is also opposition to moving to VC14, I'd rather see the mingw projects invest in linking to those libraries. I believe they'll have a much easier time of it than worrying about VC10, and the investment will be worth more in the future as the public API of the CRT stops changing. Unfortunately, I'm not able to help out more than I've already offered (researching answers to specific questions). Largely because I have enough work-outside-work going on, but also because my employer won't like me getting involved with GPL'd software at all. Cheers, Steve

On 27 October 2014 21:19, Steve Dower <Steve.Dower@microsoft.com> wrote:
No, we've been trying to establish whether the patches to build with mingw were intended to produce such a compatible build. It's not clear, but so far it seems that this is *not* the intent (and worse, mingw-w64 may not even be able to build viable executables that link with msvcr100 without some heavy hacking, although that's still somewhat unclear).
Unless there is also opposition to moving to VC14, I'd rather see the mingw projects invest in linking to those libraries. I believe they'll have a much easier time of it than worrying about VC10, and the investment will be worth more in the future as the public API of the CRT stops changing.
I think the point is that anything other than msvcrt is extra work, because using msvcrt is coded into the guts of gcc (which in turn is because msvcrt is apparently OK to consider as "part of the OS" in GPL legality terms). So whether it's the vc10 libraries or the vc14 ones is irrelevant - and mingw ships with the vc10 link library, so it's easier to discuss the problem in terms of vc10. But yes, vc14 would be the long term target. Of course if the vc14 libs were deemed as "shipped with the OS" and/or were named msvcrt.dll, then that would be different. But I assume that's not what will happen. Paul
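For what it's worth, the stdlib can already tell you which VC runtime a given CPython was built against, which is the version a MinGW toolchain would need to match. A minimal sketch (on a python.org 3.4 build this typically reports msvcr100.dll; it returns None where the question doesn't apply):

    # Sketch: ask the running interpreter for its VC runtime DLL name.
    import ctypes.util

    print(ctypes.util.find_msvcrt())   # e.g. "msvcr100.dll" on python.org 3.4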

On 26 October 2014 23:44, Paul Moore <p.f.moore@gmail.com> wrote:
On 26 October 2014 23:11, Ray Donnelly <mingw.android@gmail.com> wrote:
I don't know where this "ABI compatible" thing came into being;
Simple. If a mingw-built CPython doesn't work with the same extensions as a MSVC-built CPython, then the community gets fragmented (because you can only use the extensions built for your stack). Assuming numpy needs mingw and ultimately only gets built for a mingw-compiled Python (because the issues building for MSVC-built Python are too hard) and assuming that nobody wants to make the effort to build pywin32 under mingw, then what does someone who needs both numpy and pywin32 do?
Avoiding that issue is what I mean by ABI-compatible. (And that's all I mean by it, nothing more subtle or controversial).
I view it as critical (because availability of binaries is *already* enough of a problem in the Windows world, without making it worse) that we avoid this sort of fragmentation. I'm not seeing an acknowledgement from the mingw side that they agree. That's my concern. If we both agree, there's nothing to argue about.
I have just done some experiments with building CPython extensions with mingw-w64. Thanks to Ray for helping me set this up.
The bad news is that the support added to the old 32-bit mingw to support linking to alternative C runtime libraries (specifically -lmsvcr100) has bitrotted, and no longer functions correctly in mingw-w64. As a result, not only can mingw-w64 not build extensions that are compatible with python.org Python, it can't build extensions that function at all [1]. They link incompatibly to *both* msvcrt and msvcr100.
This is a bug in mingw-w64. I have reported it to Ray, who's passed it onto one of the mingw-w64 developers. But as things stand, mingw builds will definitely produce binary extensions that aren't compatible with python.org Python.
Paul
[1] Note, that's if you just use --compiler=mingw32 as supported by distutils. Looking at how the numpy folks build, they seem to hack their own version of the distutils C compiler classes. I don't know whether that's just to work around this bug, or whether they do it for other reasons as well (but I suspect the latter).
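For anyone trying to reproduce this, one way to see what a freshly built extension actually links against is to read its DLL import table with the MinGW-w64 objdump. A rough sketch, assuming objdump is on PATH (the .pyd path is only an example):

    # Sketch: list the DLLs an extension imports. A build hit by the bug
    # described above shows *both* msvcrt.dll and msvcr100.dll here.
    import subprocess

    out = subprocess.check_output(
        ["objdump", "-p", "build/lib.win-amd64-3.4/example.pyd"],
        universal_newlines=True)
    for line in out.splitlines():
        if "DLL Name" in line:
            print(line.strip())        # e.g. "DLL Name: msvcr100.dll"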

On Mon, Oct 27, 2014 at 10:48 AM, Paul Moore <p.f.moore@gmail.com> wrote:
The bad news is that the support added to the old 32-bit mingw to support linking to alternative C runtime libraries (specifically -lmsvcr100) has bitrotted, and no longer functions correctly in mingw-w64. As a result, not only can mingw-w64 not build extensions that are compatible with python.org Python, it can't build extensions that function at all [1]. They link incompatibly to *both* msvcrt and msvcr100.
This is a bug in mingw-w64. I have reported it to Ray, who's passed it onto one of the mingw-w64 developers. But as things stand, mingw builds will definitely produce binary extensions that aren't compatible with python.org Python.
Paul
[1] Note, that's if you just use --compiler=mingw32 as supported by distutils. Looking at how the numpy folks build, they seem to hack their own version of the distutils C compiler classes. I don't know whether that's just to work around this bug, or whether they do it for other reasons as well (but I suspect the latter).
I've managed to build gmpy2 (which requires GMP, MPFR, and MPC libraries) using msys2. I've detailed the steps (hacking) at: https://code.google.com/p/gmpy/source/browse/trunk/msys2_build.txt
One of the hacks I made addresses the linking bug. The extension does run with both the 32-bit and 64-bit versions of CPython 2.7, 3.2, 3.3, and 3.4.
It is possible, just not easy. Anything that makes it easier would be very helpful.
casevh

On 27 October 2014 18:47, Case Van Horsen <casevh@gmail.com> wrote:
I've managed to build gmpy2 (which requires GMP, MPFR, and MPC libraries) using msys2. I've detailed the steps (hacking) at:
https://code.google.com/p/gmpy/source/browse/trunk/msys2_build.txt
Thanks for this. I don't have the time to read your notes right now, but I will do so.
One of the hacks I made addresses the linking bug. The extension does run with both the 32-bit and 64-bit versions of CPython 2.7, 3.2, 3.3, and 3.4.
Did you report the linking bug to the mingw-w64 project? The key thing here is that without gcc -lmsvcr100 foo.c working (i.e., not resulting in linking with msvcrt), building Python extensions will always need hacks to work around that bug.
It is possible, just not easy. Anything that makes it easier would be very helpful.
With the bug fixed, the steps should be as trivial as:
1. Use python.org Python, with gcc on your PATH.
2. Install any dependencies (e.g., gmp) where gcc can see them.
3. python setup.py build_ext --compiler=mingw32 bdist_wheel (or whatever setup.py invocation suits you, as long as you set compiler=mingw32).
Paul
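For step 3, a minimal setup.py might look like the sketch below; the module name, source file, and gmp dependency are placeholders for whatever you're actually building, not anything gmpy2-specific:

    # setup.py -- minimal sketch for building one C extension with MinGW.
    # Build with: python setup.py build_ext --compiler=mingw32
    from distutils.core import setup, Extension

    setup(
        name="example",
        version="0.1",
        ext_modules=[Extension("example",
                               sources=["example.c"],
                               libraries=["gmp"])],  # e.g. a MinGW-built GMP
    )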

On Mon, Oct 27, 2014 at 5:48 PM, Paul Moore <p.f.moore@gmail.com> wrote:
On 26 October 2014 23:44, Paul Moore <p.f.moore@gmail.com> wrote:
On 26 October 2014 23:11, Ray Donnelly <mingw.android@gmail.com> wrote:
I don't know where this "ABI compatible" thing came into being;
Simple. If a mingw-built CPython doesn't work with the same extensions as a MSVC-built CPython, then the community gets fragmented (because you can only use the extensions built for your stack). Assuming numpy needs mingw and ultimately only gets built for a mingw-compiled Python (because the issues building for MSVC-built Python are too hard) and assuming that nobody wants to make the effort to build pywin32 under mingw, then what does someone who needs both numpy and pywin32 do?
Avoiding that issue is what I mean by ABI-compatible. (And that's all I mean by it, nothing more subtle or controversial).
I view it as critical (because availability of binaries is *already* enough of a problem in the Windows world, without making it worse) that we avoid this sort of fragmentation. I'm not seeing an acknowledgement from the mingw side that they agree. That's my concern. If we both agree, there's nothing to argue about.
I have just done some experiments with building CPython extensions with mingw-w64. Thanks to Ray for helping me set this up.
The bad news is that the support added to the old 32-bit mingw to support linking to alternative C runtime libraries (specifically -lmsvcr100) has bitrotted, and no longer functions correctly in mingw-w64. As a result, not only can mingw-w64 not build extensions that are compatible with python.org Python, it can't build extensions that function at all [1]. They link incompatibly to *both* msvcrt and msvcr100.
This is a bug in mingw-w64. I have reported it to Ray, who's passed it onto one of the mingw-w64 developers. But as things stand, mingw builds will definitely produce binary extensions that aren't compatible with python.org Python.
IIUC, getting mingw-w64 to link against msvcr100 instead of msvcrt requires a custom mingw-w64 build, because by default mingw-w64's internal runtime libraries (libgcc etc.) are linked against msvcrt. So by the time you're choosing compiler switches etc., it's already too late -- your switches might affect how *your* code is built, but your code will still be linked against pre-existing runtime libraries that are linked against msvcrt.
It's possible to hack the mingw-w64 build process to build the runtime libraries against msvcr100 (or whatever) instead of msvcrt, but this is still not a panacea -- the different msvcr* libraries are, of course, incompatible with each other, and IIUC the mingw-w64 developers have never tried to make their libraries work against anything except msvcrt. For example, mingw-w64's gfortran runtime uses a symbol that's only available in msvcrt, not msvcr90 or msvcr100: http://sourceforge.net/p/mingw-w64/mailman/message/31768118/
So my impression is that these issues are all fixable, but they will require real engagement with mingw-w64 upstream.
[1] Note, that's if you just use --compiler=mingw32 as supported by distutils. Looking at how the numpy folks build, they seem to hack their own version of the distutils C compiler classes. I don't know whether that's just to work around this bug, or whether they do it for other reasons as well (but I suspect the latter).
numpy.distutils is a massive pile of hacks to handle all kinds of weird things including recursive builds, fortran, runtime capability detection (like autoconf), and every random issue anyone ran into at some point in the last 10 years and couldn't be bothered filing a proper upstream bug report. Basically no-one knows what it actually does -- the source is your only hope :-). -n -- Nathaniel J. Smith Postdoctoral researcher - Informatics - University of Edinburgh http://vorpus.org

Not really, to be honest. I still don't understand why anyone not directly involved in CPython development would need to build their own Python executable on Windows. Can you explain a single specific situation where installing and using the python.org executable is not possible
I want, and in many places *need*, an all-MinGW stack. For a great deal of software that is not Python, I can do this today. I can use build services, package management, and dependency resolution tools that work very well together with this all-MinGW software stack. These are problems that Python has notoriously struggled with on Windows for a long time.
It's not "my views on free software," it's the reality of MSVC being a near-useless compiler for scientific software. (And I don't see that changing much.) Do my requirements conflict with many non-scientific Python users on Windows? Probably. So you're welcome to ignore my requirements and I'll do my own thing, but I don't think I'm alone. There's likely no desire from the scientific Python community to branch off and separate in quite the way I'm willing to do from non-scientific Python, but it would solve some of their problems (while introducing many others). I suspect a MinGW-w64-oriented equivalent to Conda would be attractive to many. That's approximately what I'm aiming for.
There are some ways in which I can use the python.org MSVC executable and installer. But it is nearly impossible for me to integrate it into the rest of the tools and stack that I am using; it sticks out like a sore thumb. Likewise MinGW-compiled CPython may stick out like a sore thumb relative to the existing way things work with Python on Windows. I'm okay with that, you probably aren't.
changes to extension building and you should contribute them independently so that everyone can benefit
Noted.
I cannot see why you would need to build Python in order to build extensions.
No, of course they are separate. CPython is one of my dependencies. Compiled extensions are other dependencies. Software completely unrelated to Python is yet another set of dependencies. It's not a very coherent stack if I can't handle all of these dependencies in a uniform way.
On a tangential note, any work on supporting mingw builds and cross-compilation should probably be done using setuptools, so that it is external to the CPython code.
Noted. Sincerely, Tony

On 26 October 2014 23:24, Tony Kelman <kelman@berkeley.edu> wrote:
I want, and in many places *need*, an all-MinGW stack.
OK, I'm willing to accept that statement. But I don't understand it, and I don't think you've explained why you *need* your CPython interpreter to be compiled with mingw (as opposed to a number of other things you might need around building extensions). You may well "need" a mingw-compiled CPython because no-one has yet fixed the issues around using mingw to build extensions for the python.org python build. But that's my point - I'd rather "they" fixed that issue, rather than perpetuating your need for a non-standard compiler that uses extensions no-one else can use. Paul

On 27 October 2014 09:37, Paul Moore <p.f.moore@gmail.com> wrote:
On 26 October 2014 23:24, Tony Kelman <kelman@berkeley.edu> wrote:
I want, and in many places *need*, an all-MinGW stack.
OK, I'm willing to accept that statement. But I don't understand it, and I don't think you've explained why you *need* your CPython interpreter to be compiled with mingw (as opposed to a number of other things you might need around building extensions).
I can take a go at an explanation that may make more sense to you.
Consider one of our key requirements for packaging applications for Fedora: that Fedora builds be *self-hosting*. Given a base Fedora system, and a source RPM, we need to be able to *build the binary RPM from source*. (Other Linux distros generally have a similar requirement.) Relying on opaque binary blobs downloaded from the internet as part of the build process is not permitted (modulo a few exceptions for firmware blobs in device drivers).
Now consider that this "automatically rebuild the entire system from source" model is not unique to Linux - you can use it for any system where your build process is sufficiently automated, and you have a need for it. However, the *structure* of that kind of automation tends to differ wildly between POSIX style tooling (gcc, clang) and MSVC. If you have an existing build automation system for *nix targets, then cross-compilation via MinGW is likely going to be your smoothest path to adding Windows binary support.
At that point, if CPython is one of your dependencies, you're going to have the choice of allowing the python.org binaries to be pulled in as opaque pre-built blobs, or else figuring out how to build an ABI compatible version with MinGW rather than with MSVC. Think of this more in the case of *embedding* the CPython runtime in a larger context (e.g. in Tony's case, to make Python software usable with the Julia runtime), rather than in building a standalone Python interpreter for general use.
So, for embedding cases, and for incorporation into POSIX-style build systems using MinGW-w64 for cross-compilation of Windows binaries, it may make sense to incorporate the patches that allow building with MinGW-w64 into mainline CPython (if I recall correctly, we supported building with Intel's C compiler for a long time, even though we never shipped anything built with it).
Regards, Nick.
-- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia

On 27 October 2014 12:30, Nick Coghlan <ncoghlan@gmail.com> wrote:
OK, I'm willing to accept that statement. But I don't understand it, and I don't think you've explained why you *need* your CPython interpreter to be compiled with mingw (as opposed to a number of other things you might need around building extensions).
I can take a go at an explanation that may make more sense to you. Consider one of our key requirements for packaging applications for Fedora: that Fedora builds be *self-hosting*. Given a base Fedora system, and a source RPM, we need to be able to *build the binary RPM from source*. (Other Linux distros generally have a similar requirement)
Relying on opaque binary blobs downloaded from the internet as part of the build process is not permitted (modulo a few exceptions for firmware blobs in device drivers).
Now consider that this "automatically rebuild the entire system from source" model is not unique to Linux - you can use it for any system where your build process is sufficiently automated, and you have a need for it. However, the *structure* of that kind of automation tends to differ wildly between POSIX style tooling (gcc, clang) and MSVC. If you have an existing build automation system for *nix targets, then cross-compilation via MinGW is likely going to be your smoothest path to adding Windows binary support.
At that point, if CPython is one of your dependencies, you're going to have the choice of allowing the python.org binaries to be pulled in as opaque pre-built blobs, or else figuring out how to build an ABI compatible version with MinGW rather than with MSVC. Think of this more in the case of *embedding* the CPython runtime in a larger context (e.g. in Tony's case, to make Python software usable with the Julia runtime), rather than in building a standalone Python interpreter for general use.
So, for embedding cases, and for incorporation into POSIX-style build systems using MinGW-w64 for cross-compilation of Windows binaries, it may make sense to incorporate the patches that allow building with MinGW-w64 into mainline CPython (if I recall correctly, we supported building with Intel's C compiler for a long time, even though we never shipped anything built with it).
Thanks Nick. That explanation makes sense to me. I was aware of this sort of scenario, and as I've said before I don't have any objection per se to making things easier for people with that sort of requirement. But some of the other arguments in this thread seemed to imply more than that. Without specifics, though, I concede that I may be over-interpreting the rhetoric, so that's the part of the debate I'm stepping back from, to avoid descending into FUD. Paul

On Sun, Oct 26, 2014 at 3:41 PM, Paul Moore <p.f.moore@gmail.com> wrote:
Not really, to be honest. I still don't understand why anyone not directly involved in CPython development would need to build their own Python executable on Windows.
Late Python bugfix releases are source-only, so if you suffer from a bug and need to get it fixed, you need to build Python from source. https://www.python.org/download/releases/2.6.9/ has no windows binary and includes several security fixes. -- Devin

On Sat, 25 Oct 2014 05:45:24 -0700, "Tony Kelman" <kelman@berkeley.edu> wrote:
As a developer of a (compiled) open-source library or application, wouldn't you love to be able to build binaries on Linux for Windows? With some work and build system patches, you can. For many projects it's a simple matter of ./configure --host=x86_64-w64-mingw32. Not with CPython though. CPython is only included in 2 of the above MinGW-w64 distribution projects, MSYS2 and Arch. This is possible with a very, very long set of patches, many of which have been submitted by Roumen Petrov to the Python bug tracker - see http://bugs.python.org/issue17605 and other issues linked therein. Roumen has done a huge amount of work, and he and others who consider the MinGW-w64 compiler important will continue to do so. (Thanks a million Roumen!)
Yes, I for one appreciate Roumen's work, even though I'm not currently in a position to review the patches.
I could step in as maintainer for Cygwin and builds based on GCC using mingw* APIs.
Regards, Roumen Petrov
A maintainer has volunteered. Others will help. Can any core developers please begin reviewing some of his patches? Even if just to say they need to be rebased. The lack of responses on the bug tracker is disheartening from an outside perspective. The pile of patches accumulating in external MinGW packaging projects is tantamount to a fork of CPython. It won't go away since there are dedicated packagers working to keep their MinGW-w64 builds functional, even in the ad-hoc current state. The patches continue piling up, making it more difficult for everyone - except for the core Python developers if they continue to ignore the problem. Bring the people working on these patches into the fold as contributors. Review the patches. It would make Python as a language and a community even more diverse and welcoming.
From there I'd move on to patches that support using MinGW for building extensions. It will probably be useful to also get engaged with the packaging folks.
IIUC, our general policy for bringing new platforms into core support, as opposed to continuing to be a separately maintained "set of patches", is that there be a "big enough" community of interest (I don't remember the definition of "big enough") and that there be both committed maintainers *and* at least one buildbot. Being able to build windows packages on linux is a compelling argument, but that only applies to building extensions, not the interpreter.
I would recommend starting with any patches that help MinGW that are not MinGW specific but instead generally improve the build system and cross compilation. There have been a number of such issues opened and improvements made beyond the MinGW related ones, some coming from Debian, some coming from the Android community. So target those first. My suggestion would be to pick a patch that is believed to be commit ready, and come to #python-dev on freenode and gently bug us until it gets committed. Then pick the next one, and repeat. Working from simplest to more complex would also probably be a good strategy :)
And, at this point, we would NEED A BUILDBOT. That is, a machine that has whatever tools are required installed such that tests added to the test suite to test MinGW support in distutils would run, so we can be sure we don't break anything when making other changes.
For compiling CPython itself with MinGW, we'd first need to develop a consensus that we want to support it in core. I'd say get building extensions working first, and then make that argument. By the time a bunch of patches get committed, we should be ready (read: eager :) to promote someone to do it themselves. Even if we never decide to support compiling CPython itself with MinGW, I would hope that getting it to work for extensions would greatly reduce the number of patches needed to be maintained outside the tree in order to do so. And once at least one MinGW advocate is part of the core team, advocacy becomes easier :)
--David
PS: one meta comment about the MinGW issues: I scanned a few from the linked bug, and while the issues split the patches out, which is a great start, there is no discussion on the issues for the individual patches to give background and motivation and an explanation of what the patch is trying to accomplish. But you probably don't want to waste time on improving ones that apply *only* to compiling CPython itself unless we get consensus that we want to support that.

Zitat von Tony Kelman <kelman@berkeley.edu>:
A maintainer has volunteered. Others will help. Can any core developers please begin reviewing some of his patches?
Unfortunately, every attempt to review these patches has failed for me, every time. In the last iteration of an attempt to add mingw64 support, I had asked contributors to also provide instructions on how to use these patches, and haven't received any instructions that actually worked. I'm hesitant to add code that I cannot verify as actually working. I guess anybody else reviewing these patches ran into similar problems (I know some other core developers have tried reviewing them as well, others have stated here that they are unable to review the patches). Regards, Martin

On Sun, Oct 26, 2014 at 11:52 PM, <martin@v.loewis.de> wrote:
Zitat von Tony Kelman <kelman@berkeley.edu>:
A maintainer has volunteered. Others will help. Can any core developers please begin reviewing some of his patches?
Unfortunately, every attempt to review these patches has failed for me, every time. In the last iteration of an attempt to add mingw64 support, I had asked contributors to also provide instructions on how to use these patches, and haven't received any instructions that actually worked.
I'm hesitant to add code that I cannot verify as actually working.
I guess anybody else reviewing these patches ran into similar problems (I know some other core developers have tried reviewing them as well, others have stated here that they are unable to review the patches).
https://mail.python.org/pipermail/python-dev/2014-October/136756.html
Regards, Martin
participants (18)
- Antoine Pitrou
- Case Van Horsen
- Chris Angelico
- Devin Jeanpierre
- Greg Ewing
- Guido van Rossum
- Mark Lawrence
- martin@v.loewis.de
- Nathaniel Smith
- Nick Coghlan
- Paul Moore
- R. David Murray
- Ray Donnelly
- Stephen J. Turnbull
- Steve Dower
- Terry Reedy
- Tony Kelman
- Zachary Ware