[Python-Dev] Status of C compilers for Python on Windows

Ray Donnelly mingw.android at gmail.com
Sun Oct 26 15:28:38 CET 2014


On Sun, Oct 26, 2014 at 1:12 PM, Tony Kelman <kelman at berkeley.edu> wrote:
> Thanks all for the responses. Clearly this is a subject about which
> people feel strongly, so that's good at least. David Murray's guidance
> in particular points to the most likely path to get improvements to
> really happen.
>
> Steve Dower:
>> Building CPython for Windows is not something that needs solving.
>
> Not in your opinion, but numerous packagers of MinGW-based native or
> cross-compiled package sets would love to include Python. The fact
> that they currently can't, without many patches, is a problem.
>
>> The culture on Windows is to redistribute binaries, not source,
>
> There are many cultures using Windows. Including open-source ones.
>
>> and both the core team and a number of redistributors have this figured
>> out (and it will only become easier with VC14 and Python 3.5).
>
> With MSVC. It doesn't work with MinGW, and likely doesn't work with Clang.
> MSVC is not the only compiler on Windows. There are many use cases for
> preferring other compilers. Have you read this wiki page for example?
> https://github.com/numpy/numpy/wiki/Numerical-software-on-Windows
>
> In my personal experience, having recently gotten Julia to compile with
> MSVC for the first time, MSVC is a highly deficient compiler for many
> needs, especially in the scientific software community:
> - C99 (getting better recently, but still not done)
> - AT&T syntax assembly
> - C++11 features (also mostly okay now, but not if you're using an older
>   MSVC version with Python 2.7, which many people still have to do)
> - 128-bit integer intrinsics
> - cannot cross-compile from anything that isn't Windows
> - a build system foreign to the shell/makefile workflows used by most
>   open-source projects; few projects have time to maintain two separate
>   build systems (CMake helps, but converting takes a lot of initial effort)
> - no free-as-in-beer Fortran compiler available
>
> I have none of these problems when I use MinGW-w64. Hence the desire to
> be able to curate an all-MinGW software stack. It's not a matter of open-
> source ideology for me; it's brass tacks: "can I do the work I need to do?"
> With MSVC I can't; with MinGW-w64 I can. Not being able to include CPython
> in an all-MinGW stack hurts, a lot.
>
> Only cross-compilation and the build system in the above list are relevant
> to CPython, but I hope I have convinced you, Paul Moore, etc. that there are
> real reasons for some groups of users and developers to prefer MinGW-w64
> over MSVC.
>
>> I'd rather see this effort thrown behind compiling extensions,
>> including cross compilation.
>
> There are patches awaiting review that improve this as well. Efforts to
> improve CPython's build system and the handling of extensions are not
> completely independent; in many cases the patches are written by the same
> set of MinGW users. One of these sets of patches is not inherently evil;
> you understandably have less interest in them, but it's still disappointing
> to see so little movement on either.
>
>> Having different builds of CPython out there will only fragment the
>> community and hurt extension authors far more than it may seem to help.
>
> The community of people developing and using open-source projects, either
> CPython or otherwise, is already highly fragmented. Ignoring it makes it
> worse. python.org does not have to distribute or endorse MinGW-compiled
> builds of CPython. If the build option never gets incorporated, then it
> will continue to be reverse-engineered.
>
> Guido van Rossum:
>> Here's the crux of the matter. We want compiled extension modules
>> distributed via PyPI to work with the binaries distributed from
>> python.org.
>
> Absolutely. I don't think additional options in the build system would
> change this.
>
> R. David Murray:
>> And, at this point, we would NEED A BUILDBOT.  That is, a machine that
>> has whatever tools are required installed such that tests added to the
>> test suite to test MinGW support in distutils would run, so we can be
>> sure we don't break anything when making other changes.
>
> That's not too hard; I've done this for other projects. AppVeyor works if
> your build is short enough, and I've also done cross-compilation from
> Travis CI. Jenkins or a Vagrant VM would work too. I don't know the PSF's
> infrastructure, but I can offer guidance if it would help.
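>
> To make David's point concrete, a first test could be as small as the
> sketch below (purely illustrative: the class and test names are made
> up, and a real version would sit with the existing distutils tests and
> skip itself when no GCC/MinGW toolchain is on PATH):
>
>     import shutil
>     import unittest
>     from distutils.ccompiler import compiler_class, new_compiler
>
>     class MinGWSmokeTest(unittest.TestCase):
>         def test_mingw32_compiler_registered(self):
>             # distutils already knows the 'mingw32' compiler name.
>             self.assertIn("mingw32", compiler_class)
>
>         def test_mingw32_compiler_instantiates(self):
>             if shutil.which("gcc") is None:
>                 self.skipTest("no GCC/MinGW toolchain on PATH")
>             # Only checks that the compiler object can be created; a
>             # fuller test would build and import a tiny extension.
>             self.assertIsNotNone(new_compiler(compiler="mingw32"))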
>
> Steve Dower:
>> I'm afraid of users having numpy crash because they're using an MSVC
>> CPython instead of a mingw CPython. I'm afraid of users not being able
>> to use library A and library B at the same time because A requires MSVC
>> CPython and B requires mingw CPython. (I can produce more examples if you
>> like, but the general concern is having a fragmented community, as I said
>> in my previous post.)
>
> A valid fear. Mixing C runtimes can cause problems; I've seen this myself.
> Correct me if I'm wrong, but this is nearly as much of an issue when someone
> compiles CPython with a different version of MSVC than the one used to build
> the official binaries. It requires care, but you can't deny
> that there are use cases where people will want and need to do such things.
> Is possible fragmentation a good enough reason to resist making it possible
> in the build system?
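>
> (As an aside, when debugging that kind of mismatch it helps to know
> which compiler produced the interpreter at hand; the standard library
> already exposes this, e.g.:
>
>     import platform
>     import sysconfig
>
>     # e.g. 'MSC v.1600 64 bit (AMD64)' or 'GCC 4.8.1'
>     print(platform.python_compiler())
>     # e.g. 'win-amd64'
>     print(sysconfig.get_platform())
>     # build-time CC; may be None on official MSVC builds
>     print(sysconfig.get_config_var("CC"))
>
> Nothing exotic, just the usual introspection hooks.)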
>
>> though I suspect most would like to see some traction achieved on a fork
>> first
>
> Those of us who consider this important should probably just do this. Ray,
> Roumen, the maintainer of the Arch MinGW packages, myself and others could
> look into making an actual fork on Github or Bitbucket where we merge the
> various patches and come up with an out-of-the-box MinGW-[cross-]compilable
> version of CPython. I'll happily write the spec files to get this building
> from Fedora or openSUSE. That would help us test the feasibility from a
> centralized repository. Ray, what do you think? Do you know xantares' email
> address to ask if he'd be interested in helping or using the result?

I like this idea. To reduce the workload, we should probably pick
Python 3 (at least initially)?

I have collaborated with xantares (whom I've bcc'ed) on the ArchLinux
AUR mingw-w64-python2 package. In fact, as of now, our patches are
exactly the same, except that ArchLinux is missing one new patch.
Looking at https://aur.archlinux.org/packages/mingw-w64-python2/ it
seems xantares has handed maintainership over to Dr Shadow. I've left
a comment asking the new maintainer to email me.

If we pick Python 3 instead of 2, then bringing up an ArchLinux AUR
package for it would be my next course of action. Cross-compilation of
mingw-w64-python3 will no doubt need some fixes, as I've not done it
for a while.

Ideally, we'd hook this repository up to as complete a CI system as
possible and introduce the patches one at a time, so that any breakage
or regression on the currently supported systems gets fixed
immediately. Having reviews from some core Python developers (if we
can get motivated supporters from that group) would also be immensely
helpful. My fear is that without such core involvement, the attempt to
upstream the final patch set would be overwhelming.

>
> Zachary Ware:
>> I'm failing to see where that's simpler :)
>
> If it were hypothetically merged instead of living in an external fork, it
> could be ./configure --host=x86_64-w64-mingw32 && make to cross-compile
> from Linux or Cygwin, or just ./configure && make on Windows after
> installing MSYS2 (which is just about as easy as installing MSVC).
>
> Paul Moore:
>> If it were possible to cross-compile compatible extensions on Linux,
>> projects developed on Linux could supply Windows binaries much more
>> easily, which would be a huge benefit to the whole Windows Python
>> community.
>
> I want to do exactly this in an automated, repeatable way, preferably on
> a build service. This seems harder to do when CPython cannot itself be
> built and handled as a dependency by that same automated, repeatable
> build service. Unless it becomes possible to cross-compile extensions
> using the build machine's own version of Python, which might be the right
> approach.
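>
> To illustrate, take a completely minimal extension (the names here are
> placeholders, and a demo.c with the usual module boilerplate is assumed):
>
>     # setup.py
>     from distutils.core import setup, Extension
>
>     setup(
>         name="demo",
>         version="0.1",
>         ext_modules=[Extension("demo", sources=["demo.c"])],
>     )
>
> On Windows with MinGW-w64 on PATH this already builds today with
> "python setup.py build_ext --compiler=mingw32". Running the equivalent
> command on a Linux build machine, against a cross toolchain and the
> Windows CPython headers and import libraries, is the part that still
> needs the out-of-tree patches.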
>
>> acutely aware of the common pain points for Python users on Windows.
>> And they are all about binary extensions, and none at all about
>> building Python itself.
>
> I've done a lot of recent work keeping Julia working well on Windows, and
> the interoperability we have with Python packages has propagated most of
> these pain points to us as well. We have to rely on Conda to have a
> reliable way of installing, for example, IPython with the notebook
> interface so that IJulia works. This is not an ideal solution, as it
> requires a great deal of user intervention and manual steps to get up and
> running (and it would be far worse without Conda). We are, so far, built
> around MinGW-w64 on Windows, for the reasons I listed above. Having cross-
> compiled builds of CPython and binary extensions available from the same
> build services we already use to install other binary packages (Gtk, Cairo,
> Tk, Nettle, HDF5, etc) on Windows would be enormously helpful for us.
>
> There's a real use case. Its size and importance can be debated. For now
> I'll take David Murray's post to heart and see where I have time or ability
> to help things along.
>
> Sincerely,
> Tony
>

