<p dir="ltr"><br>
On 27 May 2016 04:48, "Donald Stufft" <<a href="mailto:donald@stufft.io">donald@stufft.io</a>> wrote:<br>
> > On May 26, 2016, at 2:41 PM, Matthew Brett <<a href="mailto:matthew.brett@gmail.com">matthew.brett@gmail.com</a>> wrote:<br>
> > It would be very good to work out a plan for new Python releases as<br>
> > well. We really need to get wheels up to pypi a fair while before the<br>
> > release date, and it's easy to forget to do that, because, at the<br>
> > moment, we don't have much testing infrastructure to make sure that a<br>
> > range of wheel installs are working OK.<br>
><br>
> I want to get something set up that would allow people to only need to upload<br>
> a source release to PyPI and then have wheels automatically built for them<br>
> (but not mandate that; projects that wish to should always be able to control<br>
> their wheel generation).</p>
<p dir="ltr">A possible preceding step for that might be to create a service that reports per-project information on clients downloading the sdist versions of a project. With the BigQuery data publicly available, that shouldn't need to be part of PyPI itself (at least in the near term), so it should just need an interested volunteer, rather than being gated on Warehouse.</p>
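<p dir="ltr">As a rough sketch of the report such a service might produce, the following aggregates download log rows into per-project sdist counts. The field names ("project", "filetype") and the sample rows are hypothetical; the real schema and data would come from the public BigQuery dataset of PyPI download events.</p>

```python
from collections import Counter

def sdist_downloads_by_project(download_rows):
    """Count sdist downloads per project from download log rows.

    Each row is a dict with hypothetical "project" and "filetype"
    fields; a real service would read these from the public
    BigQuery download data rather than an in-memory list.
    """
    counts = Counter()
    for row in download_rows:
        if row["filetype"] == "sdist":
            counts[row["project"]] += 1
    return counts

# Entirely made-up example rows:
rows = [
    {"project": "numpy", "filetype": "sdist"},
    {"project": "numpy", "filetype": "bdist_wheel"},
    {"project": "scipy", "filetype": "sdist"},
    {"project": "numpy", "filetype": "sdist"},
]
print(sdist_downloads_by_project(rows))
```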
<p dir="ltr">The intent there would be to allow projects that decide to build their own wheels to prioritise their wheel creation, and be able to quantify the impact of each addition to their build matrix, rather than necessarily providing pre-built binaries for all platforms supported by the source release.</p>
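<p dir="ltr">Quantifying "the impact of each addition to the build matrix" could be as simple as the fraction of current sdist downloads each candidate wheel tag would convert into wheel installs. A minimal sketch, with made-up platform tags and counts standing in for the real download data:</p>

```python
from collections import Counter

def wheel_coverage(sdist_download_tags, candidate_tags):
    """For each candidate wheel tag, estimate the fraction of
    observed sdist downloads that tag would cover.

    sdist_download_tags: platform tags observed on sdist downloads
    (hypothetical input; real tags would be derived from the
    download data).
    """
    total = len(sdist_download_tags)
    counts = Counter(sdist_download_tags)
    return {tag: counts[tag] / total for tag in candidate_tags}

# Illustrative numbers only: 6 Windows, 3 Linux, 1 Mac sdist installs
observed = ["win32"] * 6 + ["manylinux1_x86_64"] * 3 + ["macosx_10_6"] * 1
print(wheel_coverage(observed, ["win32", "macosx_10_6"]))
```

<p dir="ltr">With those invented numbers, adding a "win32" wheel would cover 60% of current sdist installs, while a "macosx_10_6" wheel would cover 10%, giving a project a concrete basis for ordering its build matrix.</p>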
<p dir="ltr">The other thing that data could be used for is to start quantifying the throughput requirements for an "all of PyPI" build service by looking at release publication rates (rather than download rates). Again, likely pursuable by volunteers using the free tier of a public PaaS, rather than requiring ongoing investment.</p>
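<p dir="ltr">The throughput question reduces to simple arithmetic once publication rates are known. A back-of-envelope sketch, where every input number is an illustrative guess rather than a measured figure:</p>

```python
def builders_needed(releases_per_day, wheels_per_release,
                    minutes_per_build, hours_available=24):
    """Rough count of parallel build workers needed to keep pace
    with the release publication rate. All inputs are illustrative
    guesses, not measured PyPI figures.
    """
    build_minutes_per_day = (
        releases_per_day * wheels_per_release * minutes_per_build
    )
    available_minutes = hours_available * 60
    # Round up to a whole number of workers
    return -(-build_minutes_per_day // available_minutes)

# e.g. 400 releases/day, 6 wheel variants each, 5 minutes per build:
print(builders_needed(400, 6, 5))  # -> 9 workers
```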
<p dir="ltr">Cheers,<br>
Nick.<br>
</p>