
I have started to make manylinux wheels for reportlab.
Our workflow is split across multiple machines. In the end we create a total of 19 package files (10 manylinux, 8 Windows + 1 source); these total 53MB.
1) Is there a convenient way to upload a new version starting from the package files themselves? Normally we try to test the packages before they are uploaded, which implies we cannot just use the distutils upload command.
2) I assume I cannot just keep on uploading new versions to PyPI. Presumably I would have to delete a micro release before uploading a new one and only keep significant releases.
3) The manylinux builds are significantly larger than the Windows ones because the manylinux build is not statically linking those bits of freetype which we use. Is there a way to detect that I'm building under manylinux?

1. There is a tool called twine that is the best way to upload to PyPI; see the sketch after this list.
2. I'm not aware of any aggregate limits, but I'm pretty sure each individual file can only be so big.
3. Maybe the platform returns as manylinux1? Set an environment variable to ask for static linking, and check for it in your build script?
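For what it's worth, a minimal twine session looks something like this (assuming the tested wheels and sdist have all been collected into dist/):

    pip install twine
    twine upload dist/*

Unlike the distutils upload command, twine operates on already-built files, so the packages can be built and tested first and uploaded afterwards.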
On Mon, Jul 25, 2016 at 8:05 AM Robin Becker robin@reportlab.com wrote:
> I have started to make manylinux wheels for reportlab. [...]

On 25/07/2016 15:30, Daniel Holth wrote:
> There is a tool called twine that is the best way to upload to PyPI
Thanks, I'll check that out.
> I'm not aware of any aggregate limits, but I'm pretty sure each individual file can only be so big
In our private read-only PyPI we have 93 releases. I don't think that burden should fall on PyPI. However, it's not clear to me whether I should push micro releases to PyPI and then remove them when another release is made. Is there a way to remove a 'release' completely? The edit pages seem to suggest so, but does that remove the files?
> Maybe the platform returns as manylinux1? Set an environment variable to ask for static linking, and check for it in your build script?
I did try manylinux1 (after PEP 513), but it didn't seem to work; I looked at sys.platform, os.name and the platform module, but see only this:

    platform = Linux-3.16.0-50-generic-x86_64-with-redhat-5.11-Final
    sys.platform = linux2
    os.name = posix

However, it's easy enough to export an environment variable in the docker startup script.
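As a sketch, assuming the startup script does export BUILD_STATIC_FREETYPE=1 before running the build (the variable name and the library path here are made up for illustration), setup.py can then check:

    import os

    # Hypothetical flag exported by the manylinux docker startup script.
    if os.environ.get('BUILD_STATIC_FREETYPE') == '1':
        # Pull the static archive straight into the extension
        # instead of linking against the shared library.
        extra_objects = ['/usr/local/lib/libfreetype.a']
        libraries = []
    else:
        extra_objects = []
        libraries = ['freetype']

    # ... then pass extra_objects= and libraries= to the Extension()
    # that needs freetype.

extra_objects is a standard distutils Extension argument, so nothing changes on the Windows side.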
I did try to reduce my manylinux sizes by using a library of shared object code (i.e. a .a archive built from the PIC-compiled objects), but I didn't seem able to make this work properly; the resulting .so seems to contain the whole library (freetype). The Windows linker seems able to pick up only the required bits, so the Windows wheels are much smaller.
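In case it helps, the usual toolchain trick for getting that Windows-like behaviour out of GNU ld (a general technique, not something verified against the reportlab build, and the extension name below is illustrative) is to give every function its own section and let the linker garbage-collect:

    # build freetype with one section per function/data item
    CFLAGS="-ffunction-sections -fdata-sections" ./configure
    make

    # link the extension, discarding unreferenced sections; --exclude-libs
    # keeps the archive's symbols out of the .so's dynamic symbol table,
    # without which --gc-sections cannot drop them
    cc -shared -o _renderPM.so *.o libfreetype.a \
        -Wl,--gc-sections -Wl,--exclude-libs,ALL

Whether this shrinks things much depends on how much of freetype the extension actually touches.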

On Mon, Jul 25, 2016 at 8:55 AM, Robin Becker robin@reportlab.com wrote:
> In our private read-only PyPI we have 93 releases. I don't think that burden should fall on PyPI. However, it's not clear to me whether I should push micro releases to PyPI and then remove them when another release is made. Is there a way to remove a 'release' completely?
I'm pretty sure there is no way to remove a release (at least not routinely). This is by design -- if someone has done something with that particular release, we want it to be reproducible.
I see the point, but it's a little bit too bad -- I know I've got some releases up there that were replaced VERY soon due to a build error or some carelessness on my part :-)
Apparently, disk space is cheap enough that PyPI doesn't need to worry about it.
Are you running into any problems?
> I did try to reduce my manylinux sizes by using a library of shared object code (i.e. a .a archive built from the PIC-compiled objects), but I didn't seem able to make this work properly; the resulting .so seems to contain the whole library (freetype).
Is this a problem other than file sizes? I think until (or if) Nathaniel (or someone :-) ) comes up with a standard way to make wheels of shared libs, we'll simply have to live with large binaries.
-CHB

On Jul 25, 2016, at 3:05 PM, Chris Barker chris.barker@noaa.gov wrote:
> On Mon, Jul 25, 2016 at 8:55 AM, Robin Becker robin@reportlab.com wrote:
>> [...] Is there a way to remove a 'release' completely?
> I'm pretty sure there is no way to remove a release (at least not routinely). This is by design -- if someone has done something with that particular release, we want it to be reproducible.
Authors can delete files, releases, or projects but can never re-upload an already uploaded file, even if they delete it. It is discouraged to actually do this though (and in the future we may change it to a soft delete that just hides it from everything with the ability to restore it). It is discouraged for basically the reason you mentioned, people pin to specific versions (and sometimes specific hashes) and we don’t want to break their deployments.
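For illustration, pinning looks like this in a requirements file (the version and hash here are placeholders, not real values):

    # requirements.txt
    reportlab==3.3.0 --hash=sha256:<hash of the uploaded wheel>

    # installed in pip's hash-checking mode:
    pip install --require-hashes -r requirements.txt

If the pinned file ever vanished from PyPI, that deployment would break, which is exactly what the policy is trying to avoid.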
> I see the point, but it's a little bit too bad -- I know I've got some releases up there that were replaced VERY soon due to a build error or some carelessness on my part :-)
> Apparently, disk space is cheap enough that PyPI doesn't need to worry about it.
Disk space is super cheap. We’re currently using Amazon S3 to store our files, and the storage portion of our “bill” there is something like $10/month for all of PyPI (out of a total “cost” of ~$35,000/month). Almost all of our “cost” for PyPI as a whole comes from bandwidth used, not from storage.
— Donald Stufft

On Tue, Jul 26, 2016 at 6:16 AM, Donald Stufft donald@stufft.io wrote:
> Disk space is super cheap. We’re currently using Amazon S3 to store our files, and the storage portion of our “bill” there is something like $10/month for all of PyPI (out of a total “cost” of ~$35,000/month). Almost all of our “cost” for PyPI as a whole comes from bandwidth used, not from storage.
Does anyone mirror all of PyPI? If so, "storage" suddenly also means "bandwidth".
ChrisA

On Jul 25, 2016, at 5:57 PM, Chris Angelico rosuav@gmail.com wrote:
> Does anyone mirror all of PyPI? If so, "storage" suddenly also means "bandwidth".
Yes, folks do mirror all of PyPI, but it's not as simple as storage == bandwidth. The price of the bandwidth is generally paid when a file is first uploaded (each existing mirror fetches it once), so deleting a file doesn't reduce the bandwidth demands of existing mirrors. It *does* increase the bandwidth demands of a brand new mirror, but a single full mirror represents 0.089% of the total monthly bandwidth of PyPI, and there is no indication that significant numbers of new mirrors are being added regularly enough for it to matter.
— Donald Stufft

On Tue, Jul 26, 2016 at 8:03 AM, Donald Stufft donald@stufft.io wrote:
> Yes, folks do mirror all of PyPI, but it's not as simple as storage == bandwidth. The price of the bandwidth is generally paid when a file is first uploaded, so deleting a file doesn't reduce the bandwidth demands of existing mirrors. It *does* increase the bandwidth demands of a brand new mirror, but a single full mirror represents 0.089% of the total monthly bandwidth of PyPI, and there is no indication that significant numbers of new mirrors are being added regularly enough for it to matter.
Good stats, thanks.
ChrisA
participants (5)
- Chris Angelico
- Chris Barker
- Daniel Holth
- Donald Stufft
- Robin Becker