We use Gunicorn as our web server, and we rely on it for zero-downtime
deployments. The way it does this is by having a master process that forks
children. This works perfectly when we deploy as editable installations
(pip install -e) but doesn't work when we install sdists.
The reason this doesn't work with sdists is that the .egg-info directory
is named after the version. For example, if we deploy AnWeb-1.0, it gives
a path like this: site-packages/AnWeb-1.0.egg-info
Then when we release AnWeb-1.5, it gives us: site-packages/AnWeb-1.5.egg-info
So when a new worker is forked, the master already has sys.path loaded and
it's going to check for site-packages/AnWeb-1.0.egg-info/entry_points.txt,
which no longer exists, so the workers die.
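The failure mode can be sketched as follows (the `entry_points_path` helper and the hard-coded layout are illustrative only, not how setuptools actually resolves metadata):

```python
def entry_points_path(site_packages, name, version):
    """Illustrative: an sdist install writes a version-named .egg-info directory."""
    return f"{site_packages}/{name}-{version}.egg-info/entry_points.txt"

# The master process resolved this path when it started under 1.0...
cached = entry_points_path("site-packages", "AnWeb", "1.0")
print(cached)  # site-packages/AnWeb-1.0.egg-info/entry_points.txt

# ...but after deploying 1.5 only the new directory exists on disk,
# so freshly forked workers fail when they read the cached path.
print(entry_points_path("site-packages", "AnWeb", "1.5"))
# site-packages/AnWeb-1.5.egg-info/entry_points.txt
```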
What I'm wondering is whether I can control this somehow to get a non-versioned
egg-info installed from an sdist. Or do you recommend always using editable installs?
I'm in the process of developing an automated solution to allow users
to quickly set up a Windows box so that it can be used to compile
Python extensions and build wheels. While it can obviously be used by
Windows developers who want to quickly set up a box, my main target is
Unix developers who want to provide wheels for Windows users.
To that end, I'd like to get an idea of what sort of access to Windows
a typical Unix developer would have. I'm particularly interested in
whether Windows XP/Vista is still in use, and whether you're likely to
already have Python and/or any development tools installed. Ideally, a
clean Windows 7 or later virtual machine is the best environment, but
I don't know if it's reasonable to assume that.
Another alternative is to have an Amazon EC2 AMI prebuilt, and users
can just create an instance based on it. That seems pretty easy to do
from my perspective but I don't know if the connectivity process
(remote desktop) is a problem for Unix developers.
Any feedback would be extremely useful. I'm at a point where I can
pretty easily set up any of these options, but if they don't turn out
to actually be usable by the target audience, it's a bit of a waste of effort.
By popular demand, the development version of wheel now includes a
generic py2-none-<platform> style tag for packages that include a binary
but don't need a particular Python or ABI - for example, cffi or ctypes
packages using dlopen. It comes just after the implementation-specific
tags. There is no flag to get bdist_wheel to tag packages this way; you
would have to rename them to get wheels bearing the new tag.
The complete list of supported tags on PyPy is currently:
[('pp27', 'none', 'linux_x86_64'),
('pp27', 'none', 'any'),
('pp2', 'none', 'any'),
('pp26', 'none', 'any'),
('pp25', 'none', 'any'),
('pp24', 'none', 'any'),
('pp23', 'none', 'any'),
('pp22', 'none', 'any'),
('pp21', 'none', 'any'),
('pp20', 'none', 'any'),
('py2', 'none', 'linux_x86_64'),
('py27', 'none', 'any'),
('py2', 'none', 'any'),
('py26', 'none', 'any'),
('py25', 'none', 'any'),
('py24', 'none', 'any'),
('py23', 'none', 'any'),
('py22', 'none', 'any'),
('py21', 'none', 'any'),
('py20', 'none', 'any')]
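Renaming is enough because pip derives a wheel's tags purely from the filename. A rough sketch of that parsing (simplified; the real implementation also handles build numbers and compressed tag sets):

```python
def wheel_tags(filename):
    """Split a wheel filename into its (python, abi, platform) tag triple."""
    stem = filename[: -len(".whl")]
    py, abi, plat = stem.split("-")[-3:]
    return (py, abi, plat)

# A CPython-specific wheel:
print(wheel_tags("cffi-1.0-cp27-cp27mu-linux_x86_64.whl"))
# ('cp27', 'cp27mu', 'linux_x86_64')

# Renamed to say "binary, but no particular Python or ABI required":
print(wheel_tags("cffi-1.0-py2-none-linux_x86_64.whl"))
# ('py2', 'none', 'linux_x86_64')
```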
I've also merged a contribution from Benedikt Morbach that sorts the
generated metadata to make it deterministic, and I have tagged some of
the previous releases. If there are no complaints I'll probably
release 0.25.0 in a few days.
Right now, PyPI provides MD5 hashes for packages, which pip uses to check
for corruption in transit. I'd like to propose we replace MD5 with SHA256 on
PyPI, and move to deprecate MD5 support in pip and setuptools.
Why should we do this? MD5 is broken. Collision resistance is totally 100%
uselessly busted, and pre-image resistance is mathematically broken; practical
attacks aren't known publicly, but it's reasonable to assume private attacks
are strong because (sing it with me): "Attacks only get better".
So MD5 doesn't provide the guarantees one might expect; SHA256 is not broken
in these ways. But MD5 is not just failing to provide value, it's actively
causing problems: some machines, such as those with packages compiled to meet
FIPS-140-2, do not have MD5 available at all, and so pip's verification raises
an exception.
While one might be inclined to find a way to silently support both machine
configurations, I'd like to instead say we should abhor any additional
configuration (whether user-supplied or auto-detected) and instead simply
upgrade the hashes offered by PyPI, and begin the deprecation process for MD5.
There are currently 60 packages on PyPI which are *not* hosted on PyPI, but
have MD5 hashes there. For these packages we could download the package, verify
the MD5 hash, and then upgrade what PyPI stores to SHA256.
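For reference, the SHA256 digest pip would compare against can be computed with nothing but the standard library (a sketch, not pip's actual code); unlike MD5, hashlib.sha256 is available even on FIPS-140-2 builds:

```python
import hashlib

def sha256_of(path):
    """Stream a file through SHA256 without loading it all into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()
```

The resulting hex digest would then be compared against the value PyPI publishes for the file.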
I hope some Windows expert can assist me with a production problem. We support a
user on Windows who reports problems concerning missing attributes. Using
GoToMeeting we inspected the file together and can see that the attribute should
be present. I asked them to zip up the reportlab folder from site-packages, and
the module in that zip does not have the attribute. Puzzlement!
Searching reveals this
> Due to security features introduced with Windows Vista (UAC) any
> non-Administrator program that tries to write to protected locations
> such as "Program Files" will get their writes caught and redirected to
> an alternative "user friendly" location.
> The program that made the file will be able to see the file, but most
> other programs will not.
> Files written to "protected locations" will end up in a parallel file
> structure under C:\Users\[username]\AppData\Local\VirtualStore, but
> will appear to the program that created them as if actually in the
> intended location.
so I'm wondering if this is such an issue. The user has a 4-year-old version of
reportlab and doesn't wish to upgrade. In the past, when they had XP, we used
to support minor fixes by modifying the modules and having them overwrite the
installed versions in site-packages.
Clearly, if some kind of security measure is in place that causes this kind of
double-file problem, then overwriting the Python source will not always work. In
addition, it may be that users can't write the .pyc files or something.
Does anyone here have experience of these issues? Will I be forced to maintain
patched installers etc.? Is there some trick, like having an administrator
write the files?
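If this is UAC virtualization, the redirected copy lands in a predictable place. A sketch of where to tell the user to look (the helper and the example paths are illustrative; it uses `ntpath` so the path arithmetic runs on any platform):

```python
import ntpath

def virtualstore_path(protected_path, local_appdata):
    """Map a write to a protected location onto its VirtualStore twin."""
    drive, rest = ntpath.splitdrive(protected_path)
    return ntpath.join(local_appdata, "VirtualStore") + rest

print(virtualstore_path(
    r"C:\Program Files\Python27\Lib\site-packages\reportlab\__init__.py",
    r"C:\Users\alice\AppData\Local",
))
# C:\Users\alice\AppData\Local\VirtualStore\Program Files\Python27\Lib\site-packages\reportlab\__init__.py
```

Comparing the file at that location against the one in site-packages would confirm whether the user's edits were silently redirected.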
Hi Donald, all,
I noticed that for several packages the daily download numbers are only a
tenth or so of what they used to be. This started somewhere between a couple
of days and a week ago. Any known reason?
New submission from Sorin Sbarnea:
easy_install --quiet --dry-run -U packname
would still output at least the:
This prevents us from properly running a quiet update from crontab, where we are supposed to have output only if there are errors.
title: easy_install --quiet --dry-run -U package is not really quiet
Setuptools tracker <setuptools(a)bugs.python.org>
I see that nowadays trying to access http://pypi.python.org (including
any URL beneath it) unconditionally redirects to
https://pypi.python.org. I'm trying to access it from a system which
doesn't have SSL support and cannot easily gain it (embedded, size
constraints). Is there any way to access PyPI metadata/tarballs using
plain HTTP?
[redirecting to list]
On 11/05/2014 10:30 AM, Ian Cordasco wrote:
> On Nov 5, 2014 12:24 PM, "Ethan Furman" wrote:
>> I have four packages on PyPI: antipathy, dbf, pandaemonium, and scription.
>> `pip install --upgrade` works for three of them, but for scription it continually grabs an older release (0.53, I think).
>> Any ideas what might be wrong?
> Is the newest version a pre-release?
No, but that was the clue I needed, thanks!
0.53.0 is a higher version than 0.7.0. Doh.
Changing that to 0.70.1 now...
Thanks for the help!
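The gotcha is that dotted versions compare component-by-component as integers, not as decimal fractions; a minimal sketch (simplified, with no pre-release or letter-suffix handling):

```python
def version_key(version):
    """Turn '0.53.0' into (0, 53, 0) for component-wise integer comparison."""
    return tuple(int(part) for part in version.split("."))

assert version_key("0.53.0") > version_key("0.7.0")   # 53 > 7, so pip prefers 0.53.0
assert version_key("0.70.1") > version_key("0.53.0")  # the renumbering fixes the ordering
```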