I know it's been debated here whether there should be some kind of filtering
on uploaded packages on PyPI, but today someone, either an automated tool or a
silly guy, started to upload dozens of "Xxx 0.1.0" packages, where "Xxx" is
some surname; here is the latest variant: https://pypi.python.org/pypi/Lykov/0.1.0
Is there something that can/should be done to stop it?
nickname: Lele Gaifax | "When I start living off what I thought yesterday,
real: Emanuele Gaifas | I will begin to fear those who copy me."
lele(a)metapensiero.it | -- Fortunato Depero, 1929.
They host projects on GitHub and Bitbucket, and discuss issues on the pypa-dev and distutils-sig mailing lists.
I don't know where to go.
... Choices .... too many choices .....
... giving me the feeling of uncertainty. I am feeling fear.
I feel stupid and am missing a guiding hand to give me simple, straightforward, step-by-step instructions.
Please help me.
I am looking for feedback for my personal programming guidelines:
Dear Nick and other distutils listeners,
Nick wrote this about seven months ago:
I love Python and I use it daily.
On the other hand there are other interesting programming languages out there.
Why not do "thinking in sets" here and see Python as just one item in a list of languages?
Let's dream: all languages should be supported by the best-ever packaging solution of the future.
What do you think?
Thomas Guettler http://www.thomas-guettler.de/
Following up on some IRC discussion with other folks:
There is precedent (Plone) for PyPI trove classifiers corresponding to
particular versions of a framework. So I'd like to get feedback on the idea
of expanding that, particularly in the case of Django.
The rationale here is that the ecosystem of Django-related packages is
quite large, but -- as I know all too well from a project I'm working on
literally at this moment -- it can be difficult to ensure that all of one's
dependencies are compatible with the version of Django one happens to be using.
Adding trove classifier support at the level of individual versions of
Django would, I think, greatly simplify this: tools could easily analyze
which packages are compatible with an end user's chosen version, there'd be
far less manual guesswork, etc., and the rate of creation of new
classifiers would be relatively low (we tend to have one X.Y release/year
or thereabouts, and that's the level of granularity needed).
Assuming there's consensus around the idea of doing this, what would be the
correct procedure for getting such classifiers set up and maintained?
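To make the idea concrete, here is a small sketch of how such classifiers might be declared and consumed. The "Framework :: Django :: X.Y" strings below are hypothetical, modeled on the existing per-version Plone classifiers; nothing like them is registered yet:

```python
# Hypothetical per-version Django trove classifiers, modeled on the
# "Framework :: Plone :: X.Y" precedent. The exact strings are an
# assumption; they only illustrate the proposal.
classifiers = [
    "Framework :: Django",
    "Framework :: Django :: 1.6",
    "Framework :: Django :: 1.7",
]

def supports(classifiers, django_version):
    """Check whether a package declares support for a given Django X.Y release."""
    return "Framework :: Django :: %s" % django_version in classifiers

print(supports(classifiers, "1.7"))  # True
print(supports(classifiers, "1.8"))  # False
```

With classifiers at that granularity, a tool could filter PyPI search results down to packages declaring support for the end user's chosen release, rather than relying on manual guesswork.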
Today, I ran into trouble working with an old project that had six pinned
to version 1.1.0. The install failed: buildout tried to install it as
1.10.0, which failed because 1.10.0 was already installed.
The problem arose because six's setup.py imports setuptools and then
imports six to get __version__. When Buildout runs a setup script, it puts
its own path ahead of the distribution's, so the setup script would get
whatever version buildout was running. IMO, this is a six bug, but wait:
I tried installing a pinned version with pip, using ``pip install -U
six==1.9.0``. This worked. I then tried with version 1.1.0, and this
failed, because setuptools wouldn't work with 1.1.0. Pip puts the
distribution ahead of its own path when running a setup script. setuptools
requires six >= 1.6, so pip can't be used to install pinned versions (in
requirements.txt) earlier than 1.6. Six is a wildly popular package and has
been around for a long time. Earlier pins are likely.
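The failure mode here is just Python's import path ordering. A minimal, self-contained demonstration (using a throwaway module name rather than six itself, and assumed version numbers) of how the path entry that comes first decides which __version__ a setup script sees:

```python
# Demonstrates how sys.path order decides which copy of a module a
# setup script imports. Uses a throwaway module name; the mechanism is
# the same one that bit six's setup.py, which imports six to read
# __version__.
import os
import sys
import tempfile

tool_dir = tempfile.mkdtemp()   # stands in for the tool's own path (buildout)
dist_dir = tempfile.mkdtemp()   # stands in for the distribution being installed

with open(os.path.join(tool_dir, "demo_pkg.py"), "w") as f:
    f.write("__version__ = '1.10.0'\n")  # version the tool happens to ship
with open(os.path.join(dist_dir, "demo_pkg.py"), "w") as f:
    f.write("__version__ = '1.1.0'\n")   # version actually being installed

# Buildout-style ordering: the tool's path comes first, so the setup
# script sees the tool's copy, not the distribution it is installing.
sys.path.insert(0, dist_dir)
sys.path.insert(0, tool_dir)
import demo_pkg
print(demo_pkg.__version__)  # 1.10.0 -- the wrong version gets picked up
```

With pip-style ordering (the distribution's path first) the import would find 1.1.0 instead; either ordering loses in some scenario, which is exactly the clash being described.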
I raise this here in the broader context of managing clashes between
setuptools requirements and requirements of libraries (and applications
using them) it is installing. I think Buildout's approach of putting its
path first is better, although it was more painful in this instance.
I look forward to a time when we don't run scripts at install time (or are
at least wildly less likely to).
Buildout is growing wheel support. It should have provided a workaround, but:
- I happened to be trying to install a 1.1 pin, and the earliest six
wheel is for 1..
- I tried installing six 1.8. Buildout's wheel extension depended on
pip, which depends on setuptools and six. When buildout tries to load the
extension, it tries to get the extension's dependencies, which include six,
while honoring the version pin. That means it has to install six before it
has wheel support. Obviously, this is Buildout's problem, but it
illustrates the complexity that arises when packaging dependencies overlap
dependencies of packages being managed.
IDK what the answer is. I'm just (re-)raising the issue and providing a
data point. I suspect that packaging tools should manage their own
dependencies independently. That's what was happening until recently IIUC
for the pypa tools through vendoring. I didn't like vendoring, but I'm
starting to see the wisdom of it. :)