I have four packages on PyPI: antipathy, dbf, pandaemonium, and scription.
`pip install --upgrade` works for three of them, but for scription it continually grabs an older release (0.53, I think).
Any ideas what might be wrong?
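One thing worth checking (just a guess): whether the version numbers themselves compare the way you expect. pip ranks releases by parsed version, not by string, so a release named 0.53 outranks anything in the 0.9.x line. A minimal sketch of that release-segment comparison (the helper function is mine, not pip's):

```python
def release_tuple(version):
    # Compare the dotted release segment numerically, the way PEP 440
    # version ordering does -- not as strings.
    return tuple(int(part) for part in version.split("."))

# 53 > 9 as integers, so 0.53 is the "newest" release here:
assert release_tuple("0.53") > release_tuple("0.9.1")
```

If scription's later uploads used a shorter numbering scheme, pip would keep picking 0.53 as the highest version.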
I just saw another topic posted, which raises a question similar to one
I've had in mind for some time. To avoid hijacking that thread, I'm
opening a new one, but will comment on a message from it.
So, my case: supporting different platforms in one distribution package.
To give a more concrete example, suppose a module implemented using
ctypes, with a completely different implementation for Linux, MacOSX, and
Windows. I'd also like to avoid installing files unneeded on a
particular platform (a use case that applies to MicroPython,
http://micropython.org/ , where there may simply not be enough storage
space to install cruft).
On Mon, 27 Oct 2014 14:04:38 +0000
Paul Moore <p.f.moore@...> wrote:
> For a source distribution, you could play clever games in setup.py to
> put the right file in place, with the right name. But that's messy and
> it means that if you distribute wheels (not that there's much point in
> doing so) you need separate wheels for 2.6-, 2.7 and 3.3+.
Ok, so are there guidelines, best practices or at least example(s) how
to do that? I pretty much would like to avoid inventing my own "clever
games" to achieve that.
> Alternatively, you could distribute all 3 files, as
> - __init__.py
> - dbf_26.py
> - dbf_27.py
> - dbf_3.py
> Then in __init__.py do
> if sys.version_info[0] == 3:
>     from .dbf_3 import *
> elif sys.version_info[:2] == (2, 7):
>     from .dbf_27 import *
> else:
>     from .dbf_26 import *
For our MicroPython case, we would like to avoid this for the
aforementioned reasons. We could probably post-process the installation
dir after pip runs to remove unneeded files, but that sounds like a
hack - we'd rather do it fully at the distribution-package level,
aiming to install just a single source file per module and to avoid a
dispatcher like the above (again, for efficiency reasons).
There's also another issue - of supporting "cross-installs". Not all
MicroPython targets support running pip natively, so it instead runs on
a host computer, but would need to be instructed to select source
variants for a particular target platform.
Any hints/pointers on how to achieve this - preferably in a "standard"
way - are appreciated!
Is there any reason why someone would enclose the values of a setup()
argument in setup.py in a nested list? Why do I ask?
Recently I received a bug report which is affected by this, and before
I proceed to fix it I would like to know why someone would do that and
what perks it brings (my plan is to flatten this):
    'console_scripts': [
        ['pkginfo = pkginfo.commandline:main']
    ]
I couldn't find anything about that in the setuptools docs, so I hope
it's ok to ask here.
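For comparison, the flat form the setuptools documentation shows for console_scripts - each entry is a single "name = module:attr" string, with no inner list:

```python
# Flat entry_points as setuptools documents them: a list of strings.
entry_points = {
    "console_scripts": [
        "pkginfo = pkginfo.commandline:main",
    ],
}

# Each entry splits into a script name and an import target:
name, target = (s.strip() for s in
                entry_points["console_scripts"][0].split("=", 1))
```

I don't see anything in the docs that promises the nested-list form works, so flattening seems safe.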
Since buildout 2.2.3 is out, I've been getting error mails from quite a
few Jenkins jobs. Those jobs always run "python bootstrap.py" followed
by "bin/buildout". We always pin buildout, so most of them are at
version 2.2.1.
$ python bootstrap.py
Setting socket time out to 1 seconds.
Setting socket time out to 1 seconds.
Error: There is a version conflict.
We already have: zc.buildout 2.2.3
"python bootstrap.py" generates a bin/buildout with a 2.2.3 version in
it. And running bin/buildout doesn't want to downgrade to the specified
version.
The bootstrap.py I'm using is
I've seen some other problems with versions when there's a globally
installed version already available at a higher version. A colleague
installed ansible globally on the jenkins machine (...) so there's a new
jinja2 in usr/local/lib/pythonsomething. The jinja2 in buildout is
pinned to a lower version, but buildout refuses to downgrade it and
fails with an "I already have that new version".
So... I haven't totally tracked it down yet, but I'm throwing it out on
the list here in case others have seen it too.
I'll continue debugging after lunch :-)
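(For context, by "pin buildout" I mean the usual versions section in buildout.cfg - a minimal sketch:)

```
[buildout]
versions = versions

[versions]
zc.buildout = 2.2.1
```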
Reinout van Rees http://reinout.vanrees.org/
"Learning history by destroying artifacts is a time-honored atrocity"