I've got two projects: mynamespace.myprojectA and mynamespace.myprojectB
myprojectB depends on myprojectA. I'm using setuptools 0.6c8 to manage both
projects.
Both projects are registered using 'setup.py develop'. Both projects are
accessible from an interactive interpreter:
PS C:\Users\me\projects> python
Python 2.5.2 (r252:60911, Feb 21 2008, 13:11:45) [MSC v.1310 32 bit (Intel)]
on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import mynamespace.myprojectA
>>> import mynamespace.myprojectB
>>> from mynamespace.myprojectA import mymoduleZ
However, when I run 'setup.py test' in myprojectB, the tests fail with:
File ".mymoduleZ.py", line NNN, in [some context]
from mynamespace.myprojectA.mymoduleZ import MyClassC
ImportError: No module named myprojectA.mymoduleZ
In setup.py, the test_suite is nose.collector.
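For reference, the layout follows the usual setuptools namespace-package
recipe; a rough sketch (version numbers and module names are placeholders):

# mynamespace/__init__.py (in both projects)
__import__('pkg_resources').declare_namespace(__name__)

# setup.py for mynamespace.myprojectB (myprojectA's is analogous)
from setuptools import setup, find_packages

setup(
    name='mynamespace.myprojectB',
    version='1.0',
    packages=find_packages(),
    namespace_packages=['mynamespace'],
    install_requires=['mynamespace.myprojectA'],
    test_suite='nose.collector',
)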
I searched and couldn't find anyone else with this problem. Is this a
supported configuration? Is there something I can do to make tests work
with interdependent projects with the same root namespace?
If there's not something obvious I should be doing differently, I'm happy to
put together a minimal test case that reproduces the problem. Any
suggestions are appreciated.
Sincerely,
Jason R. Coombs
Hello everyone,
I am a research programmer at the NYU School of Engineering. My colleagues
(Trishank Kuppusamy and Justin Cappos) and I are requesting community
feedback on our proposal, "Surviving a Compromise of PyPI." The two-stage
proposal can be reviewed online at:
PEP 458
http://legacy.python.org/dev/peps/pep-0458/
PEP 480
http://legacy.python.org/dev/peps/pep-0480/
Summary of the Proposal:
"Surviving a Compromise of PyPI" proposes how the Python Package Index
(PyPI) can be amended to better protect end users from altered or malicious
packages, and to minimize the extent of PyPI compromises against affected
users. The proposed integration allows package managers such as pip to be
more secure against various types of security attacks on PyPI and defend
end users from attackers responding to package requests. Specifically,
these PEPs describe how PyPI processes should be adapted to generate and
incorporate repository metadata, which are signed text files that describe
the packages and metadata available on PyPI. Package managers request
(along with the packages) the metadata on PyPI to verify the authenticity
of packages before they are installed. The changes to PyPI and tools will
be minimal by leveraging a library, The Update Framework
<https://github.com/theupdateframework/tuf>, that generates and
transparently validates the relevant metadata.
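As a purely illustrative sketch (this is not the TUF API, and the metadata
layout shown here is simplified), the client-side check amounts to comparing
a downloaded package against the hashes recorded in the signed repository
metadata that was fetched alongside it:

# Illustrative only: real clients delegate this check (plus signature
# verification and rollback protection) to The Update Framework.
import hashlib
import json

def package_matches_metadata(package_bytes, metadata_json, filename):
    """Return True if the package's SHA-256 digest matches the digest
    recorded for `filename` in already-verified repository metadata."""
    metadata = json.loads(metadata_json)
    expected = metadata['targets'][filename]['hashes']['sha256']
    return hashlib.sha256(package_bytes).hexdigest() == expected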
The first stage of the proposal (PEP 458
<http://legacy.python.org/dev/peps/pep-0458/>) uses a basic security model
that supports verification of PyPI packages signed with cryptographic keys
stored on PyPI, requires no action from developers and end users, and
protects against malicious CDNs and public mirrors. To support continuous
delivery of uploaded packages, PyPI administrators sign for uploaded
packages with an online key stored on PyPI infrastructure. This level of
security prevents packages from being accidentally or deliberately tampered
with by a mirror or a CDN because the mirror or CDN will not have any of
the keys required to sign for projects.
The second stage of the proposal (PEP 480
<http://legacy.python.org/dev/peps/pep-0480/>) is an extension to the basic
security model (discussed in PEP 458) that supports end-to-end verification
of signed packages. End-to-end signing allows both PyPI and developers to
sign for the packages that are downloaded by end users. If the PyPI
infrastructure were to be compromised, attackers would be unable to serve
malicious versions of these packages without access to the project's
developer key. As in PEP 458, no additional action is required by end
users. However, PyPI administrators will need to periodically (perhaps
every few months) sign metadata with an offline key. PEP 480 also proposes
an easy-to-use key management solution for developers, how to interface
with a potential build farm on PyPI infrastructure, and discusses the
security benefits of end-to-end signing. The second stage of the proposal
simultaneously supports real-time project registration and developer
signatures, and when configured to maximize security on PyPI, less than 1%
of end users will be at risk even if an attacker controls PyPI and goes
undetected for a month.
We thank Nick Coghlan and Donald Stufft for their valuable contributions,
and Giovanni Bajo and Anatoly Techtonik for their feedback.
Thanks,
PEP 458 & 480 authors.
Is it currently possible to upgrade dependencies as well when
upgrading a package? If not, this would be a really nice feature to
add to easy_install. Maybe a call like:
easy_install --upgrade --upgrade-deps Package
Obviously it makes sense to leave this off by default.
Thanks,
Charlie
Elvelind Grandin reported a problem with the "develop" command that turned
out to be a flaw in its --find-links support. Specifically, it wasn't ever
processing the links. :)
In the process of fixing it, I wound up cleaning up an annoying (to me, at
least) quirk of the previous workings of --find-links. It used to be that
find-links would always be processed first, no matter what, even if you
were doing a completely local operation. This would've been especially
annoying if it carried over into "develop", so I made some changes.
Now, if an item passed to --find-links is local (a filename or file: URL),
or a direct link to an egg or other distribution, it is indexed
immediately. Remote URLs are now only retrieved if a dependency can't be
resolved locally, or if you use the -U or --upgrade options (this goes for
"develop" too).
Note that this is a behavior change for easy_install, which was effectively
treating --find-links as though you'd specified --upgrade in certain
cases. So, if you're used to getting upgrades downloaded as a result of
using --find-links, please note that this will no longer
happen. EasyInstall will now *only* go online if a dependency can't be
resolved locally, if -U or --upgrade is used, or if you provided suitable
direct URLs via an argument or --find-links, or via a link in a local .html
file.
Hi,
I'm trying to do "easy_install setuptools==dev" on a Windows box, but I get
an error from the svn command:
"""... 'svn' is not recognized as an internal or external command, ..."""
I have TortoiseSVN installed, but it doesn't seem to include a command-line
client; everything is done through Explorer.
Can you tell me which svn client I can use on Windows?
Thanks for all,
Helio Pereira
Sharky @ PTNet
I'm new to distutils/and the egg system. I have not researched the
archives as much as I should, so please excuse me if this is familiar
territory...
Q: Is it possible for dependency checks to work for eggs not deployed on PyPI?
- A.egg
- B.egg (setup_requires and install_requires A.egg)
- email A and B eggs to friend
- tell friend to: easy_install B
- but B.egg does not install A... in fact, in my test case, it fails because
it cannot find A on PyPI
- in this example, what happens if my friend does not have a net connection
on the target machine? How can they install from the eggs alone and get the
dep checking to work?
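For reference, B's setup.py looks roughly like this (names, version, and
module layout are placeholders):

from setuptools import setup

setup(
    name='B',
    version='0.1',
    py_modules=['b'],
    setup_requires=['A'],
    install_requires=['A'],
)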
Q: How does my friend run the unittests on the egg that they have just
installed? As I am developing, I frequently run the unittests, and after
every bdist_egg build I run the unit tests that way as well. How can the
consumer do the same?
Q: One of my apps is a command-line app: say foo.py takes arg1, arg2, ...
How can the user run this app when it is encased in an egg?
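If it helps, foo.py is currently just an ordinary script along these lines
(simplified; the real one does more, of course):

# foo.py -- simplified sketch of the real command line app
import sys

def main(argv=None):
    if argv is None:
        argv = sys.argv[1:]
    arg1, arg2 = argv[0], argv[1]
    print "doing something with", arg1, arg2   # placeholder for the real work

if __name__ == '__main__':
    main()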
Q: How do I set up my own PyPI? While I like the idea of deploying there
for the overall community, I don't want to send my half baked code into
the wild. But I'd like to try the general functionality of having a
deploy site for my beta customers. How can I set this up?
Thanks in advance.
-Todd
Is it possible for a package to depend on one of several packages, with
the user having the option to pick? For example, my package P might
depend on package A, plus either package B or package C.
I know I could (ab)use extras_require, but in my case this is a real
dependency rather than an optional extra. The other option I have is to
just pick one of the dependencies (B or C in my example above), and
thereby force people to install it even though it's not strictly required.
Neither option seems particularly attractive.
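For concreteness, the extras_require (ab)use I have in mind would look
something like this (all names made up):

from setuptools import setup

setup(
    name='P',
    version='1.0',
    packages=['p'],
    install_requires=['A'],       # always required
    extras_require={
        'backend-b': ['B'],       # the user would pick one of these,
        'backend-c': ['C'],       # e.g. easy_install "P[backend-b]"
    },
)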
John
The comma removed by the patch below implies that users working from a
Subversion checkout should not use EasyInstall to periodically fetch the
latest version. That's not what you intended to say, I assume?
http://peak.telecommunity.com/DevCenter/setuptools
You should also inform your users of the need to run this command, if they
-are working from a Subversion checkout, rather than using EasyInstall to
+are working from a Subversion checkout rather than using EasyInstall to
periodically fetch the latest version.
Hmm, probably much better than the patch above:
-You should also inform your users of the need to run this command, if they
-are working from a Subversion checkout, rather than using EasyInstall to
-periodically fetch the latest version.
+If your users are working from a Subversion checkout rather than using
+EasyInstall, you should also inform them of the need to run this command
+to periodically fetch the latest version.
John
At 05:43 AM 5/18/2004, David Handy wrote:
>On Tue, 18 May 2004, Floris van der Tak wrote:
>
>> my attempt to install the distutils on suse 9.0 (which provides python
>> 2.3) fails:
>
>Why are you installing distutils on a system that already has Python 2.3
>installed? Python 2.3 normally comes with distutils as part of the Python
>standard library.
>
>Did suse distribute a version of Python that had distutils removed?
It should come from the python-devel package.
Hi again --
[cc'd to Paul Dubois: you said you weren't following the distutils sig
anymore, but this directly concerns NumPy and I'd like to get your
input!]
here's that sample setup.py for NumPy. See below for discussion (and
questions!).
------------------------------------------------------------------------
#!/usr/bin/env python
# Setup script example for building the Numeric extension to Python.
# This does successfully compile all the .dlls. Nothing happens
# with the .py files currently.
# Move this file to the Numerical directory of the LLNL numpy
# distribution and run as:
# python numpysetup.py --verbose build_ext
#
# created 1999/08 Perry Stoll
__rcsid__ = "$Id: numpysetup.py,v 1.1 1999/09/12 20:42:48 gward Exp $"
from distutils.core import setup
setup (name = "numerical",
version = "0.01",
description = "Numerical Extension to Python",
url = "http://www.python.org/sigs/matrix-sig/",
ext_modules = [ ( '_numpy', { 'sources' : [ 'Src/_numpymodule.c',
'Src/arrayobject.c',
'Src/ufuncobject.c'
],
'include_dirs' : ['./Include'],
'def_file' : 'Src/numpy.def' }
),
( 'multiarray', { 'sources' : ['Src/multiarraymodule.c'],
'include_dirs' : ['./Include'],
'def_file': 'Src/multiarray.def'
}
),
( 'umath', { 'sources': ['Src/umathmodule.c'],
'include_dirs' : ['./Include'],
'def_file' : 'Src/umath.def' }
),
( 'fftpack', { 'sources': ['Src/fftpackmodule.c', 'Src/fftpack.c'],
'include_dirs' : ['./Include'],
'def_file' : 'Src/fftpack.def' }
),
( 'lapack_lite', { 'sources' : [ 'Src/lapack_litemodule.c',
'Src/dlapack_lite.c',
'Src/zlapack_lite.c',
'Src/blas_lite.c',
'Src/f2c_lite.c'
],
'include_dirs' : ['./Include'],
'def_file' : 'Src/lapack_lite.def' }
),
( 'ranlib', { 'sources': ['Src/ranlibmodule.c',
'Src/ranlib.c',
'Src/com.c',
'Src/linpack.c',
],
'include_dirs' : ['./Include'],
'def_file' : 'Src/ranlib.def' }
),
]
)
------------------------------------------------------------------------
First, what d'you think? Too clunky and verbose? Too much information
for each extension? I kind of think so, but I'm not sure how to reduce
it elegantly. Right now, the internal data structures needed to compile
a module are pretty obviously exposed: is this a good thing? Or should
there be some more compact form for setup.py that will be expanded later
into the full glory we see above?
I've already made one small step towards reducing the amount of cruft by
factoring 'include_dirs' out and supplying it directly as a parameter to
'setup()'. (But that needs code not in the CVS archive yet, so I've
left the sample setup.py the same for now.)
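In other words, something like this (just a sketch of the intended compact
form; the exact spelling may still change):

setup (name = "numerical",
       version = "0.01",
       include_dirs = ['./Include'],    # factored out of each extension
       ext_modules = [ ( '_numpy', { 'sources' : [ 'Src/_numpymodule.c',
                                                   'Src/arrayobject.c',
                                                   'Src/ufuncobject.c' ],
                                     'def_file' : 'Src/numpy.def' } ),
                       # ...and so on for the other extensions...
                     ]
       )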
The next thing I'd like to do is get that damn "def_file" out of there.
To support it in MSVCCompiler, there's already an ugly hack that
unnecessarily affects both the UnixCCompiler and CCompiler classes, and
I want to get rid of that. (I refer to passing the 'build_info'
dictionary into the compiler classes, if you're familiar with the code
-- that dictionary is part of the Distutils extension-building system,
and should not propagate into the more general compiler classes.)
But I don't want to give these weird "def file" things standing on the
order of source files, object files, libraries, etc., because they seem
to me to be a bizarre artifact of one particular compiler, rather than
something present in a wide range of C/C++ compilers.
Based on the NumPy model, it seems like there's a not-too-kludgy way to
handle this problem. Namely:
    if building extension "foo":
        if file "foo.def" found in same directory as "foo.c":
            add "/def:foo.def" to MSVC command line
this will of course require some platform-specific code in the build_ext
command class, but I figured that was coming eventually, so why put it
off? ;-)
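In Python terms, the check might look roughly like this inside build_ext
(hypothetical sketch, not code that exists in CVS):

import os

def msvc_def_file_option(ext_name, sources):
    # If a "<ext_name>.def" file sits next to the extension's first source
    # file, return the extra MSVC linker option for it; otherwise nothing.
    if not sources:
        return []
    def_file = os.path.join(os.path.dirname(sources[0]), ext_name + '.def')
    if os.path.isfile(def_file):
        return ['/def:' + def_file]
    return []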
To make this hack work with NumPy, one change would be necessary: rename
Src/numpy.def to Src/_numpy.def to match Src/_numpy.c, which implements
the _numpy module. Would this be too much to ask of NumPy? (Paul?)
What about other module distributions that support MSVC++ and thus ship
with "def" files? Could they be made to accomodate this scheme?
Thanks for your feedback --
Greg
--
Greg Ward - software developer gward(a)cnri.reston.va.us
Corporation for National Research Initiatives
1895 Preston White Drive voice: +1-703-620-8990
Reston, Virginia, USA 20191-5434 fax: +1-703-620-0913