PEP 426 addresses build requirements for distributions of Python code,
but doesn't directly help with development environments.
It seems to me that if we help development environments, that would be
nice - and any explicit metadata there can obviously be reflected into
PEP-426 data in future.
For context, the main use I have for setup_requires these days is
projects with a version contained within the project, and for the use
of pbr in openstack (and some other git hosted) projects.
Consider e.g. unittest2, which has its version information in one
place inside the package; but setup imports unittest2 to get at that,
so all the dependencies become setup_requires entries :(. I may change
that to the exec approach Donald suggested on IRC [I'd been pondering
something similar for a bit - but was thinking of putting e.g. a JSON
file in the package and then reading that for version data].
testtools has a similar bunch of logic in setup.py.
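A sketch of the exec idea mentioned above (the helper name and file layout are illustrative, not anyone's actual code): the version lives in one small module that setup.py executes directly, so importing the whole package - and pulling in its dependencies - is never needed:

```python
# Illustrative sketch: extract __version__ from a single file without
# importing the package, so setup.py needs none of its dependencies.
def read_version(path):
    namespace = {}
    with open(path) as f:
        exec(compile(f.read(), path, "exec"), namespace)
    return namespace["__version__"]
```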
The openstack projects have a nice solution I think, which is that
they write the egg metadata file and then read that back - both at
runtime via pbr helpers and at build time when pbr takes over the
build.
The problem with that, of course, is that pbr then becomes a
setup_requires itself.
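At runtime, the pbr-style read-back amounts to querying the installed distribution metadata. A self-contained sketch of the pattern (pbr itself goes through pkg_resources; the stdlib importlib.metadata is used here as the modern equivalent so the example stands alone):

```python
# Sketch of the "read the version back from installed metadata" pattern.
# pbr does this via pkg_resources; importlib.metadata is the modern
# stdlib equivalent, used here for a self-contained illustration.
from importlib.metadata import version, PackageNotFoundError

def runtime_version(dist_name, fallback="0.0.0"):
    try:
        return version(dist_name)
    except PackageNotFoundError:
        # e.g. running from an unbuilt source checkout
        return fallback
```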
So, I'm wondering if we can do something fairly modest to make
setup_requires usage nicer for devs, who won't benefit from PEP-426
work, but share all the same issues. E.g. pip install git://... / pip
install filepath / pip install -e filepath should be able to figure
out the setup_requires and have things Just Work.
Something like:
- teach pip to read setup_requires from setup.cfg
setuptools doesn't need to change - it will still try to check its own
setup_requires, and if an older pip had been used, that will trigger
easy_install as it does currently. There's a small amount of duplicate
work in the double-checking, but that's tolerable IMO.
We could go further and also teach setuptools how to do that, e.g. you'd put
setup_requires='setuptools>someX' in setup.py
and your real setup_requirements in setup.cfg.
That would be better as it would avoid double-handling, but we'd need
some complex mojo to make things work when setuptools decides to
self-upgrade :( - so I'm inclined to stay with the bare bones solution
for now.
Thoughts?
-Rob
--
Robert Collins
For context, the main use I have for setup_requires these days is projects with a version contained within the project, and for the use of pbr in openstack (and some other git hosted) projects.
[…]
The openstack projects have a nice solution I think, which is that they write the egg metadata file and then read that back - both at runtime via pbr helpers and at build time when pbr takes over the build.
I'm not using ‘pbr’, but yes, a big bundle of “create an egg-info file and read it back” custom Setuptools code was the clumsy solution I ended up with for this in ‘python-daemon’ 2.x. If that boilerplate could be removed, and “this is a dependency for build actions only” could just work for all users, I would be quite happy.

-- Ben Finney
On 16 March 2015 at 11:05, Robert Collins
PEP 426 addresses build requirements for distributions of Python code, but doesn't directly help with development environments.
It's supposed to, but updating the relevant section of the PEP has been lingering on my todo list for a while now. Short version is that you'll be able to do "pip install package[-:self:]" in order to get all the build, dev and runtime dependencies without installing the package itself.

It hasn't been a priority since PEP 440 was the focus of the last pip/setuptools release, and Warehouse & TUF have been higher priority since then. So I agree it would be worthwhile to figure out an interim improvement, but don't have a strong opinion on what that should look like.

Cheers, Nick.

-- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
On Mar 15, 2015, at 9:05 PM, Robert Collins
wrote: PEP 426 addresses build requirements for distributions of Python code, but doesn't directly help with development environments.
[…]
Thoughts?
I've been thinking about this proposal this morning, and my primary question is what exactly is the pain that is being caused right now, and how does this proposal help it? Is the pain that setuptools is doing the installation instead of pip? Is the pain that the dependencies are being installed into a .eggs directory instead of globally? Is it something else?

I'm hesitant to add another pseudo-standard on top of the pile of implementation-defined pseudo-standards we already have, especially without fully understanding what the underlying pain point actually is and how the proposal addresses it.

--- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
We could support this syntax right now. It's so simple. Don't deride
it as a pseudo standard, turn it into an actual standard and praise it
as something practical that will not take years to implement. Then
after those years have passed and the new PEP actually works and has a
distutils replacement to drive it, deprecate the old standard.
If you can come up with something better that can ship before 2016, by
all means.
[metadata]
setup-requires = cffi
    pip
    pycparser >= 2.10
https://bitbucket.org/dholth/setup-requires
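Reading that section back is straightforward. A sketch of what the pip side might look like (the [metadata] section and setup-requires key follow the example above, not any published standard, and the function name is made up):

```python
# Sketch: how an installer might read setup-requires from setup.cfg
# before executing setup.py.  The section/key names follow the example
# above; this is not a standardized format.
import configparser

def read_setup_requires(path="setup.cfg"):
    parser = configparser.ConfigParser()
    parser.read(path)
    raw = parser.get("metadata", "setup-requires", fallback="")
    # Multi-line ini value: one requirement specifier per line.
    return [line.strip() for line in raw.splitlines() if line.strip()]
```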
On Mon, Mar 16, 2015 at 11:06 AM, Donald Stufft
wrote: […]
_______________________________________________ Distutils-SIG maillist - Distutils-SIG@python.org https://mail.python.org/mailman/listinfo/distutils-sig
On Mar 16, 2015, at 11:24 AM, Daniel Holth
wrote: We could support this syntax right now. It's so simple. Don't deride it as a pseudo standard, turn it into an actual standard and praise it as something practical that will not take years to implement. Then after those years have passed and the new PEP actually works and has a distutils replacement to drive it, deprecate the old standard.
If you can come up with something better that can ship before 2016, by all means.
[metadata]
setup-requires = cffi
    pip
    pycparser >= 2.10
It is a pseudo-standard unless it is backed by a defined standard. That's not derision, it's just a fact.

The first step is to determine *what* the problem is that it's actually attempting to solve. That's not clear to me currently other than some vague statements about pain, but what pain? What's actually occurring, and how does this address those problems? After figuring out what the actual problem is, we can look at the proposed solution and see how well it actually solves that problem, whether there is maybe a better solution to the problem, and whether the benefits outweigh the costs or not.

The ease of implementation is not the only factor in deciding if something is a good idea or not. We have to take into account forwards and backwards compatibility. If we implement it and people start to depend on it, then it's something that's going to have to exist forever, and any new installer is going to have to replicate that behavior. If people don't depend on it, then implementing it was a waste of time and effort.

For instance, if the problem is "when setuptools does the install, then things get installed differently, with different options, SSL certs, proxies, etc", then I think a better solution is that pip does terrible hacks in order to forcibly take control of setup_requires from setuptools and installs them into a temporary directory (or something like that). That is something that would require no changes on the part of authors or people installing software, and is backwards compatible with everything that's already been published using setup_requires. That's the primary problem that I'm aware of.

If I try to guess at other problems people might be solving, one might be that in order to use setup_requires you have to delay your imports until after the setup_requires get processed. This typically means you do things like imports inside of functions that get called as part of the setup.py build/install process. This isn't the most fun way to write software, however it works. Specifying the setup_requires in a static location outside setup.py would enable pip to install those things into a temporary directory prior to executing the setup.py, which then lets you do imports and other related work at the module scope of the setup.py.

This particular problem I'm not sure is worth fixing with a stopgap solution. It would require breaking the entire existing install base of installation tools if anyone actually took advantage of it, which I don't think is generally worth it to have slightly nicer use of things in your setup.py (essentially allowing you to import at the top level and not requiring subclassing command classes).

So yea, what's the actual problem that this is attempting to solve?

--- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
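A very rough sketch of the "installer forcibly takes control of setup_requires" idea described above (illustrative only - not pip's actual code, and the helper name is made up): install the declared requirements into a throwaway directory and put it on sys.path before setup.py runs.

```python
# Rough sketch of an installer taking over setup_requires: install the
# declared requirements into a throwaway directory and put it on
# sys.path before setup.py runs.  Not pip's real implementation.
import subprocess
import sys
import tempfile

def install_setup_requires(requirements):
    target = tempfile.mkdtemp(prefix="setup-requires-")
    for req in requirements:
        subprocess.check_call(
            [sys.executable, "-m", "pip", "install", "--target", target, req]
        )
    sys.path.insert(0, target)
    return target
```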
Problem: Users would like to be able to import stuff in setup.py. This
could be anything from a version fetcher to a replacement for
distutils itself. However, if setup.py is the only place to specify
these requirements there's a bit of a chicken and egg problem, unless
they have unusually good setuptools knowledge, especially if you want
to replace the entire setup() implementation.
Problem: Having easy_install do it is not what people want and misses
some important use cases.
Problem: Based on empirical evidence PEP 426 will never be done. Its
current purpose is to shut down discussion of pragmatic solutions.
Solution: Add requirements to setup.cfg, installed by pip before
setup.py is touched.
Old pip: requirements will not be installed. This is what happens now
if anyone tries to use a non-stdlib module in setup.py, and plenty of
packages do. User will have to install the extra requirements manually
before running setup.py.
Proposed pip: requirements will be installed. Hooray!
Result: Users will begin writing packages that only work with new pip.
If we implement this, users will do the same thing they are already
doing (import non-stdlib packages inside setup.py), only more often.
On Mon, Mar 16, 2015 at 12:03 PM, Donald Stufft
wrote: […]
Donald Stufft wrote:
So yea, what's the actual problem that this is attempting to solve?
ISTM (whether this is the actual intent or not) that this would be handy to differentiate between the dependencies needed when installing from a wheel vs. an sdist. Daniel's example of setup_requires including cython suggests to me that a wheel would include the compiled output and cython is not required in that case.

I don't personally have a use case for this right now, though it does seem like it has potential to refer to a Python package that acts as a front-end for a compiler (and perhaps downloader/installer... hmm...)

Cheers, Steve
The problem with a no-stopgaps policy is that the non-stopgap solution has to be incredible to ever be greater than the accrued debt of

    ((current pain - reduced pain from stopgap) * all python users * years until non-stopgap)
    - (maintenance/documentation hassle * years since stopgap implemented * everyone who has to deal with it)

and we do not know how great the non-stopgap will be.
On Mon, Mar 16, 2015 at 12:35 PM, Steve Dower
wrote: […]
On Mar 16, 2015, at 12:32 PM, Daniel Holth
wrote: Problem: Users would like to be able to import stuff in setup.py. This could be anything from a version fetcher to a replacement for distutils itself. However, if setup.py is the only place to specify these requirements there's a bit of a chicken and egg problem, unless they have unusually good setuptools knowledge, especially if you want to replace the entire setup() implementation.
So you *can* import things inside of a setup.py today, you just have to delay the imports by subclassing a command. You can see an example of doing this with the example command given for pytest in the documentation for pytest [1]. So this problem essentially boils down to people wanting to import at the module scope of their setup.py instead of needing to delay the import.

This particular problem, I believe, is one where the solution is worse than the problem. There is a supported solution *today* they can use and it works, and importantly it works in all versions of pip and setuptools that I'm aware of. It's also going to continue to work for years and years.

[1] http://pytest.org/latest/goodpractises.html#integration-with-setuptools-test...
Problem: Having easy_install do it is not what people want and misses some important use cases.
This problem I'm aware of, and as I said in my previous email I believe a better interim solution to this problem is to have pip forcibly take control over the setup_requires inside of a setup.py. This has the advantage of requiring nobody to make any changes to their packages so it'll work on all new and existing projects that rely on setup_requires and it's completely self contained within pip.
Problem: Based on empirical evidence PEP 426 will never be done. Its current purpose is to shut down discussion of pragmatic solutions.
This is just FUD and I would appreciate it if you'd stop repeating it. It's only been ~3 months since PEP 440 was completed and released inside of pip, setuptools, and PyPI. I've since switched back over to focusing primarily on getting Warehouse ready to replace PyPI, so the bulk of my time is being spent on that. After that's done, my plan is to switch back to putting PEP 426 through the same hard look that I put PEP 440 through, to try and iron out as many problems as I can find before implementing it and pushing it out to people.

The bulk of the effort of pushing the standards, pip, and PyPI through is done by a handful of people, and of those handful I believe that the largest share is done by myself. That's not to toot my own horn or any such nonsense, but simply to state the fact that the available bandwidth of people able and willing to work on problems is low. However, the things we bless here as official need to be able to last for a decade or more, which means that they do need careful consideration before we bless them.
Solution: Add requirements to setup.cfg, installed by pip before setup.py is touched.
Old pip: requirements will not be installed. This is what happens now if anyone tries to use a non-stdlib module in setup.py, and plenty of packages do. User will have to install the extra requirements manually before running setup.py.
Proposed pip: requirements will be installed. Hooray!
Result: Users will begin writing packages that only work with new pip.
If we implement this, users will do the same thing they are already doing (import non-stdlib packages inside setup.py), only more often.
On Mon, Mar 16, 2015 at 12:03 PM, Donald Stufft
wrote: On Mar 16, 2015, at 11:24 AM, Daniel Holth
wrote: We could support this syntax right now. It's so simple. Don't deride it as a pseudo standard, turn it into an actual standard and praise it as something practical that will not take years to implement. Then after those years have passed and the new PEP actually works and has a distutils replacement to drive it, deprecate the old standard.
If you can come up with something better that can ship before 2016, by all means.
[metadata] setup-requires = cffi pip pycparser >= 2.10
It is a psuedo standard unless it is backed by a defined standard. That's not a derision, it's just a fact.
The first step is to determine *what* the problem is that it's actually attempting to solve. That's not clear to me currently other than some vague statements about pain, but what pain? What's actually occuring and how does this address those problems?
After figuring out what the actual problem is, we can look at the proposed solution and see how well it actually solves that problem, if there is maybe a better solution to the problem, and if the benefits outweigh the costs or not.
The ease of implementation is not the only factor in deciding if something is a good idea or not. We have to take into account forwards and backwards compatiblity. If we implement it and people start to depend on it then it's something that's going to have to exist forever, and any new installer is going to have to replicate that behavior. If people don't depend on it then implementing it was a waste of time and effort.
For instance, if the problem is "when setuptools does the install, then things get installed differently, with different options, SSL certs, proxies, etc" then I think a better solution is that pip does terrible hacks in order to forcibly take control of setup_requires from setuptools and installs them into a temporary directory (or something like that). That is something that would require no changes on the part of authors or people installing software, and is backwards compatible with everything that's already been published using setup_requires. That's the primary problem that I'm aware of.
If I try and guess at other problems people might be solving, one might be that in order to use setup_requires you have to delay your imports until after the setup_requires get processed. This typically means you do things like imports inside of functions that get called as part of the setup.py build/install process. This isn't the most fun way to write software, however it works. Specifying the setup_requires in a static location outside would enable pip to then install those things into a temporary directory prior to executing the setup.py which then lets you do imports and other related work at the module scope of the setup.py. This particular problem I'm not sure it's worth fixing with a stop gap solution. It would require breaking the entire existing install base of installation tools if anyone actually took advantage of this fact, which I don't think is generally worth it to have slightly nicer use of things in your setup.py (essentially allowing you to import at the top level and not require subclassing command classes).
So yea, what's the actual problem that this is attempting to solve?
--- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
--- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
On Mar 16, 2015, at 1:04 PM, Daniel Holth
wrote: The problem with a no-stopgaps policy is that the non-stopgap solution has to be incredible to ever be greater than the accrued debt of ((current pain - reduced pain from stopgap) * all python users * years until non-stopgap) - (maintenance/documentation hassle * years since stopgap implemented * everyone who has to deal with it), and we do not know how great the non-stopgap will be.
There is not a "no stopgaps" policy. There is a "stopgaps must be carefully considered" policy. Stopgaps which don't rely on end users needing to do anything in particular to use them, and which pay attention to backwards and forward compatibility, are better than stopgaps that introduce new APIs/user-facing features.

--- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
On 16 March 2015 at 17:14, Donald Stufft
The bulk of the effort of pushing the standards, pip, and PyPI through is done by a handful of people, and of those handful I believe that the largest share is done by myself. That's not to toot my own horn or any such nonsense but to simply state the fact that the available bandwidth of people able and willing to work on problems is low. However the things we bless here as official are things which need to be able to last for a decade or more, which means that they do need careful consideration before we bless them.
As a serious question - is there anything I (or indeed anyone else) can do to make progress on PEP 426? If I'm honest, right now I don't exactly see what tool changes are needed to change it from draft to accepted to actually implemented.

As far as I can see, acceptance consists largely of someone, somehow, confirming that there are no major loopholes in the spec. I think that mostly comes down to the fact that no-one has raised objections since the PEP was published, plus someone with experience of some of the more difficult distribution scenarios sanity-checking things. And then, getting it implemented in tools.

I guess the tool side consists of:

1. Making pip write a pydist.json file when installing from wheel.
2. Making setuptools write pydist.json when installing from sdist, and when creating a sdist.
3. Making wheel write pydist.json when writing a wheel.

(Also, when distlib writes a wheel, it should write pydist.json, but I'm considering distlib as "non-core" for the sake of this discussion.)

There's also presumably work to add support for specifying some of the new metadata in setup.py, which I guess is setuptools work again.

Have I missed anything crucial?
Paul

PS I'm ignoring the "standard metadata extensions" PEP where console wrappers and post-install scripts figure. Those are probably bigger design issues.
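For concreteness, a minimal pydist.json in the spirit of the PEP 426 draft might look like the following (field names follow the draft as it stood and are illustrative, not final; the project name and requirement are made up):

```python
# Illustrative only: a minimal pydist.json payload along the lines of
# the PEP 426 draft.  Field names follow the draft of the time and
# could change before acceptance.
import json

pydist = {
    "metadata_version": "2.0",
    "name": "example-dist",
    "version": "1.0",
    "summary": "An example distribution",
    "run_requires": [{"requires": ["requests (>= 2.4)"]}],
}

print(json.dumps(pydist, indent=2))
```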
No one should be asked to learn how to extend distutils, and in practice no one knows how. People have been begging for years for working setup_requires, far longer than I've been interested in it, and all they want to do is

    import fetch_version
    setup(version=fetch_version(), ...)

Then they will eventually notice setup_requires has never worked the way most people expect. As a result there are too few setup.py abstractions.

The other proposal is a little bit interesting. Parse setup.py without running it, extract setup_requires, and pip install locally before running the file? It would be easy as long as the setup_requires were defined as a literal list in the setup() call, but you would have to tell people they were not allowed to write normal clever Python code. I think the gotchas would be severe...

Release a setuptools command class that actually works with regular setup_requires, and parses setup_requires out of a side file? But that fails the fetch_version case...

The main reason the installer should handle setup_requires instead of setup.py is related to one of the major goals of packaging, which is to get setup.py out of the business of installing (it is OK if it is a build script).

Would you be interested in a JSON-format metadata that you were willing to support long term, with a quick version 0.1 release that only adds the setup_requires feature?
On Mar 16, 2015, at 2:33 PM, Paul Moore
wrote: […]
Probably similar to what I did for PEP 440: start branches implementing it and try to run as much stuff through it as possible to see how it fails. When implementing PEP 440 I was running every version number that existed on PyPI through it to make sure things would work. That's not going to be possible with PEP 426, but ideally we should be able to get branches in the various tools that work on it, grab the latest versions of the top N packages (or the latest versions of everything), and compare the results.

Another thing is determining whether there's anything else we can/should split out from PEP 426 to narrow the scope. A quick skim to refresh myself doesn't show anything that stands out, but some thought on this wouldn't hurt. It'd also probably not hurt to go through the setuptools and pip bug trackers to see if there are any relevant issues and how they would be affected by the new standard.

PEP 426 itself isn't much in the way of groundbreaking. It's basically taking the dynamic metadata and "compiling" it down into a static form which is JSON based. The "easy" metadata (name etc.) is a no-brainer and probably doesn't require much poking; trying out an example of an extension wouldn't be the worst thing either. Even if standardizing script wrappers, for instance, is held off, getting a demo extension using it would validate the standard.

Potentially more thought/guidance should go into how projects should straddle the line between legacy systems. If new-style metadata exists alongside old-style, should the new style be preferred? Should there be any sanity checks to make sure the two aren't completely different?

As an aside, one thing I've had in the back of my mind is looking at the possibility of defining the environment markers which specify versions as PEP 440 versions, and enabling all the same comparisons with them as we get in the specifiers. I think the string-based comparison we currently do is kinda janky, and any installer already has the tooling to handle specifiers.

Generally what I did with PEP 440, which I think worked well, is that I had everything pretty much implemented prior to it being accepted, and we were then able to use the things I found out while implementing it to adjust the PEP before it was formally accepted. I just didn't merge anything until that point. This was pretty valuable in finding things where the PEP was too vague for someone to make an independent implementation going by what was in the PEP, or where it specified something that turned out to be hard/problematic to implement, etc.

A major reason why I'm personally focusing on Warehouse first is that integrating with Warehouse will be easier than integrating with PyPI legacy. However that doesn't have to block anyone else; that's just myself not wanting to spend the time integrating a new major metadata version into the old legacy code base.

--- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
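[Editor's note: the "run every version number on PyPI through it" style of corpus check Donald describes can be sketched as below. The regex here is a deliberately simplified subset of the PEP 440 grammar for illustration only; real tooling should use the `packaging` library, which implements PEP 440 in full.]

```python
import re

# Deliberately simplified subset of the PEP 440 version grammar.
# Real tooling should use the `packaging` library instead; this only
# illustrates the "partition a corpus into parseable / unparseable" check.
SIMPLE_PEP440 = re.compile(
    r"^\d+(\.\d+)*"        # release segment, e.g. 1.4.2
    r"((a|b|rc)\d+)?"      # optional pre-release, e.g. 1.0rc1
    r"(\.post\d+)?"        # optional post-release
    r"(\.dev\d+)?$"        # optional dev-release
)

def partition_versions(versions):
    """Split a corpus of version strings into parseable and not."""
    ok = [v for v in versions if SIMPLE_PEP440.match(v)]
    bad = [v for v in versions if not SIMPLE_PEP440.match(v)]
    return ok, bad

ok, bad = partition_versions(["1.4.2", "1.0rc1", "0.9.post2", "1.0-banana"])
print(ok)   # ['1.4.2', '1.0rc1', '0.9.post2']
print(bad)  # ['1.0-banana']
```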
On Mar 16, 2015, at 2:53 PM, Daniel Holth wrote:
No one should be asked to learn how to extend distutils, and in practice no one knows how.
Some people know how, pytest figured it out, pbr figured it out. There’s some documentation at https://pythonhosted.org/setuptools/setuptools.html#extending-and-reusing-se... It is true though that it’s not particularly great documentation and actually doing it is somewhat of an annoyance.
People have been begging for years for working setup_requires, far longer than I've been interested in it, and all they want to do is
import fetch_version setup(version=fetch_version(), ...)
Then they will eventually notice setup_requires has never worked the way most people expect. As a result there are too few setup.py abstractions.
The other proposal is a little bit interesting. Parse setup.py without running it, extract setup_requires, and pip install locally before running the file? It would be easy as long as the setup_requires were defined as a literal list in the setup() call, but you would have to tell people they were not allowed to write normal clever Python code. I think the gotchas would be severe…
I wasn't going to try to parse the setup.py, I was going to execute it. Here's a proof of concept I started a while back to try to validate the overall idea: https://github.com/pypa/pip/compare/develop...dstufft:eldritch-horror

Since then I've thought about it more and decided it'd probably be better, instead of trying to shuffle arguments into the subprocess, to have the subprocess write out to a file or stdout what all of the setup_requires are. This would require executing the setup.py 3x instead of 2x as pip currently does. This would also enable people to do something like:

    try:
        import fetch_version
    except ImportError:
        def fetch_version():
            return "UNKNOWN"

    setup(version=fetch_version(), ...)

That works if they are happy with mandating that their thing can only be installed from sdist with pip newer than X, because given a three-pass installation (once to discover setup_requires, once to write egg_info, once to actually install), as long as the setup_requires list doesn't rely on anything installed, the first pass can have no real information except the setup_requires. It actually wouldn't even be completely broken; it'd just have a nonsense version number (or whatever piece of metadata can't be located).
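[Editor's note: the "first pass" Donald describes can be sketched roughly as below. This is not pip's actual implementation; the setup.py contents, requirement names, and the stub-module trick are illustrative assumptions.]

```python
# Rough sketch (not pip's actual code) of a "first pass": execute setup.py
# with a stub setup() that records its keyword arguments, so setup_requires
# can be read before anything has been installed.
import sys
import types

captured = {}
fake_setuptools = types.ModuleType("setuptools")
fake_setuptools.setup = lambda **kwargs: captured.update(kwargs)
sys.modules["setuptools"] = fake_setuptools  # shadow the real module

SETUP_PY = '''
from setuptools import setup

try:
    import fetch_version              # not installed on the first pass
except ImportError:
    def fetch_version():
        return "UNKNOWN"

setup(
    name="example",
    version=fetch_version(),
    setup_requires=["pbr>=0.10"],     # illustrative requirement
)
'''

exec(compile(SETUP_PY, "setup.py", "exec"), {"__name__": "__main__"})
print(captured["setup_requires"])  # ['pbr>=0.10']
print(captured["version"])         # 'UNKNOWN' -- fetch_version is absent
```

The first pass yields only the setup_requires (plus placeholder metadata); a later pass, run after those requirements are installed, would produce the real metadata.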
Release a setuptools command class that actually works with regular setup_requires, and parses setup_requires out of a side file? But fails fetch_version case...
The main reason the installer should handle setup_requires instead of setup.py is related to one of the major goals of packaging, which is to get setup.py out of the business of installing (it is OK if it is a build script).
Would you be interested in a JSON-format metadata that you were willing to support long term, with a quick version 0.1 release that only adds the setup_requires feature?
--- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
On Mon, Mar 16, 2015 at 3:19 PM, Donald Stufft wrote:
On Mar 16, 2015, at 2:53 PM, Daniel Holth wrote:
No one should be asked to learn how to extend distutils, and in practice no one knows how.
Some people know how, pytest figured it out, pbr figured it out. There’s some documentation at https://pythonhosted.org/setuptools/setuptools.html#extending-and-reusing-se...
It is true though that it’s not particularly great documentation and actually doing it is somewhat of an annoyance.
People have been begging for years for working setup_requires, far longer than I've been interested in it, and all they want to do is
import fetch_version setup(version=fetch_version(), ...)
Then they will eventually notice setup_requires has never worked the way most people expect. As a result there are too few setup.py abstractions.
The other proposal is a little bit interesting. Parse setup.py without running it, extract setup_requires, and pip install locally before running the file? It would be easy as long as the setup_requires were defined as a literal list in the setup() call, but you would have to tell people they were not allowed to write normal clever Python code. I think the gotchas would be severe…
I wasn’t going to try and parse the setup.py, I was going to execute it.
Here’s a proof of concept I started on awhile back to try and validate the overall idea: https://github.com/pypa/pip/compare/develop...dstufft:eldritch-horror
Since then I’ve thought about it more and decided it’d probably be better to instead of trying to shuffle arguments into the subprocess, have the subprocess write out into a file or stdout or something what all of the setup_requires are. This would require executing the setup.py 3x instead of 2x like pip is currently doing.
This would also enable people do something like:
try: import fetch_version except ImportError: def fetch_version(): return “UNKNOWN”
setup(version=fetch_version(), …)
If they are happy with mandating that their thing can only be installed from sdist with pip newer than X, because given a three pass installation (once to discover setup_requires, once to write egg_info, once to actually install) as long as the setup_requires list doesn’t rely on anything installed then the first pass can have no real information except the setup_requires.
It actually wouldn’t even really be completely broken, it’d just have a nonsense version number (or whatever piece of metadata can’t be located).
But it would still work in older pip if the setup requirements were installed already? Users would have to try/catch every import? Explain why this is better than reading out of a different file in the sdist, no matter the format? Would it let you change your setup_requires with Python code before the initial .egg-info is written out?

We've already talked too many times about my 34-line setup.py prefix that does exactly what I want by parsing and installing requirements from a config file, but people tend to complain about it not being a pip feature. If it would help, I could have it accept a different format; then it would be possible to publish backwards-compatible-with-pip sdists that parsed some preferable requirements format. If the requirements were already installed, the bw-compat code would do nothing. https://bitbucket.org/dholth/setup-requires/src/03eda33c7681bc4102164c976e5a...
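[Editor's note: the config-file approach, like Daniel's setup-requires prefix or the setup.cfg idea from the start of this thread, can be sketched as below. The section and key names here are illustrative assumptions, not necessarily what any existing tool uses.]

```python
# Sketch of reading setup requirements from a declarative side file.
# The [metadata] section and "setup-requires" key are hypothetical names.
import configparser

SETUP_CFG = """
[metadata]
setup-requires =
    pbr>=0.10
    six
"""

cfg = configparser.ConfigParser()
cfg.read_string(SETUP_CFG)

raw = cfg.get("metadata", "setup-requires", fallback="")
setup_requires = [line.strip() for line in raw.splitlines() if line.strip()]
print(setup_requires)  # ['pbr>=0.10', 'six']
```

Because the file is declarative, an installer can read it without executing any project code, which is the property a static setup_requires needs.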
You ought to be able to get away with not supporting it in Warehouse; we will surely be able to write legacy .egg-info to sdists indefinitely.
On 17 March 2015 at 04:06, Donald Stufft
Thoughts?
I've been thinking about this proposal this morning, and my primary question is what exactly is the pain that is being caused right now, and how does this proposal help it? Is the pain that setuptools is doing the installation instead of pip? Is the pain that the dependencies are being installed into a .eggs directory instead of globally? Is it something else?
Thank you for thinking about it.
I'm hesitant to add another pseudo-standard on top of the pile of implementation-defined pseudo-standards we already have, especially without fully understanding what the underlying pain point actually is and how the proposal addresses it.
There are, I think, two major pain points: ease of use of not-already-usable code, and different installation logic.

Different logic: for instance, in a clean venv, check out something with setup_requires (e.g. testtools) and do pip install -e . followed by unit2. For me at least this doesn't work. It ends up installing local .eggs which then aren't actually usable, as they aren't on the path when unit2 runs.

Not-already-usable code: see for instance
https://hg.python.org/unittest2/file/8928fb47c3a9/setup.py#l13 or
similar hacks everywhere.

Those are the pain points. I get your concern about pseudo-standards - so, what is the bar needed to put what I proposed into PEP-426 (or a new one)? As previously stated and not (AFAICT) refuted, PEP-426 doesn't actually address either of these pain points today, since it requires an sdist to be buildable before its metadata is accessible. It's entirely reasonable to want whatever we do to solve developer pain to dovetail nicely with PEP-426, and in fact that was the reason I started a thread here rather than just whacking together a patch for pip :)

The proposal addresses the two pain points in the following manner:

Not-already-usable code:
- by statically declaring the dependencies, no local code runs at all before they are installed. It won't solve things like 'build this local .so before xyz', but that's OK IMO.

Different installation logic:
- pip (or buildout or whatever) can consistently and trivially avoid chaining into easy_install.

Your proposal later in this thread to do a three-way dance seems more complicated than a static expression of setup requirements, and I see no reason to have dynamic *setup* requirements. Both approaches require a new pip, so the adoption curve constraints appear identical.
-Rob
--
Robert Collins
On Mar 16, 2015, at 3:32 PM, Daniel Holth wrote:
But it would still work in older pip if the setup requirements were installed already? Users would have to try/catch every import? Explain why this is better than reading out of a different file in the sdist, no matter the format? Would it let you change your setup_requires with Python code before the initial .egg-info is written out?
It's better because it solves the second problem you mentioned ("people don't want easy_install to install things") for everything that uses setup_requires, with no effort required of package authors. And yes, it would still work in older pip if the setup requirements were installed already. It would also continue to work in setuptools/easy_install if they were installed already. It makes the existing mechanism somewhat better instead of trying to replace it with a whole new mechanism.

It's also a more general solution: people who aren't willing to break older things continue to use setup_requires with delayed imports and get pip installing things for them, while people who don't want to delay imports but are happy breaking older things can do what they want with only a minor amount of discomfort (needing to catch an import error).
--- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
On Mar 16, 2015, at 4:04 PM, Robert Collins wrote:
On 17 March 2015 at 04:06, Donald Stufft wrote:
Thoughts?
I've been thinking about this proposal this morning, and my primary question is what exactly is the pain that is being caused right now, and how does this proposal help it? Is the pain that setuptools is doing the installation instead of pip? Is the pain that the dependencies are being installed into a .eggs directory instead of globally? Is it something else?
Thank you for thinking about it.
I'm hesitant to add another pseudo-standard on top of the pile of implementation-defined pseudo-standards we already have, especially without fully understanding what the underlying pain point actually is and how the proposal addresses it.
There are I think two major pain points: ease of use of not-already-usable-code and different installation logic.
Different logic: for instance, in a clean venv, check out something with setup_requires (e.g. testtools) and do pip install -e . followed by unit2. For me at least this doesn't work. It ends up installing local .eggs which then aren't actually usable, as they aren't on the path when unit2 runs.
Ahhhh, wait a minute. I think something might have just clicked here. You're expecting/wanting the results of setup_requires to be installed into the environment itself and not just made available to the setup.py? That's not going to work, and I'd be against making that work. For something like that I'd say it would more cleanly map to something like tests_require (in setup.py and PEP 426) and dev_requires (in PEP 426). I think that it would be reasonable for pip to install both of those types of requirements into the environment when you're installing as an editable installation.
Not already-usable-code: See for instance https://hg.python.org/unittest2/file/8928fb47c3a9/setup.py#l13 or similar hacks everywhere.
Those are the pain points. I get your concern about pseudo-standards - so, what is the bar needed to put what I proposed into PEP-426 (or a new one)? As previously stated and not (AFAICT) refuted, PEP-426 doesn't actually address either of these pain points today, since it requires an sdist to be buildable before its metadata is accessible. It's entirely reasonable to want whatever we do to solve developer pain to dovetail nicely with PEP-426, and in fact that was the reason I started a thread here rather than just whacking together a patch for pip :)
The proposal addresses the two pain points in the following manner: Not-already-usable code: - by statically declaring the dependencies, no local code runs at all before they are installed. It won't solve things like 'build this local .so before xyz', but that's OK IMO. Different installation logic: - pip (or buildout or whatever) can consistently and trivially avoid chaining into easy_install.
Your proposal later in this thread to do a three-way dance seems more complicated than a static expression of setup requirements, and I see no reason to have dynamic *setup* requirements. Both approaches require a new pip, so the adoption curve constraints appear identical.
So it appears there are actually two problems here: one is the one above, that you want some sort of "these are required to do development" requirements, and the other is that setup_requires has some problems (it's inside of setup.py, and it's installed by easy_install instead of pip).

Ignoring the "development requirement" problem (even though I think that's a more interesting problem!) for a moment: I think that, yes, it'd be great to specify setup_requires statically, but right now defining requirements as Python inside of setup.py is the standard we have. I'm aware that pbr routes around this standard, but that's pbr, and it's hardly the norm. I think that it's worse to have a weird one-off place to specify a particular type of dependency than to continue to use the normal mechanism and add things in pip to work around the deficiencies in that.

The other benefit to my proposal is that every existing use of setup_requires starts to get installed by pip instead of by easy_install, which solves a whole class of problems like not supporting wheels, proxy settings, SSL settings, etc.

Going back to the development requirement problem: I think it would be reasonable for setuptools to start to gain some of the concepts from PEP 426. It already has tests_require, and I think that an official dev_requires wouldn't be a bad idea either. If it then exposed those things as something pip could inspect, we could start doing things like automatically installing them inside of a development installation. This would probably even allow backwards compat, by having a setup.py dynamically add things to setup_requires based upon which version of setuptools is executing the setup.py. If it's an older one, add a shim that'll implement the new functionality as a plugin instead of part of core.

--- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
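[Editor's note: the version-gated shim Donald floats at the end could be sketched roughly as below. The version threshold, the plugin name, and the naive tuple-based version comparison are all hypothetical; a real setup.py would compare versions with proper PEP 440 tooling.]

```python
# Sketch of a setup.py adding a backport shim to setup_requires when the
# running setuptools is too old to have native support for a new feature.
# "15.0" and "setuptools-devrequires-shim" are made-up illustrations.
def needs_shim(setuptools_version, native_since="15.0"):
    """True if this setuptools predates (hypothetical) native support."""
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(setuptools_version) < as_tuple(native_since)

setup_requires = ["pbr>=0.10"]
if needs_shim("12.2"):
    # hypothetical plugin that would backport the new behaviour
    setup_requires.append("setuptools-devrequires-shim")

print(setup_requires)
```

In a real setup.py the version would come from `setuptools.__version__`, and the comparison should use a PEP 440-aware parser rather than this simplified tuple split.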
Yes, setup_requires means those dependencies that are needed for setup.py itself to run.
On Mon, Mar 16, 2015 at 4:42 PM, Donald Stufft
On Mar 16, 2015, at 4:04 PM, Robert Collins
wrote: On 17 March 2015 at 04:06, Donald Stufft
wrote: Thoughts?
I've been thinking about this proposal this morning, and my primary question is what exactly is the pain that is being caused right now, and how does this proposal help it? Is the pain that setuptools is doing the installation instead of pip? Is that pain that the dependencies are being installed into a .eggs directory instead of globally? Is it something else?
Thank you for thinking about it.
I'm hesitant to want to add another psuedo standard ontop of the pile of implementation defined psuedo standards we already have, especially without fully understanding what the underlying pain point actually is and how the proposal addresses it.
There are I think two major pain points: ease of use of not-already-usable-code and different installation logic.
Different logic: For instance, in a clean venv checkout something with setup_requires (e.g. testtools) and do pip install -e . followed by unit2. For me at least this doesn't work. It ends up installing local .eggs which then aren't actually usable as they aren't on the path when unit2 runs.
Ahhhh, wait a minute. I think something might have just clicked here.
You’re expecting/wanting the results of setup_requres to be installed into the environment itself and not just made available to the setup.py? That’s not going to work and I’d be against making that work.
For something like that I’d say it would more cleanly map to something like tests_requires (in setup.py and PEP 426) and dev_requires (in PEP 426). I think that it would be reasonable for pip to install both of those types of requirements into the environment when you’re installing as an editable installation.
Not already-usable-code: See for instance https://hg.python.org/unittest2/file/8928fb47c3a9/setup.py#l13 or similar hacks everywhere.
Those are the pain points. I get your concern about pseudo standards - so, what is the bar needed to put what I proposed into PEP-426 (or a new one?) - as previously stated and not (AFAICT) refuted, PEP-426 doesn't actually address either of these pain points today, since it requires an sdist to be buildable before its metadata is accessible. It's entirely reasonable to want whatever we do do to solve developer pain dovetail nicely with PEP-426, and in fact that was the reason I started a thread here rather than just whacking together a patch for pip :)
The proposal addresses the two pain points in the following manner: Not already usable code: - by statically declaring the dependencies, no local code runs at all before they are installed. It won't solve things like 'build this local .so before xyz', but thats OK IMO. Different installation logic: - pip (or buildout or whatever) can avoid chaining into easy_install consistently and trivially, thus avoiding that
Your proposal later in this three to do a three-way dance seems more complicated than a static expression of setup requirements, and I see no reason to have dynamic *setup* requirements. Both approaches require a new pip, so the adoption curve constraints appear identical.
So it appears there’s actually two problems here, one is the one above, that you want some sort of “these are required to do development” requirements, and that setup_requires has some problems (it’s inside of setup.py, and it’s installed by easy_install instead of pip).
Ignoring the “development requirement” problem (even though I think that’s a more interesting problem!) for a moment, I think that yea it’d be great to specify setup_requires statically, but that right now defining requirements as Python inside of setup.py is the standard we have. I’m aware that pbr routes around this standard, but that’s pbr and it’s not hardly the norm. I think that it’s worse to have a weird one off place to specify a particular type of dependency than to continue to use the normal mechanism and add things in pip to work around the deficiencies in that.
The other benefit to my proposal is that every existing use of setup_requires starts to get installed by pip instead of by easy_install, which solves a whole class of problems like not supporting Wheels, proxy settings, SSL settings, etc.
Going back to the development requirement problem, I think it would be reasonable for setuptools to start to gain some of the concepts from PEP 426, it already has tests_requires and I think that an official dev_requires wouldn’t be a bad idea either. If it then exposed those things as something pip could inspect we could start doing things like automatically installing them inside of a development installation. This would probably even allow backwards compat by having a setup.py dynamically add things to the setup_requires based upon what version of setuptools is executing the setup.py. If it’s an older one, add a shim that’ll implement the new functionality as a plugin instead of part of core.
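A rough sketch of the backwards-compat trick described above: setup.py adds a shim plugin to setup_requires when the running setuptools is too old to understand a hypothetical native dev_requires. The version cutoff and the "dev-requires-shim" project name are both invented for illustration:

```python
# Sketch of the backwards-compat approach: a setup.py inspects the
# running setuptools version and, if it predates native support for
# the new functionality, pulls in a compat shim as a plugin instead.
# The cutoff (15) and "dev-requires-shim" are invented placeholders.

def _major(version_string):
    """First numeric component of a version string, e.g. '3.6' -> 3."""
    return int(version_string.split(".")[0])


def setup_requires_for(setuptools_version, base=()):
    """Return the setup_requires list, adding the shim on old setuptools."""
    reqs = list(base)
    if _major(setuptools_version) < 15:  # assumed first version with support
        reqs.append("dev-requires-shim")  # hypothetical compat plugin
    return reqs
```

A setup.py would call this with the installed setuptools' `__version__`, so new installers get the native behaviour and old ones get the shim.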
--- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
_______________________________________________ Distutils-SIG maillist - Distutils-SIG@python.org https://mail.python.org/mailman/listinfo/distutils-sig
On 17 March 2015 at 09:42, Donald Stufft
..
Ahhhh, wait a minute. I think something might have just clicked here.
You’re expecting/wanting the results of setup_requires to be installed into the environment itself and not just made available to the setup.py? That’s not going to work and I’d be against making that work.
One of the common things I do is 'python setup.py sdist bdist_wheel upload -s'. That requires setup.py to work, and is a wonky world of pain right now. Anything that depends on pip intercepting setup.py isn't going to work there, is it? Debian build environments for these things generally run without the internet, without access to a pip mirror - distributors are used to having the setup requirements be safely installable, and I'd argue that setup_requires that violate that rule have been getting pushback and selection pressure for years. So I very much doubt there is a case where installing the deps is bad. I can well imagine the lack of a resolver in pip being a concern, but it's a concern regardless.
For something like that I’d say it would more cleanly map to something like tests_requires (in setup.py and PEP 426) and dev_requires (in PEP 426). I think that it would be reasonable for pip to install both of those types of requirements into the environment when you’re installing as an editable installation.
I can grok that, though I suspect it runs the risk of being over-modelled. Do we know any uses of setup_requires where installing the requirements into the build environment would do the wrong thing? E.g. is it a theoretical concern, or an 'omg, X would do Y', we've-been-bitten-before issue? ..
So it appears there’s actually two problems here, one is the one above, that you want some sort of “these are required to do development” requirements, and that setup_requires has some problems (it’s inside of setup.py, and it’s installed by easy_install instead of pip).
Sure. Note too that folk need to be able to run setup.py without triggering easy_install and without running pip. That's a requirement for e.g. Debian. The way that's handled today is to have the build fail, look at it, and add build-depends lines to debian/control - the explicit metadata in Debian. If we had explicit metadata for Python sources (git, not dists), then folk could use that to reflect dependencies across semi-automatically (e.g. flagging new ones more clearly).
Ignoring the “development requirement” problem (even though I think that’s a more interesting problem!) for a moment, I think that yea it’d be great to specify setup_requires statically, but right now defining requirements as Python inside of setup.py is the standard we have. I’m aware that pbr routes around this standard, but that’s pbr and it’s hardly the norm. I think that it’s worse to have a weird one-off place to specify a particular type of dependency than to continue to use the normal mechanism and add things in pip to work around the deficiencies in that.
A concern I haven't expressed so far is that the route you're proposing is very clever. Clever tends to break, be hard to diagnose and hard to understand. I understand the benefit to folk for all the stale-won't-update packages out there, and perhaps we can do multiple things that all independently solve the issue. That is, what if we do explicit metadata *and* have pip do wonky magic.
The other benefit to my proposal is that every existing use of setup_requires starts to get installed by pip instead of by easy_install, which solves a whole class of problems like not supporting Wheels, proxy settings, SSL settings, etc.
Yep, which is beneficial on its own. But not a reason not to do explicit metadata :).
Going back to the development requirement problem, I think it would be reasonable for setuptools to start to gain some of the concepts from PEP 426, it already has tests_requires and I think that an official dev_requires wouldn’t be a bad idea either. If it then exposed those things as something pip could inspect we could start doing things like automatically installing them inside of a development installation. This would probably even allow backwards compat by having a setup.py dynamically add things to the setup_requires based upon what version of setuptools is executing the setup.py. If it’s an older one, add a shim that’ll implement the new functionality as a plugin instead of part of core.
Ok, so what I need to know is what:
- can I do
- that solves my problem (not the other 1000 problems that PEP-426 solves)
- that the setuptools // pip maintainers would be willing to merge.
I'm happy to put tuits into this, but I don't want to boil the ocean -
I want to solve the specific thing that makes me curse my laptop
screen on a regular basis.
-Rob
--
Robert Collins
On 17 Mar 2015 02:33, "Daniel Holth"
Problem: Users would like to be able to import stuff in setup.py. This could be anything from a version fetcher to a replacement for distutils itself. However, if setup.py is the only place to specify these requirements there's a bit of a chicken and egg problem, unless they have unusually good setuptools knowledge, especially if you want to replace the entire setup() implementation.
Problem: Having easy_install do it is not what people want and misses some important use cases.
Problem: Based on empirical evidence PEP 426 will never be done. Its current purpose is to shut down discussion of pragmatic solutions.
Slight correction here: one of my current aims with PEP 426 is deliberately discouraging the discussion of solutions that only work reliably if everyone switches to a new build system first. That's a) never going to happen; and b) one of the key mistakes the distutils2 folks made that significantly hindered adoption of their work, and I don't want us to repeat it.

My other key aim is to provide a public definition of what I think "good" looks like when it comes to software distribution, so I can more easily assess whether less radical proposals are still moving us closer to that goal.

Making pip (and perhaps easy_install) setup.cfg aware, such that it assumes the use of d2to1 (or a semantically equivalent tool) if setup.cfg is present and hence is able to skip invoking setup.py in relevant cases, sounds like just such a positive incremental step to me, as it increases the number of situations where pip can avoid executing a Turing complete "configuration" file, without impeding the eventual adoption of a more comprehensive solution.

I don't think that needs a PEP - just an RFE against pip to make it d2to1 aware for each use case where it's relevant, like installing setup.py dependencies. (And perhaps a similar RFE against setuptools)

Projects that choose to rely on that new feature will be setting a high minimum installer version for their users, but some projects will be OK with that (especially projects private to a single organisation after upgrading pip on their production systems).

Cheers, Nick.
On 17 Mar 2015 04:33, "Paul Moore"
On 16 March 2015 at 17:14, Donald Stufft
wrote: The bulk of the effort of pushing the standards, pip, and PyPI through is done by a handful of people, and of those handful I believe that the largest share is done by myself. That's not to toot my own horn or any such nonsense but to simply state the fact that the available bandwidth of people able and willing to work on problems is low. However the things we bless here as official are things which need to be able to last for a decade or more, which means that they do need careful consideration before we bless them.
As a serious question - is there anything I (or indeed anyone else) can do to make progress on PEP 426? If I'm honest, right now I don't exactly see what tool changes are needed to change it from draft to accepted to actually implemented.
The main bottleneck where PEP 426 is concerned is me, and my current focus is on Red Hat & Project Atomic (e.g. http://connect.redhat.com/zones/containers, https://github.com/projectatomic/adb-atomic-developer-bundle) and the PSF (e.g. https://wiki.python.org/moin/PythonSoftwareFoundation/ProposalsForDiscussion... )

The main issue with PEP 426 is that there are a few details in the current draft that I already think are a bad idea but haven't explicitly documented anywhere, so I need to get back and address those before it makes sense for anyone to start serious work on implementing it. However, now that I know folks are keen to help with that side, I can reprioritise getting the updates done so there's a better base to start working from.

It's also worth noting that the main tweaks I want to make are specifically related to the way semantic dependencies are proposed to be defined, so everything outside that should already be fair game for formal documentation in an updated JSON schema.

Cheers, Nick.
On Mar 16, 2015, at 6:35 PM, Robert Collins
wrote: On 17 March 2015 at 09:42, Donald Stufft
wrote: ..
Ahhhh, wait a minute. I think something might have just clicked here.
You’re expecting/wanting the results of setup_requires to be installed into the environment itself and not just made available to the setup.py? That’s not going to work and I’d be against making that work.
One of the common things I do is 'python setup.py sdist bdist_wheel upload -s'. That requires setup.py to work, and is a wonky world of pain right now. Anything that depends on pip intercepting setup.py isn't going to work there, is it? Debian build environments for these things generally run without the internet, without access to a pip mirror - distributors are used to having the setup requirements be safely installable, and I'd argue that setup_requires that violate that rule have been getting pushback and selection pressure for years. So I very much doubt there is a case where installing the deps is bad. I can well imagine the lack of a resolver in pip being a concern, but it's a concern regardless.
I’m not sure what solution is actually going to work here besides something like setup_requires. If you’re executing ``python setup.py …`` then the only "hooks" that exist (and as far as I can tell, can exist) are ones that exist now. The only real improvement I can think of is setuptools offering a separate API call to install the setup_requires so you can do something like:

    import setuptools
    setuptools.setup_requires("pbr", "otherthing")

    import pbr
    setuptools.setup(**pbr.get_metadata())

Anything that relies on parsing a setup.cfg is going to be roughly equivalent to that (or the current solution) because there's no other place to call that will make ``python setup.py`` work. Unless I'm missing something, how does having a setup_requires inside of setup.cfg help with this sequence of commands:

    $ git clone ....
    $ cd ....
    $ virtualenv .env
    $ .env/bin/python setup.py sdist
For something like that I’d say it would more cleanly map to something like tests_requires (in setup.py and PEP 426) and dev_requires (in PEP 426). I think that it would be reasonable for pip to install both of those types of requirements into the environment when you’re installing as an editable installation.
I can grok that, though I suspect it runs the risk of being over-modelled. Do we know any uses of setup_requires where installing the requirements into the build environment would do the wrong thing? E.g. is it a theoretical concern, or an 'omg, X would do Y', we've-been-bitten-before issue? ..
setup_requires don’t just run in “build” environments, they also run anytime you install from sdist in a final environment. To answer the question though, yes there are real concerns: some of the current setuptools extensions don’t play well if they are installed and available when you’re installing something that *doesn’t* use them. You could argue that this is a bug with those particular things, but currently some things do assume that if they are installed they are expected to be doing things to setup.py. Furthermore, I’m not sure it's actually possible for this to generically work: if two different setuptools extensions both want to override the same thing, only one of them can work. So if you had two things that relied on overriding the install command (or the build command, or whatever), one would “win” and the other would act as if it were not installed.
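The override conflict described above can be shown with a toy example. The extension classes here are invented, but the "last registration wins" behaviour is exactly what a cmdclass-style one-class-per-command mapping gives you:

```python
# Toy illustration: two extensions both replace the "install" command.
# A cmdclass mapping holds exactly one class per command name, so the
# second registration silently discards the first. Class names are
# invented for illustration; no real setuptools extension is implied.

class BaseInstall:
    """Stands in for the stock distutils/setuptools install command."""


class InstallWithExtensionA(BaseInstall):
    """First extension's replacement install command."""


class InstallWithExtensionB(BaseInstall):
    """Second extension's replacement install command."""


cmdclass = {}
cmdclass["install"] = InstallWithExtensionA  # first extension hooks in
cmdclass["install"] = InstallWithExtensionB  # second replaces it silently

# Only ExtensionB's behaviour survives; ExtensionA acts as if it were
# never installed -- which is the composition problem described above.
```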
So it appears there’s actually two problems here, one is the one above, that you want some sort of “these are required to do development” requirements, and that setup_requires has some problems (it’s inside of setup.py, and it’s installed by easy_install instead of pip).
Sure. Note too that folk need to be able to run setup.py without triggering easy_install and without running pip. That's a requirement for e.g. Debian. The way that's handled today is to have the build fail, look at it, and add build-depends lines to debian/control - the explicit metadata in Debian. If we had explicit metadata for Python sources (git, not dists), then folk could use that to reflect dependencies across semi-automatically (e.g. flagging new ones more clearly).
I don’t think most Debian (or Linux in general) distributions are consuming packages directly from VCS. As I understand it from following debian-python, the folks packaging Openstack are the outliers in that. That being said, most of the work thus far has ignored the part of the process where what we have is a VCS checkout. That's because it's somewhat dependent on breaking the reference cycle where the entire toolchain only really works because we assume setuptools everywhere.
Ignoring the “development requirement” problem (even though I think that’s a more interesting problem!) for a moment, I think that yea it’d be great to specify setup_requires statically, but right now defining requirements as Python inside of setup.py is the standard we have. I’m aware that pbr routes around this standard, but that’s pbr and it’s hardly the norm. I think that it’s worse to have a weird one-off place to specify a particular type of dependency than to continue to use the normal mechanism and add things in pip to work around the deficiencies in that.
A concern I haven't expressed so far is that the route you're proposing is very clever. Clever tends to break, be hard to diagnose and hard to understand. I understand the benefit to folk for all the stale-won't-update packages out there, and perhaps we can do multiple things that all independently solve the issue. That is, what if we do explicit metadata *and* have pip do wonky magic.
Yea, clever is often times broken, I can agree with that.
The other benefit to my proposal is that every existing use of setup_requires starts to get installed by pip instead of by easy_install, which solves a whole class of problems like not supporting Wheels, proxy settings, SSL settings, etc.
Yep, which is beneficial on its own. But not a reason not to do explicit metadata :).
Right, explicit metadata is absolutely the end goal. My concern is about adding more random cruft that we’ll have to support forever on the path to that.
Going back to the development requirement problem, I think it would be reasonable for setuptools to start to gain some of the concepts from PEP 426, it already has tests_requires and I think that an official dev_requires wouldn’t be a bad idea either. If it then exposed those things as something pip could inspect we could start doing things like automatically installing them inside of a development installation. This would probably even allow backwards compat by having a setup.py dynamically add things to the setup_requires based upon what version of setuptools is executing the setup.py. If it’s an older one, add a shim that’ll implement the new functionality as a plugin instead of part of core.
Ok, so what I need to know is what: - can I do - that solves my problem (not the other 1000 problems that PEP-426 solves) - that the setuptools // pip maintainers would be willing to merge.
I'm happy to put tuits into this, but I don't want to boil the ocean - I want to solve the specific thing that makes me curse my laptop screen on a regular basis.
I’ll come back to this if you can answer the above about how a setup.cfg solves the ``python setup.py …`` situation from my first paragraph, because that’s going to influence my thoughts.
On Mar 16, 2015, at 7:03 PM, Nick Coghlan
wrote: On 17 Mar 2015 02:33, "Daniel Holth"
mailto:dholth@gmail.com> wrote: Problem: Users would like to be able to import stuff in setup.py. This could be anything from a version fetcher to a replacement for distutils itself. However, if setup.py is the only place to specify these requirements there's a bit of a chicken and egg problem, unless they have unusually good setuptools knowledge, especially if you want to replace the entire setup() implementation.
Problem: Having easy_install do it is not what people want and misses some important use cases.
Problem: Based on empirical evidence PEP 426 will never be done. Its current purpose is to shut down discussion of pragmatic solutions.
Slight correction here: one of my current aims with PEP 426 is deliberately discouraging the discussion of solutions that only work reliably if everyone switches to a new build system first. That's a) never going to happen; and b) one of the key mistakes the distutils2 folks made that significantly hindered adoption of their work, and I don't want us to repeat it.
My other key aim is to provide a public definition of what I think "good" looks like when it comes to software distribution, so I can more easily assess whether less radical proposals are still moving us closer to that goal.
Making pip (and perhaps easy_install) setup.cfg aware, such that it assumes the use of d2to1 (or a semantically equivalent tool) if setup.cfg is present and hence is able to skip invoking setup.py in relevant cases, sounds like just such a positive incremental step to me, as it increases the number of situations where pip can avoid executing a Turing complete "configuration" file, without impeding the eventual adoption of a more comprehensive solution.
I don't think that needs a PEP - just an RFE against pip to make it d2to1 aware for each use case where it's relevant, like installing setup.py dependencies. (And perhaps a similar RFE against setuptools)
Projects that choose to rely on that new feature will be setting a high minimum installer version for their users, but some projects will be OK with that (especially projects private to a single organisation after upgrading pip on their production systems).
Cheers, Nick.
I don’t think that’s going to work, because if you only make pip aware of it then you break ``python setup.py sdist``, if you make setuptools aware of it then you don’t need pip to be aware of it because we’ll get it for free from setuptools being aware of it.
On 17 March 2015 at 12:32, Donald Stufft
On Mar 16, 2015, at 7:03 PM, Nick Coghlan
wrote: On 17 Mar 2015 02:33, "Daniel Holth"
wrote: Problem: Users would like to be able to import stuff in setup.py. This could be anything from a version fetcher to a replacement for distutils itself. However, if setup.py is the only place to specify these requirements there's a bit of a chicken and egg problem, unless they have unusually good setuptools knowledge, especially if you want to replace the entire setup() implementation.
Problem: Having easy_install do it is not what people want and misses some important use cases.
Problem: Based on empirical evidence PEP 426 will never be done. Its current purpose is to shut down discussion of pragmatic solutions.
Slight correction here: one of my current aims with PEP 426 is deliberately discouraging the discussion of solutions that only work reliably if everyone switches to a new build system first. That's a) never going to happen; and b) one of the key mistakes the distutils2 folks made that significantly hindered adoption of their work, and I don't want us to repeat it.
My other key aim is to provide a public definition of what I think "good" looks like when it comes to software distribution, so I can more easily assess whether less radical proposals are still moving us closer to that goal.
Making pip (and perhaps easy_install) setup.cfg aware, such that it assumes the use of d2to1 (or a semantically equivalent tool) if setup.cfg is present and hence is able to skip invoking setup.py in relevant cases, sounds like just such a positive incremental step to me, as it increases the number of situations where pip can avoid executing a Turing complete "configuration" file, without impeding the eventual adoption of a more comprehensive solution.
I don't think that needs a PEP - just an RFE against pip to make it d2to1 aware for each use case where it's relevant, like installing setup.py dependencies. (And perhaps a similar RFE against setuptools)
Projects that choose to rely on that new feature will be setting a high minimum installer version for their users, but some projects will be OK with that (especially projects private to a single organisation after upgrading pip on their production systems).
Cheers, Nick.
I don’t think that’s going to work, because if you only make pip aware of it then you break ``python setup.py sdist``, if you make setuptools aware of it then you don’t need pip to be aware of it because we’ll get it for free from setuptools being aware of it.
Huh?
I think the key tests are:
- what happens with old tools
- what happens with new tools
With old tools it needs to not-break.
With new tools it should be better :).
Teaching pip, double-entered setup_requires (.cfg and .py).
old tools keep working
new tools are shiny (pip install -e / vcs then setup's easy_install
call short-circuits doing nothing).
Teaching only setuptools, double-entered
old tools keep working
new tools are not shiny, because pip isn't doing the install
Teaching only setuptools, single entry
old tools break (requirements absent, or you have a versioned dep on
setuptools in setup.py and omg the pain)
new tools are not shiny, same reason
Teaching setuptools and pip, single entry
old tools break - as above
new tools are shiny (because pip either asks setuptools or reads
setup.cfg, whatever)
So I think we must teach pip, and we may teach setuptools.
-Rob
--
Robert Collins
Robert: is it a requirement to you that "python setup.py ..." should
install setup_requires? For me I'd be quite happy if installing the
requirements was my own problem in the absence of an installer.
I would like to start writing my setup.py like this:
setup.cfg:
setup-requires = waf
setup.py:
import waf
interpret setup.py arguments
build with waf
don't import setuptools
On Mon, Mar 16, 2015 at 7:39 PM, Robert Collins
On 17 March 2015 at 12:32, Donald Stufft
wrote: On Mar 16, 2015, at 7:03 PM, Nick Coghlan
wrote: On 17 Mar 2015 02:33, "Daniel Holth"
wrote: Problem: Users would like to be able to import stuff in setup.py. This could be anything from a version fetcher to a replacement for distutils itself. However, if setup.py is the only place to specify these requirements there's a bit of a chicken and egg problem, unless they have unusually good setuptools knowledge, especially if you want to replace the entire setup() implementation.
Problem: Having easy_install do it is not what people want and misses some important use cases.
Problem: Based on empirical evidence PEP 426 will never be done. Its current purpose is to shut down discussion of pragmatic solutions.
Slight correction here: one of my current aims with PEP 426 is deliberately discouraging the discussion of solutions that only work reliably if everyone switches to a new build system first. That's a) never going to happen; and b) one of the key mistakes the distutils2 folks made that significantly hindered adoption of their work, and I don't want us to repeat it.
My other key aim is to provide a public definition of what I think "good" looks like when it comes to software distribution, so I can more easily assess whether less radical proposals are still moving us closer to that goal.
Making pip (and perhaps easy_install) setup.cfg aware, such that it assumes the use of d2to1 (or a semantically equivalent tool) if setup.cfg is present and hence is able to skip invoking setup.py in relevant cases, sounds like just such a positive incremental step to me, as it increases the number of situations where pip can avoid executing a Turing complete "configuration" file, without impeding the eventual adoption of a more comprehensive solution.
I don't think that needs a PEP - just an RFE against pip to make it d2to1 aware for each use case where it's relevant, like installing setup.py dependencies. (And perhaps a similar RFE against setuptools)
Projects that choose to rely on that new feature will be setting a high minimum installer version for their users, but some projects will be OK with that (especially projects private to a single organisation after upgrading pip on their production systems).
Cheers, Nick.
I don’t think that’s going to work, because if you only make pip aware of it then you break ``python setup.py sdist``, if you make setuptools aware of it then you don’t need pip to be aware of it because we’ll get it for free from setuptools being aware of it.
Huh?
I think the key tests are: - what happens with old tools - what happens with new tools
With old tools it needs to not-break. With new tools it should be better :).
Teaching pip, double-entered setup_requires (.cfg and .py). old tools keep working new tools are shiny (pip install -e / vcs then setup's easy_install call short-circuits doing nothing).
Teaching only setuptools, double-entered old tools keep working new tools are not shiny, because pip isn't doing the install
Teaching only setuptools, single entry old tools break (requirements absent, or you have a versioned dep on setuptools in setup.py and omg the pain) new tools are not shiny, same reason
Teaching setuptools and pip, single entry old tools break - as above new tools are shiny (because pip either asks setuptools or reads setup.cfg, whatever)
So I think we must teach pip, and we may teach setuptools.
-Rob
-- Robert Collins
Distinguished Technologist HP Converged Cloud
On 17 March 2015 at 12:53, Daniel Holth
Robert: is it a requirement to you that "python setup.py ..." should install setup_requires? For me I'd be quite happy if installing the requirements was my own problem in the absence of an installer.
I would like to start writing my setup.py like this:
setup.cfg:
setup-requires = waf

setup.py:
import waf
interpret setup.py arguments
build with waf
don't import setuptools
I've no particular thoughts on that. It would certainly avoid the pain
of easy_install being triggered.
Success criteria for my immediate personal needs:
- pip install -e . works on a clean checkout of my projects
- easy_install doesn't go and download stuff
- my setup.py can refer to things (usually the version) inside the
project itself, safely
- python setup.py sdist bdist_wheel upload -s works after I've done
pip install -e .
(I'm happy to unpack why I've chosen those four things, but I think
there's enough context in the thread by now).
-Rob
--
Robert Collins
On 17 March 2015 at 12:39, Robert Collins
I don’t think that’s going to work, because if you only make pip aware of it then you break ``python setup.py sdist``, if you make setuptools aware of it then you don’t need pip to be aware of it because we’ll get it for free from setuptools being aware of it.
Huh?
I think the key tests are: - what happens with old tools - what happens with new tools
With old tools it needs to not-break. With new tools it should be better :).
Teaching pip, double-entered setup_requires (.cfg and .py). old tools keep working new tools are shiny (pip install -e / vcs then setup's easy_install call short-circuits doing nothing).
Teaching only setuptools, double-entered old tools keep working new tools are not shiny, because pip isn't doing the install
Teaching only setuptools, single entry old tools break (requirements absent, or you have a versioned dep on setuptools in setup.py and omg the pain) new tools are not shiny, same reason
Teaching setuptools and pip, single entry old tools break - as above new tools are shiny (because pip either asks setuptools or reads setup.cfg, whatever)
So I think we must teach pip, and we may teach setuptools.
/me puts on the dunce hat. What I forgot was that as soon as we
clean up the hacks in setup.py's, they will break. Duh.
OTOH thinking about it more - Donald and I had a brief hangout to get
more bandwidth on the problem - not breaking older setuptools seems an
unnecessarily high bar:
- distutils never knew how to install software, so a setup.py that
doesn't know how to do that is no worse than distutils based
setup.py's
- anyone running a current pip will have things work
- anyone running buildout or debian/rpm package builds etc won't care
because they don't want easy_install triggered anyway, and explicitly
gather deps themselves.
- for anyone running pip behind firewalls etc it will be no worse
(because the chain into easy_install is already broken)
The arguably common case of folk not behind firewalls, running a
slightly not-latest pip would be affected. But - it's not deeply
affected, and pip is upgrade-able everywhere :).
More to the point, the choice here will be for authors to opt in, knowing
the potential impact. Folk can of course keep the horror in place and
just use the new thing to make development nicer.
So, the proposed plan going forward:
- now:
- I will put a minimal patch up for pip into the tracker and ask
for feedback here and there
- we can debate at that point whether bits of it should be in
setuptools or not etc etc
- likewise we can debate the use of a temporary environment or
install into the target environment at that point
- future
- in the metabuild thing that is planned long term, handling this
particular option will be placed w/in the setuptools plugin for it,
making this not something that needs to be a 'standard'.
-Rob
--
Robert Collins
On 17 March 2015 at 09:24, Nick Coghlan
The main bottleneck where PEP 426 is concerned is me, and my current focus is on Red Hat & Project Atomic (e.g. http://connect.redhat.com/zones/containers, https://github.com/projectatomic/adb-atomic-developer-bundle) and the PSF (e.g. https://wiki.python.org/moin/PythonSoftwareFoundation/ProposalsForDiscussion...)
I'll add in a couple of other relevant links regarding my current "not PyPA" priorities: https://forum.futurewise.org.au/t/what-role-should-foss-take-in-government/2... & http://community.redhat.com/blog/2015/02/the-quid-pro-quo-of-open-infrastruc... We're helping to change the world by participating in pretty much any open source related activity, folks, even if it may not always feel like it :)
The main issue with PEP 426 is that there are a few details in the current draft that I already think are a bad idea but haven't explicitly documented anywhere, so I need to get back and address those before it makes sense for anyone to start serious work on implementing it.
Ah, I *thought* I'd filed issues for all of them, I just forgot we were partway through migrating the draft PEPs repo to the PyPA org on GitHub. Full set of currently open PEP 426 issues: * https://bitbucket.org/pypa/pypi-metadata-formats/issues?status=new&status=open&component=Metadata%202.x * https://github.com/pypa/interoperability-peps/labels/PEP%20426 So I guess "finish migrating the draft PEPs and also migrate the open issues" qualifies as PEP 426/459 work that needs to be done. Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
On 16 March 2015 at 23:24, Nick Coghlan
However, now that I know folks are keen to help with that side, I can reprioritise getting the updates done so there's a better base to start working from.
The thing I struggle with over PEP 426 is that as a data format definition, which explicitly describes itself as defining an in-memory representation, it's not clear to me what coding tasks are needed. (I don't have enough experience with complex build environments to help with the specification of the metadata, so my interest is in coding tasks I can help with.)

Writing pydist.json files is explicitly deferred to the yet-to-be-written Sdist 1.0, Wheel 1.1, and new distribution database PEPs.

The build system interface remains obscure (although there's a note of a further PEP to define the command line interface) because we don't want to mandate the setup.py interface, but nobody has yet come up with a better idea. And there's not even a mention of a PEP covering something like setup.cfg as a declarative means of providing metadata into the build system (unless that comes under the "command line API" description).

So we're left in a situation where there are people willing to help, at least with the coding tasks, but no obvious implementation tasks to work on. And yet people still see problems that we expect to be fixed by "Metadata 2.0". So it does feel somewhat like a block on progress.

While I understand that there are real reasons why PEP 426 needs more work before being finalised, is there no way that you (Nick and Donald mainly, but honestly anyone with a picture of what a world where PEP 426 is implemented would look like) can list some well-defined implementation tasks that people can *get on with*? The current frustration seems to me to be less about PEP 426 blocking progress than about nobody knowing how they can actually help (as opposed to things like the current debate, which seems to be rooted in the idea that while PEP 426 should "solve" Robert's issue, nobody knows how that solution will look, and what can be done to get there).

Paul
On Mar 17, 2015, at 5:52 AM, Paul Moore
wrote: On 16 March 2015 at 23:24, Nick Coghlan
wrote: However, now that I know folks are keen to help with that side, I can reprioritise getting the updates done so there's a better base to start working from.
The thing I struggle with over PEP 426 is that as a data format definition, which explicitly describes itself as defining an in-memory representation, it's not clear to me what coding tasks are needed. (I don't have enough experience with complex build environments to help with the specification of the metadata, so my interest is in coding tasks I can help with).
Writing pydist.json files is explicitly deferred to the yet to be written Sdist 1.0, Wheel 1.1, and new distribution database PEPs.
The build system interface remains obscure (although there's a note of a further PEP to define the command line interface) because we don't want to mandate the setup.py interface, but nobody has yet come up with a better idea. And there's not even a mention of a PEP covering something like setup.cfg as a declarative means of providing metadata into the build system (unless that comes under the "command line API" description).
So we're left in a situation where there are people willing to help, at least with the coding tasks, but no obvious implementation tasks to work on. And yet people still see problems that we expect to be fixed by "Metadata 2.0". So it does feel somewhat like a block on progress.
While I understand that there are real reasons why PEP 426 needs more work before being finalised, is there no way that you (Nick and Donald mainly, but honestly anyone with a picture of what a world where PEP 426 is implemented would look like) can list some well-defined implementation tasks that people can *get on with*? The current frustration seems to me to be less about PEP 426 blocking progress, as about nobody knowing how they can actually help (as opposed to things like the current debate, which seems to be rooted in the idea that while PEP 426 should "solve" Robert's issue, nobody knows how that solution will look, and what can be done to get there).
Paul
I would just implement it inside of Wheel. You’d technically be working on two PEPs at once, but I think the bare bones Wheel PEP is pretty simple. “All the same things as the last PEP, except with pydist.json”. More things could be added to the Wheel PEP of course, but that’s not related to PEP 426. Even if we don’t merge the Wheel parts (though we’d be able to merge the packaging parts for an in memory representation) immediately, it’d still give some idea about how well it’ll work. Using Wheel is a good target because that already uses a static file so it’s less of a change to get PEP 426 integrated there. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
On 17 March 2015 at 19:52, Paul Moore
On 16 March 2015 at 23:24, Nick Coghlan
wrote: However, now that I know folks are keen to help with that side, I can reprioritise getting the updates done so there's a better base to start working from.
The thing I struggle with over PEP 426 is that as a data format definition, which explicitly describes itself as defining an in-memory representation, it's not clear to me what coding tasks are needed. (I don't have enough experience with complex build environments to help with the specification of the metadata, so my interest is in coding tasks I can help with).
That's a fair request. Something I perhaps haven't been clear about is that PEP 426 is *just* an interoperability spec - it defines the interfaces between tools so they can share a common data model. It deliberately *doesn't* fully define the user experience that build tools and download tools are going to wrap around it (it places *constraints* on that experience through some of its clauses, but still falls a long way short of actually defining it). The idea here is to allow developers to choose the build system *they* like, and then have PyPI, pip, easy_install, et al. all be able to happily consume that metadata, regardless of the specific build system used. That's an easier bar to meet for wheel files than it is for sdists, but the long term aim is to achieve it for both.
Writing pydist.json files is explicitly deferred to the yet to be written Sdist 1.0, Wheel 1.1, and new distribution database PEPs.
That's the timeline for the formal definition; in practice, "drop pydist.json in the existing directory" as PEP 426 suggests is a fairly safe bet as to what those specs are going to say. The "don't rely on it unless the container version says it's OK to do so" caution in the PEP is primarily because wheels already ship with pydist.json metadata emitted based on an earlier draft version of the spec (from before I gutted it and moved the optional sections out to PEP 459 as standard extensions, which is also the change that did the most damage when it came to invalidating the current jsonschema definition).

On the generation side, there's thinking about how the d2to1 (i.e. setup.cfg) and setuptools (i.e. setup.py) APIs for entering the new metadata might look, taking into account that existing input files need to continue to work, and existing output files need to continue to be generated. Most of that can be worked through even while some specific field names are still being tinkered with. One option on that front may be to propose introducing a setup.yaml file as the preferred human-facing format that both d2to1 and setuptools understand, as I suspect some of the concepts in PEP 426 are going to be hellishly awkward to map to the ini-style syntax of setup.cfg (in particular, the nested format for defining conditional dependencies based on extras and environment markers).

One particularly valuable capability is being able to take an existing project and generate pydist.json for it based on the existing metadata files, without needing to change anything. https://www.python.org/dev/peps/pep-0426/#appendix-a-conversion-notes-for-le... covers the current partial implementations of that feature, which all need updating to account for subsequent changes to the spec. One of the overarching goals here is to be able to publish this info as static metadata on PyPI so folks can more easily do full dependency network analysis without having to download all of PyPI.
Vinay built a system like that based on distlib (by first downloading all of PyPI), and one of the ideas behind PEP 426 is to let us publish that kind of info for easier analysis even before we upgrade the actual installers to make use of it. Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
On 17 March 2015 at 11:05, Donald Stufft
I would just implement it inside of Wheel. You’d technically be working on two PEPs at once, but I think the bare bones Wheel PEP is pretty simple. “All the same things as the last PEP, except with pydist.json”. More things could be added to the Wheel PEP of course, but that’s not related to PEP 426. Even if we don’t merge the Wheel parts (though we’d be able to merge the packaging parts for an in memory representation) immediately, it’d still give some idea about how well it’ll work. Using Wheel is a good target because that already uses a static file so it’s less of a change to get PEP 426 integrated there.
OK, cool. So bdist_wheel writes a pydist.json into the wheel, and then I guess wheel install (and pip) will just pick it up and dump it in the dist-info folder, because they do that anyway. Sounds easy enough.

Which only leaves the question of how users specify the metadata. My feeling is that anything that isn't already covered by arguments to setup() should be specified declaratively. That may be in setup.cfg, but ini format may be a PITA for some of the more structured data. I can have a think about that...

Thanks, I'll work on this. What it means in practice will be that projects wanting to specify Metadata 2.0 data will be able to do so if they build wheels. Nothing will use that metadata, but that's OK. My aim was to pick off an easy target and look like I was helping without having to do any of the hard jobs ;-)

Paul
On Mar 17, 2015, at 7:33 AM, Paul Moore
wrote: On 17 March 2015 at 11:05, Donald Stufft
wrote: I would just implement it inside of Wheel. You’d technically be working on two PEPs at once, but I think the bare bones Wheel PEP is pretty simple. “All the same things as the last PEP, except with pydist.json”. More things could be added to the Wheel PEP of course, but that’s not related to PEP 426. Even if we don’t merge the Wheel parts (though we’d be able to merge the packaging parts for an in memory representation) immediately, it’d still give some idea about how well it’ll work. Using Wheel is a good target because that already uses a static file so it’s less of a change to get PEP 426 integrated there.
OK, cool. So bdist_wheel to write a pydist.json into the wheel, and then I guess wheel install (and pip) will just pick it up and dump it in the dist-info folder because they do that anyway. Sounds easy enough.
I would also modify pip to start using it as part of the validation of this PEP (inside of a PR). That should “close the gap” and say “hey look we have a Proof of Concept here of this all working”.
Which only leaves the question of how users specify the metadata. My feeling is that anything that isn't already covered by arguments to setup() should be specified declaratively. That may be in setup.cfg, but ini format may be a PITA for some of the more structured data. I can have a think about that…
I would personally declare it inside of setup.py like everything else, yea setup.py is gross and unfun in 2015, however I think having two different locations for metadata inside of setuptools based on what era of spec that metadata came from is going to be super confusing for end users. What PEP 426 + The Yet to be Done Wheel Spec To Update would mean is that people can use something *other* than setuptools as their build tool for building Wheels. PEP 426 + The Yet to be Done Sdist 2.0 Spec starts paving the way for the same thing in source distributions.
Thanks, I'll work on this. What it means in practice will be that projects wanting to specify Metadata 2.0 data will be able to do so if they build wheels. Nothing will use that metadata, but that's OK. My aim was to pick off an easy target and look like I was helping without having to do any of the hard jobs ;-)
Paul
--- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
On 17 March 2015 at 11:30, Nick Coghlan
That's a fair request. Something I perhaps haven't been clear about is that PEP 426 is *just* an interoperability spec - it defines the interfaces between tools so they can share a common data model.
I think that was pretty clear actually. The problem is that as an interoperability spec, it shouldn't really be blocking any work going on, and yet it does seem to - setup_requires, postinstall scripts, things like that get stalled by "when Metadata 2.0 is signed off". In reality, even a *draft* of Metadata 2.0 should be making stuff like that easier.

People wanting to implement the actual behaviour should be able to write their code in terms of "some API to get the Metadata 2.0 data" and then go from there. We can add any kind of hack we want for now to provide that API, safe in the knowledge that later we can change that API without invalidating the feature. It's just that that isn't really happening at the moment. Maybe another thing to work on is a basic "get_metadata" API in the packaging library, to be that to-be-improved hack?

Paul
On 17 March 2015 at 11:38, Donald Stufft
OK, cool. So bdist_wheel to write a pydist.json into the wheel, and then I guess wheel install (and pip) will just pick it up and dump it in the dist-info folder because they do that anyway. Sounds easy enough.
I would also modify pip to start using it as part of the validation of this PEP (inside of a PR). That should “close the gap” and say “hey look we have a Proof of Concept here of this all working”.
I'm still not clear what you expect pip to *do* with the metadata. It's just data, there's no functionality specified in the PEP.
Which only leaves the question of how users specify the metadata. My feeling is that anything that isn't already covered by arguments to setup() should be specified declaratively. That may be in setup.cfg, but ini format may be a PITA for some of the more structured data. I can have a think about that…
I would personally declare it inside of setup.py like everything else, yea setup.py is gross and unfun in 2015, however I think having two different locations for metadata inside of setuptools based on what era of spec that metadata came from is going to be super confusing for end users.
OK, that makes sense. But that involves setuptools hacking, which I'm not touching with a bargepole :-) Paul
On 17 March 2015 at 21:05, Donald Stufft
On Mar 17, 2015, at 5:52 AM, Paul Moore
wrote: While I understand that there are real reasons why PEP 426 needs more work before being finalised, is there no way that you (Nick and Donald mainly, but honestly anyone with a picture of what a world where PEP 426 is implemented would look like) can list some well-defined implementation tasks that people can *get on with*? The current frustration seems to me to be less about PEP 426 blocking progress, as about nobody knowing how they can actually help (as opposed to things like the current debate, which seems to be rooted in the idea that while PEP 426 should "solve" Robert's issue, nobody knows how that solution will look, and what can be done to get there). Paul
I would just implement it inside of Wheel. You’d technically be working on two PEPs at once, but I think the bare bones Wheel PEP is pretty simple. “All the same things as the last PEP, except with pydist.json”. More things could be added to the Wheel PEP of course, but that’s not related to PEP 426. Even if we don’t merge the Wheel parts (though we’d be able to merge the packaging parts for an in memory representation) immediately, it’d still give some idea about how well it’ll work. Using Wheel is a good target because that already uses a static file so it’s less of a change to get PEP 426 integrated there.
+1 from me. As noted in my other message, I believe the machinery to generate it is actually already there, it's just generating it using the old format from before I moved the optional parts into extensions instead.

It's actually possible we could adopt a multi-phase approach to rolling out PEP 426, such as:

Phase 0 (today): wheel generates various iterations of draft pydist.json files in wheel 1.0 format files
Phase 1: PEP 426 is declared provisional (oops, I still need to propose that update to PEP 1...)
Phase 2: PyPI extracts and publishes the still-provisional PEP 426 metadata from uploaded wheel files
Phase 3: we define wheel 1.1 to include pydist.json

At this point, we'll have mostly exercised the backwards compatibility in pydist.json, rather than the new stuff. I don't have a clear view as to how the adoption of the *new* capabilities will work, as we get the old chicken-and-egg problem of needing to update the build side and the install side at the same time to really gain from them. One possible way to go would be to have the initial pydist.json consumers be redistribution tools like pyp2rpm, while pip continues to rely solely on the old metadata files.

Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
On Mar 17, 2015, at 7:43 AM, Paul Moore
wrote: On 17 March 2015 at 11:38, Donald Stufft
wrote: OK, cool. So bdist_wheel to write a pydist.json into the wheel, and then I guess wheel install (and pip) will just pick it up and dump it in the dist-info folder because they do that anyway. Sounds easy enough.
I would also modify pip to start using it as part of the validation of this PEP (inside of a PR). That should “close the gap” and say “hey look we have a Proof of Concept here of this all working”.
I'm still not clear what you expect pip to *do* with the metadata. It's just data, there's no functionality specified in the PEP.
What pip does now with metadata: look at it for dependency information when installing the wheel; show it when doing ``pip show``; handle the Provides metadata, making something "Provide" something else; show warnings for the obsoleted-by metadata; handle extensions (including failing if there is a critical extension we don't understand). I'm not actually sure what you mean by "there's no functionality specified in the PEP", because there is quite a bit of new functionality that's implicitly inside the PEP just from the new types of data it includes.
Which only leaves the question of how users specify the metadata. My feeling is that anything that isn't already covered by arguments to setup() should be specified declaratively. That may be in setup.cfg, but ini format may be a PITA for some of the more structured data. I can have a think about that…
I would personally declare it inside of setup.py like everything else, yea setup.py is gross and unfun in 2015, however I think having two different locations for metadata inside of setuptools based on what era of spec that metadata came from is going to be super confusing for end users.
OK, that makes sense. But that involves setuptools hacking, which I'm not touching with a bargepole :-)
Ha, that makes sense :) For a proof of concept doing whatever makes sense to you in that regards makes sense too FWIW.
Paul
--- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
On 17 March 2015 at 11:49, Donald Stufft
I'm still not clear what you expect pip to *do* with the metadata. It's just data, there's no functionality specified in the PEP.
What pip does now with metadata, Look at it for dependency information when installing the Wheel, show it when doing ``pip show``, handle the Provides metadata making something “Provide” something else, show warnings for the obsoleted-by metadata, handle extensions (including failing if there is a critical extension we don’t understand).
Hmm, OK. At the moment that stuff (except pip show) is all covered by the running of the egg_info command, I guess. So you're saying that pip should first check whether a requirement has new-style metadata and, if it does, skip the egg_info command and use pydist.json. I guess that would be good - it'd solve the problems we see with numpy-related packages that need things installed just to run setup.py egg_info. It wasn't something I'd particularly considered, but thanks for the clarification. Paul
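The fallback Paul describes could be sketched roughly like this (the function and its return convention are hypothetical illustrations, not pip's actual code structure):

```python
import json
import os


def read_metadata(dist_info_dir):
    """Prefer static new-style metadata if present.

    Returns the parsed pydist.json dict when the .dist-info directory
    carries one; returns None to signal that the caller must fall back
    to the old route of running ``setup.py egg_info``.
    """
    pydist = os.path.join(dist_info_dir, "pydist.json")
    if os.path.exists(pydist):
        with open(pydist) as f:
            return json.load(f)  # static metadata, no setup.py execution
    return None  # caller falls back to the egg_info dance
```

The point of the sketch is only the ordering: static metadata is consulted first, so nothing needs to be installed just to discover dependencies.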
On Mar 17, 2015, at 8:33 AM, Paul Moore
wrote: On 17 March 2015 at 11:49, Donald Stufft
wrote: I'm still not clear what you expect pip to *do* with the metadata. It's just data, there's no functionality specified in the PEP.
What pip does now with metadata, Look at it for dependency information when installing the Wheel, show it when doing ``pip show``, handle the Provides metadata making something “Provide” something else, show warnings for the obsoleted-by metadata, handle extensions (including failing if there is a critical extension we don’t understand).
Hmm, OK.
At the moment that stuff (except pip show) is all covered by the running of the egg_info command, I guess. So you're saying that pip should first check if a requirement has new-style metadata and if it does, skip the egg_info command and use pydist.json. I guess that would be good - it'd solve the problems we see with numpy-related packages that need things installed just to run setup.py egg_info.
It wasn't something I'd particularly considered, but thanks for the clarification.
Paul
There is no egg_info command inside of a Wheel, it’s currently looking at foo.whl/foo.dist-info/METADATA for that. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
On 17 March 2015 at 12:34, Donald Stufft
There is no egg_info command inside of a Wheel, it’s currently looking at foo.whl/foo.dist-info/METADATA for that.
Doh, of course. I thought I'd checked that, must have been looking in the wrong place (it's not a bit of the code I'm that familiar with). Paul
The wheel spec itself is intentionally designed to be ignorant of the (setuptools) metadata; you don't have to read that to install the files where they belong. Wheels are also not too hard to generate without setuptools. There's a wscript in the wheel source code that can generate a wheel of bdist_wheel using waf. There's also an old patch that allows the Bento build system to generate wheels.

Of course wheel currently works by converting all the setuptools static metadata from the .egg-info directory into a different format in the .dist-info directory, after setuptools is done running. bdist_wheel generates PEP 426 data here: https://bitbucket.org/pypa/wheel/src/bdf053a70200c5857c250c2044a2d91da23db4a...

All the files setup.py currently dumps into the .egg-info directory are generated by setuptools plugins. It would be neat to pull the dist-info generation out of wheel and put it in one of these plugins for setuptools. Once the plugin was installed, every setuptools package would automatically get the new file. However, IIRC wheel may have needed one value that's hard to get at this point in the execution. Alias the egg-info command to dist-info; have it generate a .dist-info directory; make sure setuptools treats .dist-info about the same as .egg-info, even in a source checkout.
https://bitbucket.org/pypa/setuptools/src/31b56862b41ce24ffe5e28434b98fa35f3...
https://bitbucket.org/pypa/setuptools/src/31b56862b41ce24ffe5e28434b98fa35f3...

Recall that setup.py is still moderately OK as a build script. It is legitimate to need software to build software. We just want the metadata it generates to always be the same, and for it not to also be the installer. setup-requires solves a different problem than pydist.json.
You should be able to use setup-requires in a source checkout even if you don't have a (complete) pydist.json: install the build system; run the metadata generation phase of your build system to convert some metadata "in a file format that can have comments" to the json file; continue. On Tue, Mar 17, 2015 at 8:33 AM, Paul Moore
On 17 March 2015 at 11:49, Donald Stufft
wrote: I'm still not clear what you expect pip to *do* with the metadata. It's just data, there's no functionality specified in the PEP.
What pip does now with metadata, Look at it for dependency information when installing the Wheel, show it when doing ``pip show``, handle the Provides metadata making something “Provide” something else, show warnings for the obsoleted-by metadata, handle extensions (including failing if there is a critical extension we don’t understand).
Hmm, OK.
At the moment that stuff (except pip show) is all covered by the running of the egg_info command, I guess. So you're saying that pip should first check if a requirement has new-style metadata and if it does, skip the egg_info command and use pydist.json. I guess that would be good - it'd solve the problems we see with numpy-related packages that need things installed just to run setup.py egg_info.
It wasn't something I'd particularly considered, but thanks for the clarification.
Paul _______________________________________________ Distutils-SIG maillist - Distutils-SIG@python.org https://mail.python.org/mailman/listinfo/distutils-sig
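Daniel's "metadata generation phase" above could be sketched like this, using ini-style input purely as a stand-in for "a file format that can have comments" (the field names are simplified placeholders, not the PEP 426 schema):

```python
import configparser
import json


def cfg_to_pydist(cfg_text):
    """Convert commented, human-edited metadata into static JSON.

    Illustration only: a real build system would emit the full PEP 426
    structure, not this flat name/version dict.
    """
    parser = configparser.ConfigParser()
    parser.read_string(cfg_text)
    # Comments in the source format are dropped; the JSON output is the
    # static artifact that tools like pip and PyPI would consume.
    return json.dumps(dict(parser["metadata"]), sort_keys=True)


example = """
[metadata]
; comments survive in the source format but not in the generated json
name = demo
version = 1.0
"""
```

The design point is that humans edit the commented source file and only tools ever read the generated pydist.json.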
So for me, metadata, while fine to have, is not the blocker for "generating wheels with some other build system". Having the other build system is the blocker. Still Bento is a pretty great candidate. WAF is also promising.
On 17 Mar 2015 23:01, "Daniel Holth"
So for me, metadata, while fine to have, is not the blocker for "generating wheels with some other build system". Having the other build system is the blocker.
For me, it's not knowing what "done" looks like, even if a candidate alternative build system was available. I know pip doesn't need the whole setuptools feature set to generate wheels, but I don't know what subset it actually uses, nor what changes when cross compiling C extensions in Linux. As far as I'm aware, *nobody* actually knows the answer to that right now, so figuring it out will likely involve some code archaeology.
Still Bento is a pretty great candidate. WAF is also promising.
Agreed. This is where I think making pip d2to1 aware could be worthwhile - if we can find a way of making the emulation target for an alternate build system be d2to1 rather than the whole of setuptools, it may descope the interoperability problem enough to make it easier to get started on something practical. Cheers, Nick.
On Tue, Mar 17, 2015 at 9:17 AM, Nick Coghlan
On 17 Mar 2015 23:01, "Daniel Holth"
wrote: So for me, metadata, while fine to have, is not the blocker for "generating wheels with some other build system". Having the other build system is the blocker.
For me, it's not knowing what "done" looks like, even if a candidate alternative build system was available. I know pip doesn't need the whole setuptools feature set to generate wheels, but I don't know what subset it actually uses, nor what changes when cross compiling C extensions in Linux.
As far as I'm aware, *nobody* actually knows the answer to that right now, so figuring it out will likely involve some code archaeology.
Yes, it's pretty obvious what "build a wheel with everything set to default" should look like, but I can't immediately envision "pass these standardized arguments if you need to compile a wheel for ARM while running on x86". :(
On 17 March 2015 at 13:17, Nick Coghlan
Agreed. This is where I think making pip d2to1 aware could be worthwhile - if we can find a of making the emulation target for an alternate build system be d2to1 rather than the whole of setuptools it may descope the interoperability problem enough to make it easier to get started on something practical.
Could you clarify what you mean by this? I'm not sure what awareness of d2to1 is needed (I assume you're talking about the PyPI package here). Ignoring a whole lot of unpleasant details, pip's interface for building wheels is roughly '"python setup.py bdist_wheel" needs to work'. Any build system you like can implement that... (Of course the details are why you said "I know pip doesn't need the whole setuptools feature set to generate wheels, but I don't know what subset it actually uses", I understand that...) Paul
For instance, if the problem is "when setuptools does the install, then things get installed differently, with different options, SSL certs, proxies, etc" then I think a better solution is that pip does terrible hacks in order to forcibly take control of setup_requires from setuptools and installs them into a temporary directory (or something like that). That is something that would require no changes on the part of authors or people installing software, and is backwards compatible with everything that's already been published using setup_requires.
Donald, could you add a pip issue for the "forcibly take control" idea (if we don't have one already?) this comes up a fair amount, and it would be nice to be able to link to this.
In other setup_requires old news, a couple of years ago I did an
"autosetuptools" branch of pip which would automatically install
setuptools (if it was not already installed) when installing sdists.
In this case you could think of setuptools as an implicit
setup_requires member. Setuptools would not be installed if only
wheels were being installed.
It might be helpful to think of setuptools-style setup_requires
differently than "must be available before setup.py can run at all"
setup_requires.
On Tue, Mar 17, 2015 at 11:34 AM, Marcus Smith
So you *can* import things inside of a setup.py today, you just have to....
I think it's time for the Packaging User Guide to try to cover "setup_requires"...
On 03/16/2015 02:53 PM, Daniel Holth wrote:
No one should be asked to learn how to extend distutils, and in practice no one knows how.
People have been begging for years for working setup_requires, far longer than I've been interested in it, and all they want to do is
    import fetch_version
    setup(version=fetch_version(), ...)
Then they will eventually notice setup_requires has never worked the way most people expect. As a result there are too few setup.py abstractions.
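The pattern people are after might look something like this; fetch_version and the _version.py layout are hypothetical, and this variant exec()s a version file (along the lines Donald suggested) rather than importing the package, so setup.py doesn't pull in the package's runtime dependencies:

```python
import os


def fetch_version(pkg_dir):
    """Read the package's version without importing the package.

    Hypothetical helper: exec()s a tiny _version.py that contains only
    a ``__version__ = "..."`` assignment, avoiding the setup_requires
    trap where importing the package drags in all its dependencies.
    """
    namespace = {}
    with open(os.path.join(pkg_dir, "_version.py")) as f:
        exec(f.read(), namespace)
    return namespace["__version__"]
```

In a setup.py this would be called as `setup(version=fetch_version("mypkg"), ...)`.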
FWIW, this particular use case (retrieving the version by importing it or a function that returns it after it reads a file or whatever) is dodgy. It's way better that code that needs version info inside the package consult pkg_resources or some similar system:

    import pkg_resources
    version = pkg_resources.get_distribution('mydistro').version

I realize there are other use cases that setup_requires solves, and that using pkg_resources can be a performance issue. I also realize that people badly want to be able to "from mypkg import version" or "from mypkg import get_version". But I'd try to come up with a different sample use case for driving decision-making, because IMO we should dissuade them from doing that. Python packaging should be able to provide them this information; they should not need to provide it themselves.

- C
Folks, I'm having a hard time catching up with this, but maybe a few comments from someone well outside the trenches will be helpful. And note that one of the use cases I'm now wrestling with is using another package manager (conda), and setuptools is currently a PITA in that context.
you that "python setup.py ..." should install setup_requires? For me I'd be quite happy if installing the requirements was my own problem in the absence of an installer.
Yes, yes, yes! Separation of concerns is key here -- the tool that builds and/or installs one package shouldn't do ANYTHING to get me dependencies, except maybe warn that they are not there. And raise a sensible error if a build dependency is not there. Particularly for developers, they really are capable of installing dependencies.
I've no particular thoughts on that. It would certainly avoid the pain of easy_install being triggered.
Ahh! Is that why this is so painful? Not only is setuptools trying to install stuff for me, but it's using easy_install to do so? Aargh!
Success criteria for my immediate personal needs: - pip install -e . works on a clean checkout of my projects
Sure.
- easy_install doesn't go and download stuff
easy_install doesn't do anything, ever!
- my setup.py can refer to things (usually the version) inside the project itself, safely
Yeah, that would be nice.

A few other notes: If I have this right, this thread, and a number of other issues, are triggered by the fact that setup() is not declarative -- i.e. you don't have access to the metadata until it's been run. But maybe we can kludge a declarative interface on top of the existing system. Something like requiring:

    setup_params = a_big_dict_of_stuff_to_pass_to_setup
    setup(**a_big_dict_of_stuff_to_pass_to_setup)

Code could look for that big dict before running setup. If it's not there, you don't get any new functionality. Note that I'm wary of a completely declarative system; there ends up being a lot of stuff you don't want to hard-code, so you have to start building up a whole macro-expansion system, etc. I'd much rather simply let the user build up a python data structure however they want -- the default, simple cases would still be basic declarative hard-coding.

I suppose it's too late now, but the really painful parts of all this seem to be due to overly aggressive backward compatibility. We now have wheels, but also eggs; we now have pip, but also easy_install, etc. Perhaps it's time to restore "distribute" -- setuptools without the cruft. Folks could choose to use distribute (or maybe setuptools2) instead of setuptools, and not get the cruft. pip would, of course, still need to work with setuptools, and setuptools would have to be maintained, but it would give us a path forward out of the mess.

Another issue I don't see a way out of is that the package name that you use to ask for a package, say on PyPI, is not necessarily the name of the python package you can import. So it's really tricky to check if a package is installed independently of the package manager at hand. This is the source of my conda issues -- conda installs the dependencies, but setuptools doesn't know that, so it tries to do it again -- ouch.

Final note: setuptools has always bugged me, even though it provides some great features.
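[The "big dict" kludge above, as a minimal sketch — the SETUP_PARAMS name is illustrative only, not an existing convention:]

```python
# Sketch of the declarative-kludge idea: all metadata lives in one
# module-level dict, built with ordinary Python code, so a tool that
# imports (or statically parses) setup.py can read it without calling
# setup() at all.
SETUP_PARAMS = dict(
    name="example",
    version="1.2.3",
    packages=["example"],
    setup_requires=["setuptools"],
)

# A real setup.py would then end with:
#     from setuptools import setup
#     setup(**SETUP_PARAMS)
# which tools are free to skip once they have SETUP_PARAMS.
```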
I think all my struggles with it come down to a key issue: it does not make clear distinctions between what should happen at build-time vs install-time vs run-time. For example: I don't want it downloading and installing dependencies when I go to build. That's an install-time task. I don't want it selecting versions at run time -- also an install-time task.

There are others I can't recall -- but a couple years back I was bundling up an application with py2exe and py2app, and found I had to put an enormous amount of cruft in to satisfy setuptools at run time (including setuptools itself) -- it was pretty painful. And of course, using it within another package manager, like conda -- I really want it to build, and only build; I'm taking care of dependencies another way.

OK, I've had my rant! -Chris
Just a couple of comments
On 18 March 2015 at 15:33, Chris Barker - NOAA Federal
I suppose it's too late now, but the really painful parts of all this seem to be due to overly aggressive backward compatibility. We now have wheels, but also eggs, we now have pip, but also easy_install, etc.
Agreed. But the problem we have here is that any system that fails to work for even a tiny proportion of packages on PyPI is a major issue. And we don't have *any* control over those packages - if they do the most insane things in their setup.py, and don't release a new version using new tools, we have to support those insane things, or deal with the bug reports. Maybe we should say "sorry, your package needs to change or we won't help", but traditionally the worst packaging arguments have started that way (see, for example, the distribute or distutils2 flamewars). People are much more positive these days, so maybe we could do something along those lines, but it's hard to test that assumption without risking the peace...
Final note: setuptools has always bugged me, even though it provides some great features. I think all my struggles with it come down to a key issue: it does not make clear distinctions between what should happen at build-time vs install-time vs run-time.
Agreed entirely. It's a long slow process though to migrate away from the problems of setuptools without losing the great features at the same time... Thanks for your thoughts! Paul
On Wed, Mar 18, 2015 at 5:33 PM, Chris Barker - NOAA Federal < chris.barker@noaa.gov> wrote:
I don't want it downloading and installing dependencies when I go to build. That's an install- time task.
Sounds to me like you should not use setup_requires then - if you don't like what it does. Also, for the whole distutils-sig, I don't understand all the fuss around this much maligned feature - there are plenty of options to manage build-time dependencies and tasks - one certainly doesn't need to shoehorn a full blown build system into setup.py (see e.g. https://github.com/ipython/ipython/blob/master/setup.py) - there's make, invoke, shell scripts and plenty of other systems that can do that just fine. Using too many tools is bad, but misusing tools is far worse. Thanks, -- Ionel Cristian Mărieș, http://blog.ionelmc.ro
On 18 March 2015 at 16:02, Ionel Cristian Mărieș
one certainly doesn't need to shoehorn a full blown build system into setup.py - there's make, invoke, shell scripts and plenty of other systems that can do that just fine.
Just to insert a little history here, before distutils (setup.py) was invented, Python packages used all sorts of tools to build. Often shell scripts and/or make were used, which in essence meant that the packages were unusable on Windows - even if there was no need for them to be. Distutils may be bad, but it's still far better than what it replaced :-) Paul
On Wed, Mar 18, 2015 at 12:02 PM, Ionel Cristian Mărieș
On Wed, Mar 18, 2015 at 5:33 PM, Chris Barker - NOAA Federal
wrote: I don't want it downloading and installing dependencies when I go to build. That's an install- time task.
Sounds to me like you should not use setup_requires then - if you don't like what it does.
The behavior we're aiming for would be: "installer run setup.py" - installs things "python setup.py" - does not install things
On 18 Mar 2015, at 17:49, Paul Moore
wrote: On 18 March 2015 at 16:02, Ionel Cristian Mărieș
wrote: one certainly doesn't need to shoehorn a full blown build system into setup.py - there's make, invoke, shell scripts and plenty of other systems that can do that just fine.
Just to insert a little history here, before distutils (setup.py) was invented, Python packages used all sorts of tools to build. Often shell scripts and/or make were used, which in essence meant that the packages were unusable on Windows - even if there was no need for them to be.
For what it’s worth I have a C++ Python module which is built using CMake, and the experience has been extremely pleasant. CMake has lots of useful documentation, while trying to figure out how to do OS and package detection to figure out the right compile and link options for distutils is an awful experience and leads to nightmares such as https://github.com/python-pillow/Pillow/blob/master/setup.py or https://github.com/lxml/lxml/blob/master/setup.py Wichert.
Chris McDonough
FWIW, this particular use case (retrieving the version by importing it or a function that returns it after it reads a file or whatever), is dodgy. It's way better that code that needs version info inside the package consult pkg_resources or some similar system:
import pkg_resources version = pkg_resources.get_distribution('mydistro').version
That's all fine once the distribution is *installed*, and I agree ‘pkg_resources’ is appropriate for querying the version of an already-installed Python distribution. But the whole point here (AIUI) is that the ‘setup.py’ is responsible for storing that information in the distribution. And ‘setup.py’ may need to import third-party modules in order to get the version information. For many projects, the version information is best stored in a central place and ‘setup.py’ is just one consumer of many for that information. Getting the version information may itself need distributions installed (e.g. in my case, Docutils).
I realize there are other use cases that setup_requires solves, and that using pkg_resources can be a performance issue.
The issue isn't importing ‘pkg_resources’. The issue is generating the distribution, which ‘pkg_resources’ can't help with.
Python packaging should be able to provide them this information, they should not need to provide it themselves.
Once the distribution is installed: I agree. While generating the distribution – the point where ‘setup_requires’ is meant to help – no, I disagree. We're trying to get information such that it can be fed to Distutils, since Distutils can't know until it's told. -- \ “Nothing exists except atoms and empty space; everything else | `\ is opinion.” —Democritus | _o__) | Ben Finney
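[A minimal sketch of the single-source-of-truth approach Ben describes: setup.py exec's a tiny standalone version file instead of importing the package, so no third-party imports are triggered. The mypkg/_version.py layout is an assumption for illustration:]

```python
# Sketch: load __version__ from a standalone version file without
# importing the package itself (and hence without needing the package's
# runtime dependencies while generating the distribution).
# Assumed layout: mypkg/_version.py containing only __version__ = "...".

def load_version(path="mypkg/_version.py"):
    namespace = {}
    with open(path) as f:
        exec(compile(f.read(), path, "exec"), namespace)
    return namespace["__version__"]
```

Both setup.py and the package can call load_version(), so the version lives in exactly one place.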
On 19 March 2015 at 02:49, Paul Moore
On 18 March 2015 at 16:02, Ionel Cristian Mărieș
wrote: one certainly doesn't need to shoehorn a full blown build system into setup.py - there's make, invoke, shell scripts and plenty of other systems that can do that just fine.
Just to insert a little history here, before distutils (setup.py) was invented, Python packages used all sorts of tools to build. Often shell scripts and/or make were used, which in essence meant that the packages were unusable on Windows - even if there was no need for them to be.
Distutils may be bad, but it's still far better than what it replaced :-)
Aye, without distutils there's no fpm, pyp2rpm, etc. At least Fedora has migrated its packaging policy from invoking setup.py directly to invoking it via pip instead, but that wouldn't be possible without the commitment to make sure that everything that builds today keeps building tomorrow.

What's changed in the 16-17 years since distutils was first designed is the rise of open source usage on Windows and Mac OS X clients, together with the desire for development and data analysis focused user level package management on Linux (which major Linux distros currently tend not to provide, although some of us would like to if we can come up with a reasonable approach that keeps the long term sustaining engineering costs under control [1]).

The "simpler" packaging systems like npm, etc get to be simpler because they're written for specific problem domains (e.g. public cloud hosted web service development for npm), so it's much easier to cover all the relevant use cases. With setuptools/distutils the use cases are just as sprawling as the use cases for Python itself, so we're trying to cover needs that range from kids tinkering on their Raspberry Pis, to multinationals operating public cloud infrastructure, to folks writing web services, to scientists doing computational research, to financial analysts, to spooks on air-gapped networks, to industrial control systems, etc, etc, etc.

Solving the software distribution problems of any *one* niche is hard enough that you can build large profitable ecosystems on the back of them (this is why platform specific app stores are so popular), but that's a relatively simple and straightforward problem compared to figuring out how to build the backbone infrastructure that lets open source developers learn one set of software distribution tooling themselves, while still being able to feed relatively easily into all of the other downstream systems :)

Cheers, Nick.
[1] https://fedoraproject.org/wiki/Env_and_Stacks/Projects/UserLevelPackageManag... -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
On 18 March 2015 at 14:37, Daniel Holth
[...]
The behavior we're aiming for would be:
"installer run setup.py" - installs things "python setup.py" - does not install things
Besides that, I'd add that we're also looking for: "python setup.py" (by itself) should not raise ImportError, even if setup.py needs extra things installed for certain operations (egg_info, build, sdist, develop, install).

IMO, the biggest pain point is not people putting crazy stuff in setup.py to get version numbers. For me, the biggest pain point is when setup.py needs to import other packages in order to even know how to build. So I'd like to suggest the following series of small improvements to both pip and setuptools:

* setuptools: `python setup.py setup_requires` dumps its setup_requires keyword in 'requirements.txt' format.

  It's already in this format, so this should be trivial, but it allows one to do something like:

      $ python setup.py setup_requires > setup_requires.txt
      $ pip install -r setup_requires.txt

  Or in one bash line:

      $ pip install -r <( python setup.py setup_requires )

* setuptools: setup.py gains the ability to accept callables in most (all?) of its parameters. This will allow people to move all top level setup.py imports into functions, so that we can turn code like this:

      from setuptools import setup, Extension
      import numpy

      setup(ext_modules=[
          Extension("_cos_doubles",
                    sources=["cos_doubles.c", "cos_doubles.i"],
                    include_dirs=[numpy.get_include()])])

  Into this:

      from setuptools import setup, Extension

      def ext_modules():
          import numpy
          return [
              Extension("_cos_doubles",
                        sources=["cos_doubles.c", "cos_doubles.i"],
                        include_dirs=[numpy.get_include()])
          ]

      setup(ext_modules=ext_modules,
            setup_requires=['setuptools'])

* pip: When working with an sdist, before running "setup.py egg_info" in a sandbox, pip would run "setup.py setup_requires", install those packages in the sandbox (not in the main environment), then run "egg_info", "wheel", etc.

Notice that the changes proposed above are all backward compatible, create no additional pain, and allow developers to move all top level setup.py craziness inside functions.
After that, we can consider making setup.py not call the easy_install functionality when it finds a setup_requires keyword while running other commands, but just report if those packages are not available. PS: Yes, I've already proposed something similar recently: https://mail.python.org/pipermail/distutils-sig/2015-January/025682.html
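[The proposed `setup_requires` command does not exist in setuptools; as a hedged sketch, it could be a minimal distutils-style command that dumps the keyword one requirement per line:]

```python
# Sketch only: a hypothetical 'setup_requires' command that prints the
# distribution's setup_requires list in requirements.txt format.
from setuptools import Command


class setup_requires(Command):
    """Print the distribution's setup_requires, one per line."""

    description = "print setup_requires in requirements.txt format"
    user_options = []

    def initialize_options(self):
        pass

    def finalize_options(self):
        pass

    def run(self):
        # setuptools stores the keyword on the Distribution object;
        # fall back to an empty list when it was never passed.
        for req in getattr(self.distribution, "setup_requires", None) or []:
            print(req)
```

Registered via cmdclass (or an entry point), `python setup.py setup_requires > setup_requires.txt` would then feed straight into `pip install -r`, as described above.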
If that's what you want then we could say the spec was to put the
requirements in setup_requires.txt, in the requirements.txt format,
which pip would eventually look for and install before running
setup.py
On Thu, Mar 19, 2015 at 9:32 AM, Leonardo Rochael Almeida
... except that there are plenty of reasons we wouldn't want the
requirements.txt format, mainly because pip shouldn't automatically
install concrete dependencies that contain git:// urls etc.
On Thu, Mar 19, 2015 at 9:57 AM, Daniel Holth
On Wed, Mar 18, 2015 at 9:02 AM, Ionel Cristian Mărieș
On Wed, Mar 18, 2015 at 5:33 PM, Chris Barker - NOAA Federal < chris.barker@noaa.gov> wrote:
I don't want it downloading and installing dependencies when I go to build. That's an install- time task.
Sounds to me like you should not use setup_requires then - if you don't like what it does.
My use case at the moment is trying to build conda packages from other people's Python packages - if they use setup_requires, etc, then I'm stuck with it. Also -- for my packages, I want them to be easy to build and deploy by others that aren't using conda -- so I need a way to do that - which would be setuptools' features. So I'd like the features of the "official" python packaging tools to be cleanly separated and not assume that if you're using setuptools you are also using pip, etc.... Also, for the whole distutils-sig, I don't understand all the fuss around
this much maligned feature - there are plenty of options to manage build-time dependencies and tasks - one certainly doesn't need to shoehorn a full blown build system into setup.py (see e.g. https://github.com/ipython/ipython/blob/master/setup.py) - there's make, invoke, shell scripts and plenty of other systems that can do that just fine.
None of those are cross platform, though. That still may be the way to go. I like to keep in mind that with all this pain, in fact, even raw distutils is freaking awesome at making the easy stuff easy (pip and PyPI too...) -- i.e. I can write a simple C extension (or Cython, even), and a very simple boilerplate setup.py will let it build and install on all major platforms out of the box. Then I put it up on PyPI and anyone can do a "pip install my_package" and away they go. -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker@noaa.gov
On Wed, Mar 18, 2015 at 10:37 AM, Daniel Holth
The behavior we're aiming for would be:
"installer run setup.py" - installs things "python setup.py" - does not install things
yup. Which, now that I look at it, is not so different than:

    python setup.py build      # does not install anything
    python setup.py install    # only installs the particular package
    pip install setup.py       # uses pip to find and install the dependencies

(maybe not pass the setup.py directly, but maybe?) And could we get there with:

    python setup.py build --no-deps
    python setup.py install --no-deps

(I'd like the no-deps flag to be the default, but that probably would have to wait for a deprecation period.)

None of this solves the "how to get meta-data without installing the package" problem -- which I think is what started this thread. For that, it seems the really hacky way to get there is to establish a meta-data standard to be put in setup.py -- a bunch of standard names to be defined in the module namespace:

    package_version = "1.2.3"
    setup_requires = ["packagea", "packageb>=2.3"]
    ...

(or maybe all in a big dict:

    package_meta_data = {
        "package_version": "1.2.3",
        "setup_requires": ["packagea", "packageb>=2.3"],
        ...
    }

those values would be passed in to setup() as well, of course). That way, install tools, etc, could import the setup.py, not run setup, and have access to the meta data. Of course, this would only work with packages that followed the standard, and it would be a long time until it was common, but we've got to have a direction to head to.

-Chris
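[A tool could even avoid importing setup.py entirely and read the proposed module-level names statically — a sketch; package_version and setup_requires are the hypothetical names from the message above:]

```python
# Sketch: statically walk setup.py's AST and collect simple constant
# assignments to the proposed metadata names, without executing any of
# the file's code.
import ast


def read_setup_metadata(path, names=("package_version", "setup_requires")):
    with open(path) as f:
        tree = ast.parse(f.read(), filename=path)
    found = {}
    for node in tree.body:
        if not isinstance(node, ast.Assign):
            continue
        for target in node.targets:
            if isinstance(target, ast.Name) and target.id in names:
                try:
                    found[target.id] = ast.literal_eval(node.value)
                except ValueError:
                    pass  # not a simple constant; ignore it
    return found
```

Static parsing sidesteps the "importing setup.py runs setup()" problem, at the cost of only seeing hard-coded values.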
On Thu, Mar 19, 2015 at 7:12 AM, Daniel Holth
... except that there are plenty of reasons we wouldn't want the requirements.txt format, mainly because pip shouldn't automatically install concrete dependencies that contain git:// urls etc.
is that a format problem, or a pip feature issue? And this is a one-way street -- setuptools would dump a list of requirements -- would it ever HAVE a git:// url to dump?

-CHB
On Thu, Mar 19, 2015 at 5:38 PM, Chris Barker
My use case at the moment is trying to build conda packages from other peoples' Python packages - if they use setup_requires, etc, then I'm stuck with it.
Worth considering, if you can afford it, to have a local patch that you apply before building. Then you have all the necessary fixes (like remove the setup_requires) in that patch file. This is a popular approach in Debian packages - they can have all kinds of fixes for the upstream code. Thanks, -- Ionel Cristian Mărieș, http://blog.ionelmc.ro
On Thu, Mar 19, 2015 at 6:57 AM, Daniel Holth
If that's what you want then we could say the spec was to put the requirements in setup_requires.txt, in the requirements.txt format, which pip would eventually look for and install before running setup.py
yes, that would be great -- and while we are at it, put the run-time dependencies in requirements.txt too. I brought this up a while ago, and it seems that requirements.txt is for applications, and setting install_requires in the setup.py is for package dependencies. But as we've seen, this creates problems -- so why not just keep all the dependency info in an external file??? Though this would not be backward compatible with all the setup.pys out there in the wild now... -Chris
On Thu, Mar 19, 2015 at 9:12 AM, Ionel Cristian Mărieș
Worth considering, if you can afford it, to have a local patch that you apply before building. Then you have all the necessary fixes (like remove the setup_requires) in that patch file.
yup -- that's an option -- but a really painful one! I did, in fact, find an incantation that works:

    $PYTHON setup.py install --single-version-externally-managed --record=/tmp/record.txt

but boy, is that ugly, and hard to remember. Why not a --no-deps flag? (And I have no idea what the --record thing is, or if it's even necessary...)

-Chris
The --record is for making a list of installed files. You don't need it if
you don't use record.txt anywhere.
As for --single-version-externally-managed, that's unrelated to your
setup_requires pain - you probably already have the eggs around, so they
aren't redownloaded. What --single-version-externally-managed does is force
the package to install in non-egg form (as distutils would). That also
means only setup.py that uses setuptools will have the
--single-version-externally-managed option available.
Thanks,
-- Ionel Cristian Mărieș, http://blog.ionelmc.ro
On Wed, Mar 18, 2015 at 8:43 AM, Paul Moore
I suppose it's too late now, but the really painful parts of all this seem to be due to overly aggressive backward compatibility. We now have wheels, but also eggs, we now have pip, but also easy_install, etc.
Agreed. But the problem we have here is that any system that fails to work for even a tiny proportion of packages on PyPI is a major issue. And we don't have *any* control over those packages - if they do the most insane things in their setup.py, and don't release a new version using new tools, we have to support those insane things, or deal with the bug reports.
Maybe we should say "sorry, your package needs to change or we won't help" ... that way (see, for example, the distribute or distutils2 flamewars).

Indeed -- I agree that it's key to support all the old cruft -- but it's key to support that with the package manager / installation tool, i.e. pip. We want pip install to "just work" for most of the packages already on PyPI, for sure. But that doesn't mean that the newer-and-better setuptools needs to support all the same old cruft.

If it were called something different (distribute? ;-) ), then folks could simply replace `from setuptools import setup` with `from distribute import setup` -- but they would only make that change if they wanted to make it. Of course, then we'd be supporting both setuptools and distribute, and having to make sure that pip (and wheel) worked with both... so maybe that's just too much of a maintenance headache. But breaking backward compatibility gets you a way forward that keeping it does not (py3, anyone?).

I suppose the greater danger is that every feature in setuptools is there because someone wanted it -- so it would be easy for the "new" thing to grow all the same cruft...
IIRC, distribute was always imported as "setuptools" -- so it was born to create strife and/or accumulate all the same cruft. I guess I have no idea whether there was a big problem with the architecture of setuptools requiring a big shift -- all I see are problems with the API and feature set... and by definition you can't change those and stay backward compatible...
Agreed entirely. It's a long slow process though to migrate away from the problems of setuptools without losing the great features at the same time...
That slog is MUCH longer and harder if you need to keep backward compatibility, though. But I suppose the alternative is to build something no one uses!

Is there any move to have a deprecation process for setuptools features?

-Chris
On Thu, Mar 19, 2015 at 9:26 AM, Ionel Cristian Mărieș
The --record is for making a list of installed files. You don't need it if you don't use record.txt anywhere.
thanks -- I'll take that out... This was a cut-and-paste from the net after much frustration -- once I got something that worked, I decided I was done -- I had no energy for figuring out why it worked...
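(Aside: the record file Ionel mentions is nothing magic -- just a newline-separated list of the paths that were installed. A minimal sketch of reading it back; the helper name is invented for illustration:)

```python
# Minimal sketch: read back the file produced by
#   setup.py install --record=record.txt
# It is a plain list of installed paths, one per line.
def read_record(path):
    """Return the list of files recorded by --record."""
    with open(path) as f:
        return [line.strip() for line in f if line.strip()]
```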
As for --single-version-externally-managed, that's unrelated to your setup_requires pain - you probably already have the eggs around, so they aren't redownloaded.
well, what conda does to build a package is create a whole new empty environment, then install the dependencies (itself, without pip or easy_install, or...), then runs setup.py install (for python packages anyway). In this case, that step failed, or got ugly, anyway, as setuptools didn't think the dependent packages were installed, so tried to install them itself -- maybe that's because the dependency wasn't installed as an egg? I can't recall at the moment whether that failed (I think so, but not sure why), but I certainly didn't want all those eggs re-installed.
What --single-version-externally-managed does is force the package to install in non-egg form (as distutils would).
hmm -- interesting -- this really was a dependency issue -- so it must change _something_ about how it looks for dependencies...
That also means only setup.py that uses setuptools will have the --single-version-externally-managed option available.
yup -- so I need to tack that on when needed, and can't just do it for all python packages... Thanks -- that does make things a bit more clear! -CHB
On 19 March 2015 at 16:46, Chris Barker
I guess I have no idea if there was a big problem with the architecture of setuptools requiring a big shift -- all I see are problems with the API and feature set.....and by definition you can't change those and be backward compatible...
The ideal situation (at least in my mind) is to define a clear and straightforward command line API that pip uses to interface with the build system.

So basically, if we say that all pip needs to work is "python setup.py build" or whatever, then people can write any setup.py they like, using distutils, setuptools, bento, or even just a custom script.

The hard bit is being completely clear on what the setup.py invocation is required to *do*. At the moment, it's "whatever setuptools/distutils does", which is why we end up relying on all sorts of obscure setuptools behaviours.
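A rough sketch of that shape -- not an existing interface, and the command names here are invented for illustration -- would be a setup.py that is just a plain script honoring a small, documented command-line contract:

```python
# Hypothetical sketch: a setup.py that is just a script implementing a
# small command-line contract an installer could rely on. The command
# names ("build", "sdist") are illustrative, not a spec.
import sys

def build():
    # invoke whatever build system the project actually uses
    print("building...")

def sdist():
    print("creating sdist...")

COMMANDS = {"build": build, "sdist": sdist}

def main(argv):
    if len(argv) < 2 or argv[1] not in COMMANDS:
        raise SystemExit("usage: setup.py [%s]" % "|".join(sorted(COMMANDS)))
    COMMANDS[argv[1]]()

if __name__ == "__main__":
    main(sys.argv)
```

The point is not this particular script, but that the installer would only ever depend on the documented commands, never on distutils/setuptools internals.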
That slog is MUCH longer and harder if you need to keep backward compatibility though.
But I suppose the alternative is to build something no one uses!
Is there any move to have a deprecation process for setuptools features?
I have no idea, unfortunately. Setuptools has its own goals and plans that I don't particularly follow. I get the impression that maintaining existing behaviour is a fairly high priority, though - there are a *lot* of projects that use all sorts of obscure parts of setuptools, so backward compatibility is probably pretty vital to them.

Paul
The reason you should not have to run setup.py to dump out the
setup-requires is that, in the what-people-tend-to-expect definition,
setup.py cannot run without those requirements being installed first.
There is a similar problem with putting setup-requires in the PEP 426
Metadata. As long as we have setup.py we are going to have to generate
the PEP 426 metadata from setup.py itself for any legacy package or
any package where setup.py cannot be overhauled. One option would be
to have a partial PEP 426 metadata for just the setup-requires which
would be overwritten into a .dist-info directory by the data from
setup.py itself when that runs.
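A hedged sketch of what that partial metadata might look like -- the file name and key names here are illustrative, not taken from PEP 426:

```python
# Illustrative sketch of "partial PEP 426 metadata for just the
# setup-requires": a small JSON file an installer could read before
# running setup.py, later overwritten by the full metadata generated
# from setup.py itself. Key and file names are made up for the example.
import json

def write_partial_metadata(path, setup_requires):
    partial = {
        "metadata_version": "2.0",
        "setup_requires": list(setup_requires),
    }
    with open(path, "w") as f:
        json.dump(partial, f, indent=2)

def read_partial_setup_requires(path):
    with open(path) as f:
        return json.load(f).get("setup_requires", [])
```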
_______________________________________________
Distutils-SIG maillist - Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig
On 19 Mar 2015 23:33, "Leonardo Rochael Almeida"
On 18 March 2015 at 14:37, Daniel Holth
wrote: [...]
The behavior we're aiming for would be:
"installer runs setup.py" - installs things
"python setup.py" - does not install things
Besides that, I'd add that we're also looking for: "python setup.py" (by
itself) should not raise ImportError, even if setup.py needs extra things installed for certain operations (egg_info, build, sdist, develop, install).
IMO, the biggest pain point is not people putting crazy stuff in setup.py to get version numbers. For me, the biggest pain point is when setup.py needs to import other packages in order to even know how to build.
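For the version-number half of the problem, the trick mentioned earlier in the thread -- reading the version out of the source instead of importing the package -- can be sketched like this (the path and helper name are illustrative):

```python
# Sketch of reading __version__ without importing the package, so that
# setup.py itself needs no install-time dependencies. Real projects
# would point this at their own version module.
import re

def version_from_file(path):
    with open(path) as f:
        source = f.read()
    match = re.search(r"__version__\s*=\s*['\"]([^'\"]+)['\"]", source)
    if match is None:
        raise ValueError("no __version__ in %s" % path)
    return match.group(1)
```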
So I'd like to suggest the following series of small improvements to both
pip and setuptools:
* setuptools: `python setup.py setup_requires` dumps its setup_requires keyword in 'requirements.txt' format

I believe setuptools can already do this (as "setup-requirements.txt"), but it's a generated file that people tend not to check into source control. Saying that file *should* be checked into source control (and teaching pip about it when looking for dependencies) might be a reasonable improvement - CPython certainly checks in several generated files to reduce the number of tools needed to build CPython in the typical case.

Cheers,
Nick.
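For concreteness, the behaviour proposed above amounts to something like the following -- the command itself doesn't exist in setuptools today, so this only illustrates the output format (one requirement per line):

```python
# Sketch of the proposed `python setup.py setup_requires` output:
# given the keyword arguments a project passes to setup(), emit the
# setup_requires list in requirements.txt format, one per line.
def dump_setup_requires(setup_kwargs):
    reqs = setup_kwargs.get("setup_requires", []) or []
    return "\n".join(reqs)
```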
On Thu, Mar 19, 2015 at 9:56 AM, Chris Barker
On Thu, Mar 19, 2015 at 9:26 AM, Ionel Cristian Mărieș
wrote:
The --record is for making a list of installed files. You don't need it if you don't use record.txt anywhere.
thanks -- I'll take that out...
Actually, I took that out, and got:

    running install
    error: You must specify --record or --root when building system packages

so it's needed, I guess. By the way, the error I get if I do a raw setup.py install is:

"""
RuntimeError: Setuptools downloading is disabled in conda build. Be sure to add all dependencies in the meta.yaml url=https://pypi.python.org/simple/petulant-bear/r
Command failed: /bin/bash -x -e /Users/chris.barker/PythonStuff/IOOS_packages/conda-recipes/wicken/build.sh
"""

so setuptools is trying to install petulant-bear, but conda has disabled that. But it is, in fact, installed, conda having done that to prepare the environment. So this is why I just want to tell setuptools not to try to download and install dependencies.

But we're getting off topic here -- should probably put in a feature request for a "--no-deps" flag for the install and build commands.

-CHB
On Thu, Mar 19, 2015 at 10:38 PM, Nick Coghlan
I believe setuptools can already do this (as "setup-requirements.txt"), but it's a generated file that people tend not to check into source control.
Isn't that just some project's convention - they just read it in setup.py? Setuptools doesn't do anything with it by itself.

Also, if pip were to support a setup-requirements.txt, should setuptools also support that natively? What about "repository url" dependencies?

Thanks,
-- Ionel Cristian Mărieș, http://blog.ionelmc.ro
On 20 Mar 2015 09:07, "Ionel Cristian Mărieș"
On Thu, Mar 19, 2015 at 10:38 PM, Nick Coghlan
wrote: I believe setuptools can already do this (as "setup-requirements.txt"),
but it's a generated file that people tend not to check into source control.
Isn't that just some project's convention - they just read it in setup.py? Setuptools doesn't do anything with it by itself.

I mean the setuptools feature that writes "setup_requires.txt" to the metadata directory, so you can read it without running setup.py again. However, looking at https://pythonhosted.org/setuptools/history.html shows that both times it has been added (8.4 and 12.4), the feature has had to be reverted due to breaking upgrades from earlier versions :(

As long as setuptools lacks the ability to generate that file, I suspect this discussion will remain largely theoretical.

Regards,
Nick.
On 17 March 2015 at 15:36, Robert Collins
So, the proposed plan going forward:
- now:
  - I will put a minimal patch up for pip into the tracker and ask for feedback here and there
Feedback solicited! https://github.com/pypa/pip/pull/2603

The week's delay was refactoring the guts of prepare_files to make the patch small. Note too that the patch has probably got code in the wrong place etc. - but it's small enough to reason about :).
- we can debate at that point whether bits of it should be in setuptools or not, etc.
- likewise, we can debate the use of a temporary environment vs. installing into the target environment at that point
- future:
  - in the metabuild thing that is planned long term, handling this particular option will be placed within the setuptools plugin for it, making this not something that needs to be a 'standard'.
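For reference, the pip side of the proposal (reading setup_requires out of setup.cfg before setup.py ever runs) can be sketched roughly like this -- the section and option names are placeholders for whatever the patch settles on:

```python
# Rough sketch of reading setup_requires from setup.cfg without
# executing setup.py. The [metadata] section and option names here
# are placeholders; the actual pip patch may use different ones.
import configparser

SAMPLE_SETUP_CFG = """\
[metadata]
setup_requires =
    pbr>=0.10
    setuptools>=8
"""

def read_setup_requires(text):
    cp = configparser.ConfigParser()
    cp.read_string(text)
    raw = cp.get("metadata", "setup_requires", fallback="")
    return [line.strip() for line in raw.splitlines() if line.strip()]
```

Because the file is declarative, pip could install those requirements into the build environment first, and setuptools' own setup_requires handling would then find them already satisfied.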
-Rob
--
Robert Collins
participants (14)
- Ben Finney
- Chris Barker
- Chris Barker - NOAA Federal
- Chris McDonough
- Daniel Holth
- Donald Stufft
- Ionel Cristian Mărieș
- Leonardo Rochael Almeida
- Marcus Smith
- Nick Coghlan
- Paul Moore
- Robert Collins
- Steve Dower
- Wichert Akkerman