New submission from Salvo Tomaselli:
I have a python thing that I'm installing in a custom path, using
setup.py install --root=/tmp/turi13 --install-purelib=custom/path
However, if the setup.py file contains a data_files field, then, unless I also specify --install-data, the --install-purelib parameter gets ignored.
I think an error should be generated and the entire install should fail instead. The same goes for all of the --install-* parameters.
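A minimal setup.py that would reproduce this might look like the following (the module and file names here are made up for illustration, not from the original report); with data_files present, the install paths silently revert to the defaults unless --install-data is also passed:

```python
# Hypothetical minimal setup.py: the data_files entry is what triggers
# the path change described above.
from distutils.core import setup

def main():
    setup(
        name="example",
        version="0.1",
        py_modules=["example"],
        # With data_files present, --install-purelib is ignored unless
        # --install-data is also given on the command line.
        data_files=[("share/example", ["example.cfg"])],
    )

if __name__ == "__main__":
    main()
```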
title: Install path changes when data_files is present
Setuptools tracker <setuptools(a)bugs.python.org>
I have a question that I haven't been able to find an answer to online or in the Python documentation.
While trying to install a module I have encountered significant trouble,
stalling my current experiment.
I recently installed Python 3.6.2, and I want to know how to transfer the
numerous modules I have on 3.6.1 to the newer version so that they are
usable by the updated interpreter.
Since I was not able to transfer packages to the newer version, I tried to
install my module to the working python 3.6.1 version, but similarly, could
not find any documentation about how to do this.
I also tried to completely uninstall the Python 3.6.2 build, but after
running sudo remove commands and then testing by launching python3, the
build running is still 3.6.2, even after removing the framework etc.
I am sincerely confused by this and it has unfortunately brought my
experiment to a standstill. If you could offer any illumination on this, I
would be extremely grateful.
I truly thank you in advance for any help you can provide.
I've seen people claim packages on the Support Requests
<https://sourceforge.net/p/pypi/support-requests/?source=navbar> page on
SourceForge, and I've done so myself. However, most of these requests
seem to go unnoticed. PEP 541 does not seem to specify any way of
actually performing claims to transfer ownership of packages. In short:
What is the procedure to claim a project on PyPI?
After reviewing the PEP, I personally feel that it relies too much on the backend Doing The Right Thing. As it currently stands, it is my understanding that the build backend is called in the source directory and is then responsible for handling the entire process to build a compliant wheel. In a PEP 517 world, the build backends may be poorly written initially and may not fully comply with the PEP’s goal to produce the same wheel that would be produced from an sdist.
- Today the frontend simply calls the backend's build-wheel hook in the source directory; instead, the proposal is:
- Frontend copies source tree to temporary directory
- Frontend invokes build-sdist to build an sdist
- Frontend extracts sdist to new temporary directory
- Frontend reloads backend from sdist directory and invokes build-wheel
The proposed process is more computationally intensive, but moves compliance logic out of the build backend. In addition, we can make some modifications based on this proposed process:
- Remove the build_dir parameter from build_wheel, since it is always the current directory
- Optimization: add a new parameter to build_sdist, copy_extra, which when set to false includes in the sdist only the files needed to build a wheel
- Specify that build_wheel should fail if PKG-INFO is not present, so that it is always invoked inside an unpacked sdist
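Assuming the hooks keep roughly their current shape, the frontend-driven flow above can be sketched as follows; ToyBackend is a made-up stand-in for a real backend, and the function names are illustrative rather than the final PEP 517 API:

```python
import os
import shutil
import tarfile
import tempfile

class ToyBackend:
    """Stand-in backend: build_sdist tars up the source tree, and
    build_wheel refuses to run unless PKG-INFO is present."""

    def build_sdist(self, source_dir, sdist_dir):
        # Write PKG-INFO so the archive is recognizable as an sdist.
        open(os.path.join(source_dir, "PKG-INFO"), "w").close()
        sdist_path = os.path.join(sdist_dir, "pkg-0.1.tar.gz")
        with tarfile.open(sdist_path, "w:gz") as tar:
            tar.add(source_dir, arcname="pkg-0.1")
        return sdist_path

    def build_wheel(self, wheel_dir):
        # Per the proposal: fail unless invoked inside an unpacked sdist.
        if not os.path.exists("PKG-INFO"):
            raise RuntimeError("build_wheel must run in an unpacked sdist")
        wheel_path = os.path.join(wheel_dir, "pkg-0.1-py3-none-any.whl")
        open(wheel_path, "w").close()
        return wheel_path

def frontend_build_wheel(backend, source_dir, wheel_dir):
    """Always build the wheel from an sdist, never from the raw tree."""
    wheel_dir = os.path.abspath(wheel_dir)
    with tempfile.TemporaryDirectory() as tmp:
        # 1. Copy the source tree so the backend cannot dirty it.
        work = os.path.join(tmp, "src")
        shutil.copytree(source_dir, work)
        # 2. Ask the backend to build an sdist from the copy.
        sdist = backend.build_sdist(work, tmp)
        # 3. Extract the sdist into a fresh directory.
        unpack = os.path.join(tmp, "unpacked")
        with tarfile.open(sdist) as tar:
            tar.extractall(unpack)
        (root,) = os.listdir(unpack)  # sdists have one top-level dir
        # 4. Invoke build-wheel from inside the unpacked sdist.
        old_cwd = os.getcwd()
        os.chdir(os.path.join(unpack, root))
        try:
            return backend.build_wheel(wheel_dir)
        finally:
            os.chdir(old_cwd)
```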
I realize that I may have pushed to have this completed earlier, but I agree with Donald that we need to enforce a process rather than hope for the best.
I'm trying to create multiple C++ extensions that have dependencies between
them. This works fine on Linux with gcc, but I get link failures on Windows
with MSVC due to this:
"*IMPLIB*" is an optional argument; when it is not specified, the generated
exp/lib files default to the same filename and directory as the DLL.
If precompiled extensions are to be published with MSVC and are expected to
be linked against (maybe this is not recommended?), the lib files are
required, so it seems a bit peculiar to me not to ship them alongside the
DLL, as is the default.
Proof of concept
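One possible workaround, sketched here under my own assumptions (the helper name is made up, and wiring the flag through extra_link_args is not an established distutils convention), is to pass an explicit /IMPLIB so the import library lands next to the built extension:

```python
import os

def implib_arg(dll_path):
    """Build an MSVC /IMPLIB linker argument so that foo.pyd (or foo.dll)
    gets its import library foo.lib generated in the same directory."""
    base, _ext = os.path.splitext(dll_path)
    return "/IMPLIB:" + base + ".lib"

# The result would be appended to an Extension's extra_link_args, e.g.:
#   Extension("pkg.core", sources=[...],
#             extra_link_args=[implib_arg("build/lib/pkg/core.pyd")])
```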
I was interested in working on the following issue but thought it
would be worth asking this list about first:
If pip-installing a dependency from a VCS url of the following form:
and "sometag" is the hash of a commit, should pip interpret it as a
commit hash, or should it first check for a branch or tag of that name?
One argument in favor of interpreting it as a commit hash is greater
determinism. If someone later adds a branch or tag with that name, the
dependency will continue to resolve to the original commit. (However,
someone could still remove the original commit before adding a branch
or tag with that name.) Another argument is that, without this
behavior, once a branch or tag with that name is added, there will no
longer be a way to refer to the original commit.
Either way, I think the log message can be improved. Every time I see
that log message, I think to myself, "of course it's a commit, it's a
40-character hexadecimal string!"
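For illustration only (this is my own sketch, not pip's actual logic), the heuristic the log message hints at is easy to state: a ref that is exactly 40 hexadecimal characters is almost certainly a full commit SHA:

```python
import re

# A full git SHA-1 is exactly 40 hex digits; branch and tag names
# virtually never match this pattern (though nothing forbids it).
_SHA1_RE = re.compile(r"\A[0-9a-fA-F]{40}\Z")

def looks_like_commit(ref):
    """Return True if `ref` looks like a full commit hash."""
    return bool(_SHA1_RE.match(ref))
```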
I just had a dreadful day dealing with PyPI and the migrations that are
happening. I'm posting this rant here in hopes of reaching the people who
can fix it (apologies if I'm in the wrong list), or at least offer an
explanation of the events that happened.
- Any and all feedback is welcome, but please be charitable.
- I'm angry *at* things, I hope not to offend anybody.
- I'm posting this rant here and not reddit or hn to ensure the discussion
stays between the interested parties and not the whole internet.
For formatting reasons the text is here:
Thank you for your attention
I have been attempting to resolve the horrible easy_install behavior where compiled wheels published on PyPI are not used to fulfill setup requirements. To address this issue, I have submitted pull requests to both setuptools and pip that would allow a proprietary communication method that can later be easily adjusted to be fully PEP 517 compliant, saving work while resolving a specific issue.
However, everyone appears to be ignoring these pull requests, in my estimation because PEP 517 is not yet finalized. That's fine, but partial PEP 517 compliance in one area is better than nothing at all, which is the current default scenario.
So looking over the big threads from the last week, I *think* I have a
reasonable sense of what different people think about the different
trade-offs here... but I'm not sure. And I'm guessing I'm not the only
one feeling uncertain here.
So in this post I'm not going to argue for anything at all! It's
purely for information gathering, to try and stop, take stock, and
figure out where everyone actually is at this point.
To make this concrete: I'm *pretty* sure (?) that at this point all
the basic elements in my "simplified" rewrite are things that we now
have consensus are needed in some form, so maybe we can use that as a
kind of "minimal core" reference point:
(Again, I'm NOT arguing for this right now; I just figure there's been
enough stuff flying by that it's useful to have a concrete reference
point.)
And I'd really appreciate hearing answers to these questions, from at
least Daniel, Donald, Nick, Paul, Thomas, and of course anyone else
who wants to speak up:
- Is there anything missing from this "minimal core" that *you or your users* need?
- Is there anything missing that you think will cause problems for *others*?
- Is there anything in there that you feel adds unnecessary complexity?
- Is there anything missing that you think is definitely something
we'll want eventually? Is it something where writing a small dedicated
followup PEP would produce a materially worse outcome than including
it from the start?
PS: just for the record, the link above is almost identical to the
version I sent before; main change is some TODO notes added at the
end. Diff view:
Nathaniel J. Smith -- https://vorpus.org