> On 15 Mar 2019, at 21:47, Simon Ruggier <simon80(a)gmail.com> wrote:
> The packages have to be available online for dependency links to work, yes, but they're not public: one needs to authenticate with an SSH key to clone each repository.
Both --find-links and --index-url (or --extra-index-url) support securing the download with authentication (HTTP Basic Auth).
>> On March 13, 2019 2:09:03 PM UTC, Tzu-ping Chung <uranusjr(a)gmail.com> wrote:
>>> On 6 Mar 2019, at 03:53, Simon <simon80(a)gmail.com> wrote:
>>> I hope it's not an issue that I'm replying to a month-old thread. I reviewed the previous discussion to try to avoid duplicating any of it.
>>> When using pip with PyPI, calling pip a second time is much quicker than the first time, because it verifies that the requirements, including version constraints, are satisfied in the target environment and doesn't needlessly reinstall stuff.
>>> Dependency links allowed the same behaviour to be implemented for private packages with dependencies on other private repositories: given a requirement B >= 3 and a dependency link that B was available from, pip could check if the environment already includes a package B with a new enough version, and only use the dependency link as a fallback if the requirement isn't already satisfied.
>>> URL specifiers aren't useful for providing a fallback location to get a package from, because using one prevents the package from specifying a version constraint in the same way that was possible with dependency links, or with normal requirements available from PyPI. Curiously, discussion of version constraints in this thread has focused on how nonsensical it would be to compare them to the specifying URL, ignoring the possibility of comparing the constraint with the target environment.
>>> The loss of this functionality means that anyone who was previously using pip to automatically install private packages with private dependencies now has to either forgo automatic dependency management (a large part of why one would use a package manager to begin with) in favour of recursively specified requirements files, publish their private packages somewhere so that pip can find them, or stick with pip 18.1 for now.
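(The "check the environment first, fall back to the link" behaviour described above could be sketched roughly like this; `satisfied` and the install step are illustrative names, and real pip compares versions with packaging.version rather than naive tuples:)

```python
# Sketch of the fallback behaviour described above: only reach for the
# dependency link when the requirement isn't already met locally.
from importlib import metadata

def satisfied(dist_name, minimum):
    """Return True if dist_name is installed with version >= minimum.

    Naive numeric-tuple comparison; real pip uses packaging.version.
    """
    try:
        installed = metadata.version(dist_name)
    except metadata.PackageNotFoundError:
        return False
    as_tuple = lambda v: tuple(int(p) for p in v.split(".") if p.isdigit())
    return as_tuple(installed) >= as_tuple(minimum)

# Given a requirement B >= 3 and a dependency link for B:
# if not satisfied("B", "3"):
#     install_from_link("git+ssh://...")  # hypothetical fallback step
```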
>> Wouldn’t you still need to “publish the private packages somewhere” for dependency links to work? `setup.py sdist` with `pip --find-links` can get you very far; the only differences IMO is you have to provide a proper package, and write a simple HTML file to point to it.
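(For what it's worth, the "simple HTML file" can be generated with a few lines of Python; pip only needs anchor tags whose filenames encode name and version. The sdist filenames below are made-up examples:)

```python
# Generate a minimal flat-HTML page for `pip install --find-links ...`.
# The sdists listed here are hypothetical files served next to the page.
import html

sdists = ["B-3.0.tar.gz", "mypkg-1.2.tar.gz"]

links = "\n".join(
    f'<a href="{html.escape(name)}">{html.escape(name)}</a><br/>'
    for name in sdists
)
with open("index.html", "w", encoding="utf-8") as f:
    f.write(f"<html><body>\n{links}\n</body></html>\n")
```

Serving that page (plus the sdists) behind HTTP Basic Auth and pointing --find-links at it covers the private-package case.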
>>> Distutils-SIG mailing list -- distutils-sig(a)python.org
>>> To unsubscribe send an email to distutils-sig-leave(a)python.org
>>> Message archived at https://email@example.com/message/BALD…
It is very common for us to communicate via GitHub, Telegram, or WhatsApp. Why not make Python a fully self-manageable communication medium? This would be my contribution to making Python the greatest language ever written in the world. What if we used Python with internal media? For example, if we need help: no servers, we all connect, with Twitter or Telegram inside Python. In that case, it is client to client.
Hi, how are you?
I'm trying to share an idea about pip, the package management system used to install and manage software packages written in the Python programming language.
I wanted it to be automatic. Instead of always typing ...
pip install example
What if Python itself had this automatic feature internally?
import example  # automatically runs pip install for 'example'
A. If you do not have the module, the module will be installed.
B. If you have the module, check the installed version.
It would be like Node.js's package.json, but within Python.
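(The idea could be prototyped today without changing the interpreter, for example as a helper that falls back to pip on ImportError; a sketch, with placeholder names:)

```python
# Sketch: import a module, installing its distribution on demand.
import importlib
import subprocess
import sys

def import_or_install(module_name, dist_name=None):
    """Import module_name; on ImportError, pip-install dist_name and retry."""
    try:
        return importlib.import_module(module_name)
    except ImportError:
        subprocess.check_call(
            [sys.executable, "-m", "pip", "install", dist_name or module_name]
        )
        return importlib.import_module(module_name)

json_mod = import_or_install("json")  # already present, so no pip call
```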
I look forward to an answer.
Thank you for your attention and respect.
I'm starting with Python and I have this idea.
If the community helps, and if I can help, it would be an honour to see this idea implemented ^^
I tried to get an answer on GitHub and was asked to bring it here. If the discussion goes well, we can take it back to the GitHub repository as a new feature or implementation. I want to thank Mariatta for this tip.
Her written reply:
" Thanks for your interest in improving Python. However the devguide is not the right place to discuss the proposed idea. From the sound of it, perhaps distutils-sig? " Here I am...
The reference at the end is a link about it; the issue was closed in the python devguide and peps repositories. I want more clarification about my thinking. I like to create and think. My little visual projects were possible thanks to what I found and researched on Stack Overflow, GitHub, and Codepen.io.
I'm also trying to translate Arduino into Portuguese:
the Brasilino library ... ^^
And I'm adapting or suggesting words that stay close to English. Portuguese uses accented characters and English does not; for example, the words 'Se não' ('if not') correspond to 'else'. So I'm reading dictionaries to find words close to English, without accents.
See https://github.com/OtacilioN/Brasilino, in particular the 'issues' section ^^
Sorry if I missed something obvious, but I have the following problem
while using setuptools.setup for packaging:
1. My LICENSE.md file is in the repository root, as is setup.py
(because that is customary, and required so it shows up e.g. on GitHub)
2. My package is inside src/wobblui/ because I like having it cleanly
separate in a subfolder
3. Some of the vendored code included requires displaying a
notice (BSD/MIT license), and since this is a UI library, I would like to offer
a function that returns the text to display, to make it easier for people
to implement that. This function should be part of the package, and it
obviously REQUIRES the contents of LICENSE.md
4. To complicate things, I have no second copy of LICENSE.md inside
src/wobblui/, because then I would have to keep two files in sync
This has left me with a situation where I am really unsure what to do:
I need LICENSE.md installed into site-packages inside the module folder
to make this work (and it's not a huge file so that seemed like a
reasonable solution), so I tried to use ../ in a package_data directive
since the source file is outside of src/wobblui/. However, that seems to
be completely ignored. data_files works, but of course is relative to a
way more upwards location and I have no clue how to specify the exact
path where the module/package actually was installed to, and it doesn't
seem to be INTENDED for this purpose either.
So how exactly am I supposed to package a file that is outside of the
package folder like that, but should still be installed into the package directory?
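(One common workaround, sketched below with placeholder metadata and untested against this exact layout: copy the file into the package from setup.py before calling setup(), so a single LICENSE.md stays in version control while package_data ships it inside the package.)

```python
# Relevant pieces of a setup.py for the src/wobblui/ layout above.
# The copy runs whenever setup.py executes, so sdists and wheels both
# ship src/wobblui/LICENSE.md without a second copy kept in git.
import os
import shutil

PKG_LICENSE = os.path.join("src", "wobblui", "LICENSE.md")

if os.path.exists("LICENSE.md"):  # guard so the copy is skipped elsewhere
    os.makedirs(os.path.dirname(PKG_LICENSE), exist_ok=True)
    shutil.copyfile("LICENSE.md", PKG_LICENSE)

SETUP_KWARGS = dict(
    name="wobblui",
    package_dir={"": "src"},
    packages=["wobblui"],
    package_data={"wobblui": ["LICENSE.md"]},  # now inside the package
)
# ...then pass these through: setuptools.setup(**SETUP_KWARGS)
```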
> -----Original Message-----
> From: Chad Smith <chadsmith27(a)gmail.com>
> Sent: Friday, March 1, 2019 1:33 PM
> To: Alex Walters <tritium-list(a)sdamon.com>
> Subject: Re: [Distutils] Re: PEP-582 concerns
> I was not aware that discuss.python.org was
> being used now. I started a thread there on PEP-582:
> Would you mind reposting your response there?
Yes, I actually would greatly mind posting on discuss.python.org; that site is itself contentious. Discussion mediums have been discussed to death, and I won't rehash that here (nor do I want to open the issue back up), but I posted on the list for a reason.
> On Fri, Mar 1, 2019 at 2:30 AM Alex Walters <tritium-list(a)sdamon.com> wrote:
> I kind of feel that "third party tool can/will use this feature" is
> orthogonal to "how the interpreter behaves out of the box" - unless I
> misunderstand and you are suggesting python grow support for
> entrypoints from the python executable.
> > -----Original Message-----
> > From: chadsmith27(a)gmail.com
> > Sent: Thursday, February 28, 2019 12:42 PM
> > To: distutils-sig(a)python.org
> > Subject: [Distutils] Re: PEP-582 concerns
> > Running entrypoints/executables in the bin directory has been solved
> > by the node community with npx:
> > > Executes <command> either from a local node_modules/.bin, or
> > > from a central cache, installing any packages needed in order
> > > for <command> to run.
> > I built similar support into pipx as well, so `pipx run ENTRYPOINT` uses
> > the appropriate `__pypackages__` path for the bin dir and the
> > This is not the only way to solve it, but it seems to be working for
> > Indeed, massively popular projects like `create-react-app`
> > (https://github.com/facebook/create-react-app#quick-overview) use
> > npx as their sole installation instructions.
> The Maintainers Summit will take place on the morning of Saturday, May 4th, the first day of PyCon proper. A part of PyCon’s hatchery program, the Summit is seeking to build a community of practice for project maintainers and key contributors. We seek to help the Python community sustain and grow healthy projects and communities.
> Activities will include talks and mini unconference around technology, community, resourcing, and more as it relates to package maintenance.
The call for proposals for lightning talks https://www.papercall.io/pycon-maintainers-summit is open until March 15th. But even if you don't want to propose a talk, I suggest you keep the Summit on your radar -- one of the topic suggestions is:
> What tools and techniques have helped you maintain your Python project, and what challenges have yet to be addressed? Potential topics: CI/CD tooling, package layout, setuptools, PyPI, licenses compatibility, process and release tooling
And then we could take some of those topics into the sprints.
Regarding fork versioning, I found this thread:
Now, this is relating to forks that incorporate small changes, like bug fixes, and that are meant to be integrated upstream.
When the fork implies greater changes, and is meant to be independent from the original package and have a different name, how should versioning work?
From the following discussion, I get that probably the best method is to start as a fully independent distribution, for instance from version 1.0.
So, let’s assume I fork a package which is on version 2.1 and change its name.
Would my distribution version be 1.0 or 2.2/3.0? Or none of the options?
I have 2 main concerns about PEP 582 that might just be me misunderstanding it.
My first concern is the use of CWD, and prepending ./__pypackages__ for
scripts. For example, say you are in a directory with a __pypackages__
subdirectory, and have installed the module "super.important.module" there. My
understanding is that any script you run will have "super.important.module"
available to it before the system site-packages directory. Say you also run
"/usr/bin/an_apt-ly_named_python_script" that uses "super.important.module"
(and there is no __pypackages__ subdirectory in /usr/bin). You would be
running that system script against the local __pypackages__ copy of the module.
In this case, this adds no more version isolation than "pip install --user",
and adds to the confoundment factor for a new user. If this is a
misunderstanding of the pep (which it very well might be!), then ignore that
concern. If it's not a misunderstanding, I think that should be emphasized
in the docs, and perhaps the pep.
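(If I've got the lookup order right, the effect is the same as manually prepending a directory to sys.path, as in this sketch; the module name and directory are made up:)

```python
# Demonstrate path shadowing: an entry prepended to sys.path wins over
# everything after it, which is the effect a ./__pypackages__ directory
# would have for whatever you run from that CWD.
import os
import sys
import tempfile

local = tempfile.mkdtemp()  # stand-in for ./__pypackages__/<ver>/lib
with open(os.path.join(local, "super_important_module.py"), "w") as f:
    f.write("ORIGIN = 'local __pypackages__'\n")

sys.path.insert(0, local)  # what the PEP's startup logic would effectively do
import super_important_module

print(super_important_module.ORIGIN)
```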
My second concern is a little more... political.
This pep does not attempt to cover all the use-cases of virtualenvs - which
is understandable. However, this also means that we have to teach new users
*both* mechanisms right away in order to get them up and running: the
complexities of each, and when to use one over the other. Instead of making
it easier for the new user, this pep makes it harder. This also couldn't
have come at a worse time, with the growing use of pipenv, which provides a
fully third way of thinking about application dependencies (yes, pipenv uses
virtualenvs under the hood, but it is a functionally different theory of
operation from a user standpoint compared to traditional pip/virtualenv or this pep).
Is it really a good idea to do this pep at this time?
In a vacuum, I like this pep. Aside from the (possible) issue of unexpected
shadowing, it's clean and straightforward. It's easy to teach. But it
doesn't exist in a vacuum: we have to teach the methods it is intended
to simplify anyway, and it exists in competition with other solutions.
I am not a professional teacher; I don't run python training courses. I do,
however, volunteer quite a bit of time on the freenode channel. I get that
the audience there is self-selecting to those who want to donate their time,
and those who are having a problem (sometimes, those are the same people).
This is the kind of thing that generates a lot of confusion and frustration
for the new users I interact with there.