Notes from python core sprint on workflow tooling
Now that the basic wheels/pip/PyPI infrastructure is mostly functional, there's been a lot of interest in improving higher-level project workflow. We have a lot of powerful tools for this – virtualenv, pyenv, conda, tox, pipenv, poetry, ... – and more in development, like PEP 582 [1], which adds support for project-local package directories (`__pypackages__/`) directly to the interpreter.

But to me it feels like right now, Python workflow tools are like the blind men and the elephant [2]. Each group sees one part of the problem, and so we end up with one set of people building legs, another a trunk, a third some ears... and there's no overall plan for how they can fit together.

For example, PEP 582 is trying to solve the problem that virtualenv is really hard to use for beginners just starting out [3]. This is a serious problem! But I don't want a solution that *only* works for beginners starting out, so that once they get a little more sophisticated they have to throw it out and learn something new from scratch.

So I think now might be the time for a bit of top-down design. **I want a picture of the elephant.** If we had that, maybe we could see how all these different ideas could be put together into a coherent whole. So at the Python core sprint a few weeks ago, I dragged some interested parties [4] into a room with a whiteboard [5], and we made a start at it. And now I'm writing it up to share with you all.

This is very much a draft, intended as a seed for discussion, not a conclusion.

[1] https://www.python.org/dev/peps/pep-0582/
[2] https://en.wikipedia.org/wiki/Blind_men_and_an_elephant
[3] https://www.python.org/dev/peps/pep-0582/#motivation
[4] I won't try to list names, because I know I'll forget someone, and I don't know if everyone would agree with everything I wrote there. But thank you all!
[5] https://photos.app.goo.gl/4HfY8P3ESPNi9oLMA, including special guest appearance by Kushal's elbow

# The idealized lifecycle of a Python project

## 1. Beginner

Everyone starts out as a rank beginner. This may be the first time they have programmed at all. At this stage, users want to:

- install *one* thing to get started (e.g. python itself)
- write and run simple scripts (standalone .py files)
- run a REPL
- install and use PyPI packages like requests or numpy
- install and use tools like jupyter
- their IDE should also be able to find these packages/tools

Over time, they'll probably end up with multiple scripts, and maybe want to organize them into subdirectories. The above should all work from subdirectories.

## 2. Sharing with others

Now we have a neat little script. Or maybe we've made a pretty jupyter notebook that computes some crucial business analytics. We want to share it with our friends or coworkers. We still need the features above; and now we also care about:

- version control
- some way for our friend to reconstruct, on their computer:
  - the same PyPI packages that we were using
  - the same tools that we were using
  - the ways we invoked those tools

This last point is important: as projects grow in complexity, and are used by a wider audience, they often end up with fairly complex tool specifications that have to be shared among a team.
For example:

- to run tests: in an environment that has pytest, pytest-cov, and pytest-trio installed, and with our project working directory on PYTHONPATH, run `pytest -Werror --cov ...`
- to format code: in an environment using python 3.6 or later, that has black installed, run `black -l 79 *.py my-util-directory/*.py`

This kind of tool specification also puts us in a good position to set up CI when we reach that point.

At this point our project can grow in a few different directions.

## 3a. Deployable webapp

This adds the requirement to "deploy". I think this is mostly covered by the set-up-an-environment-to-run-a-command functionality already described? I'm not super familiar with this, but it's pipenv's core target, and pipenv doesn't have much more than that, so I assume that's about right...

## 3b. Reusable library

For this we also need to:

- Build sdists and wheels
  - Which means: pyproject.toml, and some way to invoke it
- Install our library into our environments
  - Including dependency locking (best practice is to not pin dependencies in wheel metadata, but to pin all dependencies in CI; so there needs to be some way to track those separately, but integrated enough that it's not a huge ceremony to add or change a dependency)

## 3c. Reusable standalone app

I think this is pretty much like the "Reusable library", except that it'd be nice to have better tools to build/distribute standalone applications. But if we had them, we could invoke them the same way as we invoke other build systems?

# How do existing tools/proposals fit into this picture?

pyenv, virtualenv, and conda all solve parts of the "create an environment" problem, but consider the other aspects out-of-scope.

tox solves the problem of keeping a shared record of how to run a bunch of different tools in the appropriate environments, but doesn't handle pinning or procuring appropriate python versions, and requires a separate bootstrapping step to install tox.

`__pypackages__` (if implemented) makes it very easy for beginners to use PyPI packages in their own scripts and from the REPL; in particular, it would be part of python, so it meets the "install *one* thing" criterion. But it doesn't provide any way to run tools. (There's no way to put `__pypackages__/bin` on PATH.) It doesn't allow scripts to be organized into subdirectories. (For security reasons, we can't have the python interpreter going off walking the filesystem looking for `__pypackages__/`, so the PEP specifies that `__pypackages__/` has to be in the same directory as the script that uses it.) There's no way to share your `__pypackages__` environment with a friend. So... it seems like something that people would outgrow very quickly.

pipenv and poetry are interesting. Their basic strategy is to say: there is a top-level command that acts as your entry point to performing workflow actions on a python project (`pipenv` or `poetry`, respectively). And this strategy, at least in principle, can solve the problems that `__pypackages__/` runs into. In particular, it doesn't rely on `$PATH`, so it can run tools; and because it's a dedicated project management tool, it can go looking for the project marker file.

# A fantastic elephant

So if our idealized user had an idealized tool, what would that look like?

They'll be interacting with Python through a dedicated tool, similar to pipenv or poetry. In my little fantasy here I'll call it `pyp`, because (a) I want to be neutral, (b) 6 characters is too long.
To get this tool, either they install Python (via python.org download, apt, homebrew, whatever), and the tool is automatically included. Or else, they install the tool directly, and it has the ability to install Python interpreters when needed.

Once they have the tool, they start by making a new directory for their project (this way they're ready to switch to version control later).

Then they somehow mark this directory as being a "python project root". I guess the UI would be something like `pyp new <name>` and it just does it for you, but we have to figure out what this creates on disk. We need some sort of marker file. Files that currently serve this kind of role include tox.ini, Pipfile, pyproject.toml, __pypackages__, ... But only one of these is a standard thing we're already committed to sticking with, so, pyproject.toml it is. Let's make it the marker for any python project, not just redistributable libraries. (And if we do grow up into a redistributable library, then we're already prepared.)

In the initial default configuration, there's a single default environment. You can install things with `pyp install ...` or `pyp uninstall ...`, and it tracks the requested packages in some standardized way in pyproject.toml, and also pins specific versions somewhere (could be pyproject.toml again I guess, or poetry's pyproject.lock would work too). This way when we decide to share our project later, our friends can recreate our environment on their system.

However, there's also the capability to configure multiple custom execution environments, including python version and installed packages. And the capability to configure new aliases like `pyp test` or `pyp reformat`, which run some specified command in a specified environment. (A rough sketch of what this metadata might look like appears at the end of this post.)

Since the install/locking metadata is all standardized, you can even switch between competing tools, and integrate with third-party tools like pyup.io.

For redistributable libraries, we also need some way to get the wheel metadata and the workflow metadata to play nicely together. Maybe this means that we need a standardized install-requires field in pyproject.toml, so that build backends and workflow tools have a shared source of truth?

# What's wrong with pipenv?

Since pipenv is the tool that those of us in the room were most familiar with, and that comes closest to matching this vision, we brainstormed a list of complaints about it. Some of these are more reasonable than others.

- Not ambitious enough. This is a fuzzy sort of thing, but perception matters, and it's right there in the name: it's a tool to use pip, to manage an environment. If we're reconceiving this as the grand unified entryway to all of Python, then the name starts to feel pretty weird. The whole thing where it's only intended to work for webapp-style projects would have to change.
- Uses Pipfile as a project marker instead of pyproject.toml.
- Not shipped with Python. (Obviously not pipenv's fault, but nonetheless.)
- Environments should be stored in the project directory, not off in $HOME somewhere. (Not sure what this is about, but some of the folks present were quite insistent.)
- Environments should be relocatable.
- Hardcoded to only support "default" and "dev" environments, which is insufficient.
- No mechanism for sharing prespecified commands like "run tests" or "reformat".
- Can't install Python. (There's... really no reason we *couldn't* distribute pre-built Python interpreters on PyPI? Between the python.org installers and the manylinux image, we're already building redistributable run-anywhere binaries for the most popular platforms on every Python release; we just aren't zipping them up and putting them on PyPI.)
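For concreteness, here's a rough sketch of how this workflow metadata might look in pyproject.toml. To be clear, this is pure fantasy: `pyp` doesn't exist, and every table and field name below is invented for illustration rather than taken from any standard.

```toml
# Purely hypothetical -- `pyp` and all of these table and field names
# are invented for illustration; none of this is an existing standard.

[tool.pyp.dependencies]
# Requested packages, as recorded by `pyp install ...`; the exact pinned
# versions would live elsewhere (here again, or in a lock file).
requests = "*"
numpy = "*"

[tool.pyp.environments.test]
# A custom execution environment: python version plus installed packages.
python = ">=3.6"
packages = ["pytest", "pytest-cov", "pytest-trio"]

[tool.pyp.scripts]
# Shared aliases: `pyp test` runs the given command in the given environment.
test = { environment = "test", command = "pytest -Werror --cov ..." }
reformat = { command = "black -l 79 *.py my-util-directory/*.py" }
```

The point isn't these particular names; it's that if tables like these were standardized, any competing tool could read the same metadata, which is what makes the switch-between-tools idea above plausible.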
-n

--
Nathaniel J. Smith -- https://vorpus.org
On Sun, 30 Sep 2018 at 11:48, Nathaniel Smith <njs@pobox.com> wrote:
> Now that the basic wheels/pip/PyPI infrastructure is mostly functional, there's been a lot of interest in improving higher-level project workflow.
> [...]
> This is very much a draft, intended as a seed for discussion, not a conclusion. [...]
Interesting ideas - thanks for writing them up, I wish I could have been there!

One workflow that I see a lot (and I *use* a lot myself!) is the "directory full of stuff" pattern. Describing it that way makes it sound disorganised, but it often really isn't. Essentially, it's where the user has a directory (maybe something as generic as "Work", maybe something like "Customer 1" or "Research on topic X") which contains a lot of scripts, some in Python, some not, plus support programs, data, working notes, etc. It's emphatically *not* suitable for checking into VCS, or bundling up, but it does potentially require things like dependency management.

At the moment, when using such a workflow, it's more or less essential to either use nothing but the Python stdlib, remember details like "I need to activate virtualenv X to run this script", or install dependencies into the system Python (or maybe user site). Sharing scripts like this (unless they take the "stdlib only" approach) is really difficult.

It's very easy to dismiss this sort of approach as something we don't want to (or can't) support, but it's very much the way beginners (and non-beginner part-time users) approach *any* language. And when it fails for Python, they see that as a failing of Python, not of the approach. In my workplace, this sort of approach is standard for shell scripts, batch files, SQL scripts, etc. The fact that it doesn't work for non-trivial Python scripts makes it nearly impossible for me to promote Python as an automation solution (I usually end up suggesting Groovy instead, because Java is everywhere, and JVM code can be bundled up with its dependencies into a standalone jar).

Essentially, this is your "Beginner" level (or more accurately "Sharing with others", but without infrastructure like VCS), but I think that term ignores just how far some organisations push that model - way beyond anything that an *actual* "beginner" would use. It's not impossible to argue that doing so isn't a model that we want to support, but in doing that we're pretty much abandoning the idea of Python for "adhoc scripting", and I don't think that's a good idea.

As someone very much in that situation, I really need some sort of solution. But I've tried pipenv and similar, and uniformly their assumptions about how I structure my work, and what flexibility I have in my environment, are wrong, making them useless for me.

At the moment, I'm not sure what else to add to your summary, apart from "yes, you're right". Getting the various projects in this area talking and sharing ideas/resources would be great.
> - Environments should be stored in project directory, not off in $HOME somewhere. (Not sure what this is about, but some of the folks present were quite insistent.)
For me, it's about being able to copy/relocate a project, and about housekeeping. If I rename my project directory, I'd rather not have to run a rebuild step (no matter how easy that is). Also, I'd rather not have to remember that I once had a directory called "secret_project" and hunt out and remove the clutter in $HOME that's linked to it, now that I've officially named the project "pip2" <wink>. It's not fundamental to "having stuff in $HOME" so much as how pipenv does (or doesn't) maintain a record of where all the bits that make up a "project" are kept. At least, that's how it looks to me (and I'd have been insistent if I were present).

The problem with high-level management tools for workflow (and especially opinionated ones) is that unless you're very careful to survey people's requirements and specify your scope, you're always going to end up with people who need to do certain things *not* being served by your tool. So it's almost impossible to be "the one official tool".

Paul
On Sun, Sep 30, 2018 at 4:30 AM Paul Moore <p.f.moore@gmail.com> wrote:
> On Sun, 30 Sep 2018 at 11:48, Nathaniel Smith <njs@pobox.com> wrote:
> > Now that the basic wheels/pip/PyPI infrastructure is mostly functional, there's been a lot of interest in improving higher-level project workflow.
> > [...]
> > This is very much a draft, intended as a seed for discussion, not a conclusion. [...]
> The problem with high-level management tools for workflow (and especially opinionated ones) is that unless you're very careful to survey people's requirements and specify your scope, you're always going to end up with people who need to do certain things *not* being served by your tool. So it's almost impossible to be "the one official tool".
Nathaniel, thanks for starting this discussion. I like how you're stepping back and questioning old assumptions, etc.

I share Paul's concern a bit re: "one tool." As soon as a hypothetical tool is released, it becomes saddled with backwards compatibility guarantees, which prevents things from being fixed as you learn more. This is related to Guido's(?) saying about how putting something in the standard library is like putting one of its feet in the grave (the elephant's foot?).

Some questions related to your ideas: is the "elephant" one tool, or more abstractly one set of specifications, or simply a recommended workflow (e.g. for the 80%)? I think it would be good if a tool and / or specifications are flexible enough so they can be adapted to use cases that we might not have written down. Or is the tool explicitly not trying to be useful in all use cases?

Before, I thought PyPA's approach was to slowly create more standards (e.g. a standard for the leg, the trunk, etc.) which would let others create and innovate a multitude of tools, as opposed to a top-down approach of thinking of one tool first. Is that standards approach not working out, or is this just something to start doing in parallel to supplement that?

To give you an idea, here's one example of a trickier workflow / use case I've found. Say you're developing locally an application that you run inside a number of Docker containers (even when developing), and you want to be able to sync your code changes in realtime into Docker while the application is running. Also, you might get most of your dependencies from PyPI, but occasionally you also want to swap in forks of dependencies, that you can similarly edit while developing (e.g. like editable installs).

It can be challenging to get stuff like this working if the tools you're using make too many directory or workflow assumptions. However, a very powerful or flexible tool (e.g. Git), or a collection of several tools that each does one thing well, can often work well in unanticipated situations. (However, neither of those options strikes me as being friendly to beginners, which might be the primary thing you're trying to solve -- I'm not sure.)

--Chris
On Sun, 30 Sep 2018 at 13:26, Chris Jerdonek <chris.jerdonek@gmail.com> wrote:
> It can be challenging to get stuff like this working if the tools you're using make too many directory or workflow assumptions. However, a very powerful or flexible tool (e.g. Git), or a collection of several tools that each does one thing well, can often work well in unanticipated situations. (However, neither of those options strikes me as being friendly to beginners, which might be the primary thing you're trying to solve -- I'm not sure.)
My attention span is collapsing to nothing right now, so I'll just comment on this one small point:

The question of one unified tool vs a toolkit of capabilities (whether mini-tools like Unix commands, or subcommands of a big tool like git) is an important one. Like you say, the toolkit approach is fundamentally more flexible, but not beginner-friendly.

Personally, I think that the toolkit approach (standards, interop, low level support) is where distutils-sig and PyPA work best. Higher level unifications ("one tool to rule them all") have historically been much less successful. I'm not sure PyPA should be defining best practices like workflows, or promoting tools tied to particular workflows. But I'm open to persuasion - if something sufficiently flexible (i.e., that satisfies *my* needs, on a selfish level ;-)) gains wide community support, then I'm fine with that.

Paul
On Sun, Sep 30, 2018, at 2:35 PM, Paul Moore wrote:
> Personally, I think that the toolkit approach (standards, interop, low level support) is where distutils-sig and PyPA work best. Higher level unifications ("one tool to rule them all") have historically been much less successful.
I suspect that 'one tool' might be beyond our grasp, but I don't think Nathaniel is actually proposing that we try to make one tool. Thinking about what 'one tool' might look like might help us clarify where the existing tools overlap, have gaps, or don't fit well together.

Another way to approach this might be to consider what tools exist in other languages, and what people do and don't like about them. I have used project management tools for Rust (cargo), Ruby (bundler) and Javascript (npm/bower), albeit only a little in each case. All of those tools default to putting dependencies somewhere within your project directory. Maybe that's fundamentally better (although it's also possible that people with experience of those tools learn to expect that even if there are good reasons to do something different ;-).

I don't have many thoughts at the moment, but I'll turn this around in my head a bit.

Thomas
I read and mostly agree with Chris and Paul, as we operate in similar spaces and probably have similar experiences with trying to unify packaging related tooling (it's hard, and we are all currently trying to undo this). Without getting too focused on the details, despite the technical and implementation challenges, I _also_ am super interested in the user experience (this is why I am working on pipenv in the first place).

Pipenv does have a number of issues currently; many of them are technical, and a lot of them are being worked through (mostly by breaking things apart into smaller libraries that are not actually part of pipenv). Since you mentioned it specifically and there were a few minor misconceptions, I just figured I'll speak to those for now:
> Not ambitious enough. This is a fuzzy sort of thing, but perception matters, and it's right there in the name: it's a tool to use pip, to manage an environment. If we're reconceiving this as the grand unified entryway to all of Python, then the name starts to feel pretty weird.
Well, not sure why the name is weird: pip (current packaging tooling) + env (environments) = pipenv. It doesn't build libraries for you, and we've been pretty adamant about that point because we feel it's really not a good design to intermingle the construction of apps with the construction of libraries.

I'm all for making users' lives better; most users don't have a starting experience of wanting to package something up and ship it to pypi, but nearly all users make environments and install packages all the time. It seems like it might be a mistake to start people off by confusing them about which things are applications and which things are libraries, how they work, etc.
> Uses Pipfile as a project marker instead of pyproject.toml.
See above. pyproject.toml wasn't standardized yet when pipenv was released (and still isn't, beyond being a file that could exist and store information). Pipfile was intended to replace requirements.txt per some previous thread on the topic, and pipenv was an experimental implementation of the separation between the two different ways that people currently use requirements.txt in the wild -- one as a kind of abstract, unpinned dependency list (Pipfile), and the other as a transitive closure (Pipfile.lock). Since neither is standardized _for applications_, I'm not totally sure this is an actual sticking point. In either case, this seems super minor...
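For readers unfamiliar with that split, a minimal Pipfile (the abstract, unpinned half) might look something like the sketch below; the package names are just placeholders, and the resolved transitive closure with exact versions and hashes lives in the separate JSON Pipfile.lock.

```toml
# Illustrative minimal Pipfile; package names are placeholders.
# This file holds the abstract, unpinned dependency list -- the resolved
# transitive closure (exact versions and hashes) goes in Pipfile.lock.
[[source]]
name = "pypi"
url = "https://pypi.org/simple"
verify_ssl = true

[packages]
requests = "*"

[dev-packages]
pytest = "*"
```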
> Not shipped with Python. (Obviously not pipenv's fault, but nonetheless.)

(Not sure it should be)
> Environments should be stored in project directory, not off in $HOME somewhere. (Not sure what this is about, but some of the folks present were quite insistent.)
They used to be stored by default in $PROJECT/.venv, but user feedback led us to use $WORKON_HOME by default. This is configurable via an environment variable ($PIPENV_VENV_IN_PROJECT), and also kicks in if you simply have a virtualenv in the .venv folder in your project directory.
> Environments should be relocatable.
And that will be possible whenever we can use venv across platforms and python versions. Currently that isn't possible, and we are forced to use virtualenv for compatibility.
> Hardcoded to only support "default" and "dev" environments, which is insufficient.
Why? I mean, if you are planning to replace setuptools / flit / other build systems with pipenv and use Pipfile as your new specification for declaring extras, I guess, but that's not how it's designed currently. Beyond that, I think we need some actual information on this one -- adding more complexity to any tool (including this kind of complexity) is going to ask more of the user in terms of frontloaded knowledge. This constraint limits the space a bit, and for applications I've very rarely seen actual limitations of this setup (but I am interested; we have obviously had this feedback before, but are not eager to add knobs in this specific area).
> No mechanism for sharing prespecified commands like "run tests" or "reformat".
There is, but the documentation on the topic is not very thorough:
https://pipenv.readthedocs.io/en/latest/advanced/#custom-script-shortcuts

See also: https://github.com/sarugaku/requirementslib/blob/master/Pipfile#L26

For an example for the specific cases you mentioned, the Pipfile entry in that project looks like this:

[scripts]
black = 'black src/requirementslib/ --exclude "/(\.git|\.hg|\.mypy_cache|\.tox|\.venv|_build|buck-out|build|dist)/"'
tests = "pytest -v --ignore=src/requirementslib/_vendor/ tests"
> Can't install Python. (There's... really no reason we *couldn't* distribute pre-built Python interpreters on PyPI? between the python.org installers and the manylinux image, we're already building redistributable run-anywhere binaries for the most popular platforms on every Python release; we just aren't zipping them up and putting them on PyPI.)
Erm, well, actually you can install python currently via pyenv on linux, and Tzu-ping is the maintainer of a pyenv clone on windows, which we've just never really got around to integrating fully. I've spoken to some of the folks over at Anaconda and I know they are interested in this as well, especially given that it's pretty straightforward. It hasn't been a primary focus lately, but the tooling does exist (I don't think I've ever used it personally though).

Anyway, this is all a good discussion to have and I really appreciate you kicking it off. I've been following the __pypackages__ conversation a bit since pycon and I honestly don't have much opinion about where we want to put stuff, but I'm not sure that the impact of the folder is going to be as great to the user as people might imagine -- the tooling is already being built, so maybe it's just a matter of agreeing on that as the place to put stuff, which schema to follow, and honestly working with some new users. I do this quite a bit, but I haven't done any formal information gathering. Anecdotally I'll always tell you I'm right, but if we had some user data on specific pain points / usability issues I'd definitely be prepared to change my mind.

Dan Ryan
gh: @techalchemy // e: dan@danryan.co
On 01/10, 2018, at 00:47, Dan Ryan <dan@danryan.co> wrote:
> > Can't install Python. (There's... really no reason we *couldn't* distribute pre-built Python interpreters on PyPI? between the python.org installers and the manylinux image, we're already building redistributable run-anywhere binaries for the most popular platforms on every Python release; we just aren't zipping them up and putting them on PyPI.)
> Erm, well actually you can install python currently via pyenv on linux, and Tzu-ping is the maintainer of a pyenv-clone on windows which we've just never really got around to integrating fully. I've spoken to some of the folks over at Anaconda and I know they are interested in this as well especially given that it's pretty straightforward. It hasn't been a primary focus lately, but the tooling does exist (I don't think I've ever used it personally though)
Regarding this specifically: my project is not actually a pyenv clone, since it is next to impossible to automate compilation of old Python versions on Windows. My project only automates the download, installation, and configuration of binary releases from python.org. pyenv always compiling from source makes it quite flexible, but for an official tool, only automating downloads from python.org is likely the better approach.

Official binary distributions are vastly underused, and this IMO has long produced fragmentation in the community. Almost all platform-specific Python distributors introduce their own quirks (Homebrew breaks all your virtual environments every time you upgrade Python, and don’t get me started with Debian). They feel “broken” when people hit specific use cases, and people blame Python when that happens for not “fixing” it.

A standard (official?), automated runtime management tool a la rustup would help greatly with this situation, so we don’t need to constantly answer questions with the question “how did you install Python” and follow up with “oh, that’s broken, but it’s not our fault, don’t use it”. This is probably out of the scope of distutils-sig though.

TP
On 01/10, 2018, at 00:47, Dan Ryan <dan@danryan.co> wrote:
> > Uses Pipfile as a project marker instead of pyproject.toml.
> See above. pyproject.toml wasn't standardized yet when pipenv was released (and still isn't, beyond that it is a file that could exist and store information). Pipfile was intended to replace requirements.txt per some previous thread on the topic, and pipenv was an experimental implementation of the separation between the two different ways that people currently use requirements.txt in the wild -- one as a kind of abstract, unpinned dependency list (Pipfile), and the other as a transitive closure (Pipfile.lock). Since neither is standardized _for applications_, I'm not totally sure this is an actual sticking point.
> In either case, this seems super minor…
I feel this would need to be extensively discussed either way before the community can jump into a decision. The discussion I’ve seen has been quite split on whether we should use one file or the other, but with nothing very convincing explaining why, outside of “one file is better than two”.

To me, there are two main things one would want to specify dependencies for: a thing you want to import (as a Python library from another Python source file), and a thing you want to run (as a command, a webapp, etc.). These setups require inherently different ways to specify dependencies; sometimes they match, but not always, and a tool would (eventually) need to provide a solution for when they diverge (i.e. when a project needs to be both importable and runnable, and has different dependency sets for them).

Of course, the solution to this may as well be to always use pyproject.toml, and to design it to fit both scenarios (a sketch of what that might look like follows at the end of this message). In this case, however, the fields need to be designed carefully to make sure all areas are taken care of. NPM, for example, requires you to specify a (package) name for your project, even if it never needs it (e.g. a website backend), which to me is a sign of designing too much toward package distribution, not standalone runnables. Other tools using a one-file-of-truth configuration have a similar problem to a degree, as far as I can tell. Another example would be Rust’s Cargo, which cannot specify a binary-only dependency.

A dual-file configuration (e.g. Pipfile and pyproject.toml) would essentially punt this design decision; it is probably not the purest solution, but it at least keeps things open enough that they don’t collide. Maybe another “solution” would be to have multiple (two?) files for each use case, but have a tool for syncing them automatically if desired. This is how Bundler works if you use it to package a Ruby Gem, but I am not sure if that is by design or an artefact of circumstances (the Gem specification predates Bundler, much as Python packaging predates the current all-in-one project management tools).

I don’t have much to provide at the current time regarding what the best design should look like, but I want to voice my concerns before it is too late.

TP
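To make the import-vs-run distinction concrete, here is one way a single pyproject.toml could try to carry both roles. Every table and field name below is invented purely for illustration; none of this is taken from any existing standard.

```toml
# Invented names, for illustration only -- no standard defines these tables.

# The "thing you want to import": abstract, unpinned library dependencies
# that would end up in the wheel metadata of a distributable package.
[package]
name = "mylib"  # only meaningful for the importable role
install-requires = ["requests >= 2.0"]

# The "thing you want to run": application dependencies, pinned elsewhere
# in a lock file, possibly including tools that are never imported.
[application]
requires = ["requests", "gunicorn"]
```

The friction appears exactly where the two sets diverge, e.g. the library keeps a loose `requests >= 2.0` while the deployed application pins an exact version (or a fork) in its lock file.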
Not shipped with Python. (Obviously not pipenv's fault, but nonetheless.) (Not sure it should be)
Environments should be stored in project directory, not off in $HOME somewhere. (Not sure what this is about, but some of the folks present were quite insistent.)
They used to be by default stored in $PROJECT/.venv but user feedback led us to use $WORKON_HOME by default. This is configurable by environment variable ($PIPENV_VENV_IN_PROJECT) or if you simply have a virtualenv in the .venv folder in your project directory.
Environments should be relocatable.
And that will be possible whenever we can use venv across platforms and python versions. Currently that isn't possible, and we are forced to use virtualenv for compatibility.
Hardcoded to only support "default" and "dev" environments, which is insufficient.
Why? I mean, if you are planning to replace setuptools / flit / other build systems with pipenv and use pipfile as your new specification for declaring extras, I guess, but that's not how it's designed currently. Beyond that, I think we need some actual information on this one -- adding more complexity to any tool (including this kind of complexity) is going to ask more of the user in terms of frontloaded knowledge. This constraint limits the space a bit and for applications, I've very rarely seen actual limitations of this setup (but am interested, we have obviously had this feedback before but are not eager to add knobs in this specific area).
No mechanism for sharing prespecified commands like "run tests" or "reformat".
There is, but the documentation on the topic is not very thorough: https://pipenv.readthedocs.io/en/latest/advanced/#custom-script-shortcuts See also: https://github.com/sarugaku/requirementslib/blob/master/Pipfile#L26
For an example for the specific cases you mentioned, the Pipfile entry in that project looks like this: [scripts] black = 'black src/requirementslib/ --exclude "/(\.git|\.hg|\.mypy_cache|\.tox|\.venv|_build|buck-out|build|dist)/"' tests = "pytest -v --ignore=src/requirementslib/_vendor/ tests"
Can't install Python. (There's... really no reason we *couldn't* distribute pre-built Python interpreters on PyPI? between the python.org installers and the manylinux image, we're already building redistributable run-anywhere binaries for the most popular platforms on every Python release; we just aren't zipping them up and putting them on PyPI.)
Erm, well actually you can install python currently via pyenv on linux, and Tzu-ping is the maintainer of a pyenv-clone on windows which we've just never really got around to integrating fully. I've spoken to some of the folks over at Anaconda and I know they are interested in this as well especially given that it's pretty straightforward. It hasn't been a primary focus lately, but the tooling does exist (I don't think I've ever used it personally though)
Anyway, this is all a good discussion to have and I really appreciate you kicking it off. I've been following the __pypackages__ conversation a bit since pycon and I honestly don't have much opinion about where we want to put stuff, but I'm not sure that the impact of the folder is going to be as great to the user as people might imagine -- the tooling is already being built, so maybe it's just a matter of agreeing on that as the place to put stuff, which schema to follow, and honestly working with some new users. I do this quite a bit but I haven't done any formal information gathering. Anecdotally I'll always tell you I'm right, but if we had some user data on specific pain points / usability issues I'd definitely be prepared to change my mind.
Dan Ryan gh: @techalchemy // e: dan@danryan.co
On Sun, 30 Sep 2018 at 20:50, Tzu-ping Chung <uranusjr@gmail.com> wrote:
> On 01/10, 2018, at 00:47, Dan Ryan <dan@danryan.co> wrote:
> > > Uses Pipfile as a project marker instead of pyproject.toml.
> > See above. pyproject.toml wasn't standardized yet when pipenv was released (and still isn't, beyond that it is a file that could exist and store information). Pipfile was intended to replace requirements.txt per some previous thread on the topic, and pipenv was an experimental implementation of the separation between the two different ways that people currently use requirements.txt in the wild -- one as a kind of abstract, unpinned dependency list (Pipfile), and the other as a transitive closure (Pipfile.lock). Since neither is standardized _for applications_, I'm not totally sure this is an actual sticking point.
> > In either case, this seems super minor…
> I feel this would need to be extensively discussed either way before the community can jump into a decision. The discussion I’ve seen has been quite split on whether we should use one file or the other, but with little explanation of why beyond “one file is better than two”.
This discussion seems to have diverted into being about pipenv. Can I ask that the pipenv-specific discussions be split out into a different thread? (For example, I'm not clear if Tzu-Ping's comment here is specific to pipenv or not.)

My main reason is that (as I noted in my reply to Nathaniel's post) my use cases are, as far as I can tell, *not* suitable for pipenv as it's currently targeted (I'm willing to be informed otherwise, but please, can we do it on another thread or off-list if it's not generally useful). And I'd rather that we kept the central discussion tool-agnostic until we come to some view on what tools we'd expect to be suggesting to users in the various categories we end up identifying.

Thanks,
Paul
I didn’t intend my comments to be specific to Pipenv; they are about Pipfile being cited as a reason why Pipenv is not suitable. Whether different kinds of projects should share one configuration file is an important but under-addressed design decision, and that decision has not yet been made. Treating “uses Pipfile as a project marker instead of pyproject.toml” as a complaint presupposes a particular answer, and would risk skipping this discussion IMO.

TP
Pipfile is not pipenv, and the original thread specifically discussed the pipenv implementation of the identified needs -- since pipenv is in wide use, even if you personally don't like or use it, it seemed helpful to discuss the limitations.

Tzu-ping went ahead and expanded the discussion about the distinction between application and library usage, which actually _IS_ central to the entire conversation, and barely mentioned pipenv at all in his discussion about the tradeoffs between various approaches to specifying dependencies as a user.

Since pipenv has actually implemented one of these approaches -- one which specifically targets the distinction, effectively drawing a line between libraries and applications -- it is particularly relevant to the conversation about the "new user experience": people who are sitting down for the first time and trying to set up a virtual environment, install dependencies, install python, etc. If you keep rushing to tell us to go away and talk about things off list rather than trying to understand the relevance of what we're trying to talk about, we will never have a productive conversation.

I get that your attention is split; mine is too. But we are going to have to talk about specific tools in order to evaluate the tradeoffs they make, and you may need to accept that even though you personally have taken an active position of trying to make us leave you alone, many people use pipenv in the python community and it may actually be a good starting point for discussing this kind of a problem.

Given that we are talking 'one tool vs many tools', it seems like a good idea to look at how other languages handle these problems, including (and probably starting with) what is possibly the core decision that would need to be made before you could even start standardizing: do you want libraries and applications managed by the same workflow, or not? Is that not a conversation that we want to have? If not, what conversation topics are we allowed to address in this discussion?

Dan Ryan
gh: @techalchemy // e: dan@danryan.co
In reading this discussion, I feel like a cool picture would be a Venn diagram of several of the common tools out there, with dots (or some other type of regions) to represent the various use cases they do or don't support.

--Chris
On Sun, 30 Sep 2018 at 22:17, Chris Jerdonek <chris.jerdonek@gmail.com> wrote:
> In reading this discussion, I feel like a cool picture would be a Venn diagram of several of the common tools out there, with dots (or some other type of regions) to represent the various use cases they do or don't support.
Yeah, that would be useful. Picture, 1000 words and all that :-)

Paul
Hi,

I'm sorry, but I'm not going to respond to this message. For some time now I've been considering taking a break from open source mailing lists, as I'm finding that the frustration involved in dealing with some of the more confrontational threads (until now, mostly on other lists than this one) has been affecting my personal life. So as of now, I'm on a break for the period of October.

One final thing I will say is that I find comments like "rushing to tell us to go away and talk about things off list rather than trying to understand the relevance of what we're trying to talk about" and "you personally have taken an active position of trying to make us leave you alone" a startling and pretty aggressive misinterpretation of what I was trying to say. I'm going to assume your comments were in good faith, and that I simply didn't explain myself well enough, or you misunderstood what I was saying; but I will say that whatever the reason, these comments were fairly key in convincing me that I don't have to take this sort of thing in what's supposed to be a fun hobby activity.

I hope the discussion goes well, and I'll check back in in November to see where it has led.

Paul
Sorry for causing you additional frustration. I was also frustrated, and underscoring a sense that seems to come through in most of our interactions, for a number of reasons. It isn’t my intention to cause you additional grief, so I apologize for fanning the flames.

Dan Ryan // pipenv maintainer
gh: @techalchemy
[Splitting off a new thread for this question even if it might not result in a discussion]

On Sun, Sep 30, 2018 at 10:00 AM Dan Ryan <dan@danryan.co> wrote:
> Anyway, this is all a good discussion to have and I really appreciate you kicking it off. I've been following the __pypackages__ conversation a bit since pycon and I honestly don't have much opinion about where we want to put stuff, but I'm not sure that the impact of the folder is going to be as great to the user as people might imagine
Where is this conversation happening, by the way? I'm surprised I didn't know about it until Nathaniel mentioned it when he started his thread -- since I'm on a bunch of lists (python-dev, Distutils-SIG, etc.).

--Chris
On Sun, Sep 30, 2018 at 2:25 PM, Chris Jerdonek <chris.jerdonek@gmail.com> wrote:
> Where is this conversation happening, by the way?
It hasn't been formally presented to any lists yet, but the initial informal discussion is here: https://github.com/kushaldas/peps/pull/1 (I guess this is OK to share, since it's also linked from here: https://github.com/python/peps/pull/776)

-n

-- Nathaniel J. Smith -- https://vorpus.org
Just curious: Have we directly engaged the author of Poetry <https://github.com/sdispater/poetry> to see if he is interested in participating in these discussions?

I ask partly just as an interested observer, partly because I see that Pipenv tends to dominate these discussions, and partly because I find Poetry more appealing than Pipenv <https://github.com/sdispater/poetry#what-about-pipenv> and -- not being a packaging expert -- I want to see it discussed in more depth by the experts here.

Nick
I can’t speak for others (also not really sure what “we” should include here…), but I have had a couple of interactions with the author on Twitter. I can’t recall whether I invited him to join distutils-sig specifically, but I would understand if he was reluctant to do so even if I did. The mailing list can be a bit intimidating unless you have a good topic to join on, especially for someone without an English-speaking background (I am speaking from experience here).

Overall I could see it being a good idea to invite him to join the mailing list, and/or provide input on this particular discussion. Would you be interested in doing this?

TP
On Sun, Sep 30, 2018 at 2:17 PM Tzu-ping Chung <uranusjr@gmail.com> wrote:
> Would you be interested in doing this?
Sure, I'll ping him and point to this thread and see if he is interested in participating.

Nick
participants (7)
- Chris Jerdonek
- Dan Ryan
- Nathaniel Smith
- Nicholas Chammas
- Paul Moore
- Thomas Kluyver
- Tzu-ping Chung