
Pip and venv have done a lot to improve the accessibility and ease of installing Python packages, but I believe there is still a lot of room for improvement. I only realised how cumbersome I find working with Python packages when I recently spent a lot of time on a JavaScript project using npm. A bit of googling turned up several articles discussing pip, venv and npm, and all of them seemed to say the same thing, i.e. that pip/venv could learn a lot from npm. My proposal revolves around two issues:

1. Setting up and working with virtual environments can be onerous. Creating one is easy enough, but using them means remembering to run `source activate` every time, which also means remembering which venv is used for which project. Not a major issue, but still an annoyance.

2. Managing lists of required packages is not nearly as easy as in npm, since there is no equivalent to `npm install --save ...`. The best that pip offers is `pip freeze`. However, using that is a) an extra step to remember and b) includes all implied dependencies, which is not ideal.

My proposal is to use a similar model to npm, where each project has a `venvrc` file which lets Python-related tools know which environment to use. In order to showcase the sort of functionality I'm proposing, I've created a basic example on github (https://github.com/aquavitae/pyle). This is currently py3.4 on linux only and very pre-alpha. Once I've added a few more features that I have in mind (e.g. multiple venvs) I'll add it to pypi, and if there is sufficient interest I'd be happy to write up a PEP for getting it into the stdlib.

Does this seem like the sort of tool that would be useful in the stdlib?

Regards
David
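To illustrate the kind of wrapper being proposed, here is a minimal sketch, assuming a one-line `venvrc` file in the project root that names the venv directory. The file name, format, and the helper itself are hypothetical; the pyle prototype may do this differently.

    # pyle_run.py -- hypothetical sketch of a venvrc-aware wrapper.
    # Assumes "venvrc" in the current directory contains the path to a venv.
    import os
    import subprocess
    import sys

    def run_in_project_env(args):
        # Read the venv location from the project's venvrc file.
        with open("venvrc") as f:
            env_dir = f.read().strip()
        # Resolve the environment's interpreter (bin/ on POSIX; a real tool
        # would use Scripts\python.exe on Windows).
        python = os.path.join(env_dir, "bin", "python")
        # Run the command with the project's interpreter -- no activation step.
        return subprocess.call([python] + args)

    if __name__ == "__main__":
        sys.exit(run_in_project_env(sys.argv[1:]))

With something like this, `python pyle_run.py -m pip install requests` would install into the project's own environment regardless of which venv, if any, is currently active.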

On May 31, 2015, at 00:16, David Townshend <aquavitae69@gmail.com> wrote:
Only if you're not using virtualenvwrapper. You do have to get used to using workon instead of cd to switch between environments--although if you want to, there's a hook you can alias cd to (virtualenvwrapperhelper). And I haven't tried either the native Windows cmd port or the PowerShell port (it works great with MSYS bash, but I realize not everyone on Windows wants to pretend they're not on Windows). And managing multiple environments with different Python versions (at least different versions of 2.x or different versions of 3.x) could be nicer. But I think it does 90% of what you're looking for, and I think it might be easier to add the other 10% to virtualenvwrapper than to start from scratch. And it works with 2.6-3.3 as well as 3.4+ (with virtualenv instead of venv, of course), on most platforms, with multiple environments, with tab completion (at least in bash and zsh), etc.

On Sun, May 31, 2015 at 9:35 AM, Andrew Barnert <abarnert@yahoo.com> wrote:
Virtualenvwrapper does help a bit, but nowhere near 90%. It doesn't touch any of the issues with pip, and it still requires configuration and manually ensuring that the venv is activated. But the biggest issue with extending it is that it has a totally different workflow philosophy, in that it enforces a separation between the venv and the project, whereas my proposal involves more integration of the two. I have used virtualenvwrapper quite a bit in the past, but in the end I've always found it easier to just work with venv because of the lack of flexibility in where and how I store the venvs.

On May 31, 2015, at 01:01, David Townshend <aquavitae69@gmail.com> wrote:
As I already mentioned, if you use virtualenvwrapperhelper or autoenv, you don't need to manually ensure that the venv is activated. I personally use it by having workon cd into the directory for me rather than the other way around, but if you prefer the other way around, you can do it that way, so that every time you cd into a directory with a venv in it, it activates.
But the biggest issue with extending it is that it has a totally different workflow philosophy in that it enforces a separation between the venv and the project,
I don't understand what you mean. I have a one-to-one mapping between venvs and projects (although you _can_ have multiple projects using the same venv, that isn't the simplest way to use it), and I have everything checked into git together, and I didn't have to do anything complicated to get there.
whereas my proposal involves more integration of the two. I have used virtualenvwrapper quite a bit in the past, but in the end I've always found it easier to just work with venv because of the lack of flexibility in where and how I store the venvs.
The default for npm is that your package dir is attached directly to the project. You can get more flexibility by setting an environment variable or creating a symlink, but normally you don't. It has about the same flexibility as virtualenvwrapper, with about the same amount of effort. So if virtualenvwrapper isn't flexible enough for you, my guess is that your take on npm won't be flexible enough either, it'll just come preconfigured for your own idiosyncratic use and everyone else will have to adjust...

+1 for this idea, David. I am using requirements.txt for managing dependencies, but the npm approach is simpler than running pip freeze, inspecting which requirements we really use, and setting up a virtualenv. If you need help with the PEP writing I can help you. On Sun, 31 May 2015 at 09:45, Andrew Barnert via Python-ideas <python-ideas@python.org> wrote:

You have a point. Maybe lack of flexibility is not actually the issue - it's too much flexibility. The problem that I have with virtualenv is that it requires quite a bit of configuration and a great deal of awareness by the user of what is going on and how things are configured, as stated on its home page. While there is nothing specifically wrong with this, I usually just want a way to do something in a venv without thinking too much about where it is or when or how to activate it. If you've had a look at the details of the sort of tool I'm proposing, it is completely transparent. Perhaps the preconfiguration is just to my own idiosyncrasies, but if it serves its use 90% of the time then maybe that is good enough.

Some of what I'm proposing could be incorporated into pip (i.e. better requirements) and some could possibly be incorporated into virtualenvwrapper (although I still think that my proposal for handling venvs is just too different from that of virtualenvwrapper to be worth pursuing that course), but one of the main aims is to merge it all into one tool that manages both the venv and the requirements.

I'm quite sure that this proposal is not going to be accepted without a trial period on pypi, so maybe that will be the test of whether this is useful. Is this the right place for this, or would distutils-sig be better?

On May 31, 2015 11:20 AM, "David Townshend" <aquavitae69@gmail.com> wrote:
The default for npm is that your package dir is attached directly to the project. You can get more flexibility by setting an environment variable or creating a symlink, but normally you don't.

I set variables in $VIRTUAL_ENV/bin/postactivate (for Python, Go, NPM, ...) [Virtualenvwrapper].

...a great deal of awareness by the user of what is going on and how things are configured, as stated on its home page.

You must set WORKON_HOME and PROJECT_HOME.

While there is nothing specifically wrong with this, I usually just want a way to do something in a venv without thinking too much about where it is or when or how to activate it. If you've had a look at the details of the sort of tool I'm proposing, it is completely transparent. Perhaps the preconfiguration is just to my own idiosyncrasies, but if it serves its use 90% of the time then maybe that is good enough.

Some of what I'm proposing could be incorporated into pip (i.e. better requirements) and some could possibly be incorporated into virtualenvwrapper (although I still think that my proposal for handling venvs is just too different from that of virtualenvwrapper to be worth pursuing that course), but one of the main aims is to merge it all into one tool that manages both the venv and the requirements.

* you can install an initial set of packages with just virtualenv (a minimal covering / only explicitly installed packages would be useful (for pruning deprecated dependencies))
* conda-env manages requirements for conda envs (conda env export)
  * http://conda.pydata.org/docs/test-drive.html#managing-environments
  * http://conda.pydata.org/docs/env-commands.html
* I've a similar script for working with virtualenv (now venv) and/or conda envs in gh:westurner/dotfiles/dotfiles/venv/ipython_config.py that sets FHS paths and more commands and aliases (like cdv for cdvirtualenv). IDK whether this would be useful for these use cases.

So:

* [ ] ENH: pip freeze --minimum-covering
* [ ] ENH: pip freeze --explicit-only
* [ ] DOC: virtualenv for NPM'ers

I'm quite sure that this proposal is not going to be accepted without a trial period on pypi, so maybe that will be the test of whether this is useful.

Is this the right place for this, or would distutils-sig be better?
PyPA: https://github.com/mitsuhiko/pipsi/issues/44#issuecomment-105961957
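A rough sketch of what the "pip freeze --explicit-only" idea from the list above could compute, using pkg_resources (part of setuptools) to keep only distributions that no other installed distribution depends on. The flag names are proposals, not existing pip options, and this script is illustrative, not part of pip.

    # freeze_explicit.py -- hedged sketch, not an existing pip feature.
    # Lists installed distributions that nothing else installed requires,
    # approximating "only the packages the user asked for".
    import pkg_resources

    dists = list(pkg_resources.working_set)
    # Collect the names of everything required by some other distribution.
    required = set()
    for dist in dists:
        for req in dist.requires():
            required.add(req.key)
    # Anything not required by another package is (probably) explicit.
    for dist in sorted(dists, key=lambda d: d.key):
        if dist.key not in required:
            print("%s==%s" % (dist.project_name, dist.version))

This is only an approximation (it cannot distinguish an explicitly installed package that another package also happens to require), which is part of why recording explicit installs at install time, as npm does, is attractive.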

On May 31, 2015, at 09:19, David Townshend <aquavitae69@gmail.com> wrote:
The default for npm is that your package dir is attached directly to the project. You can get more flexibility by setting an environment variable or creating a symlink, but normally you don't. It has about the same flexibility as virtualenvwrapper, with about the same amount of effort. So if virtualenvwrapper isn't flexible enough for you, my guess is that your take on npm won't be flexible enough either, it'll just come preconfigured for your own idiosyncratic use and everyone else will have to adjust...
You have a point. Maybe lack of flexibility is not actually the issue - it's too much flexibility.
I think Python needs that kind of flexibility, because it's used in a much wider range of use cases, from binary end-user applications to OS components to "just run this script against your system environment" to conda packages, not just web apps managed by a deployment team and other things that fall into the same model. And it needs to be backward compatible with the different ways people have come up with for handling all those models. While it's possible to rebuild all of those models around the npm model, and the node community is gradually coming up with ways of doing so (although notice that much of the node community is instead relying on docker or VMs...), you'd have to be able to transparently replace all of the current Python use cases today if you wanted to change Python today.

Also, as Nick pointed out, making things easier for the developer comes at the cost of making things harder for the user--which is acceptable when the user is the developer himself or a deployment team that sits at the next set of cubicles, but may not be acceptable when the user is someone who just wants to run a script he found online. Again, the Node community is coming to terms with this, but they haven't got to the same level as the Python community, and, even if they had, it still wouldn't work as a drop-in replacement without a lot of work.

What someone _could_ do is make it easier to set up a dev-friendly environment based on virtualenvwrapper and virtualenvwrapperhelper. Currently, you have to know what you're looking for and find a blog page somewhere that tells you how to install and configure all the tools and follow three or four steps. That's obviously less than ideal. It would be nice if there were a single "pip install envstuff" that got you ready out of the box (including working for Windows cmd and PowerShell), and if links to that were included in the basic Python docs. It would also be nice if there were a way to transfer your own custom setup to a new machine. But I don't see why that can't all be built as improvements on the existing tools (and a new package that just included requirements and configuration and no new tools).
The problem that I have with virtualenv is that it requires quite a bit of configuration and a great deal of awareness by the user of what is going on and how things are configured, as stated on its home page. While there is nothing specifically wrong with this, I usually just want a way to do something in a venv without thinking too much about where it is or when or how to activate it.
But again, if that's what you want, that's what you have with virtualenvwrapper or autoenv. You just cd into the directory (whether a new one you just created with the wrapper or an old one you just pulled from git) and it's set up for you. And setting up a new environment or cloning an existing one is just a single command, too. Sure, you can make your configuration more complicated than that, but if you don't want to, you don't have to.
If you've had a look at the details of the sort of tool I'm proposing, it is completely transparent. Perhaps the preconfiguration is just to my own idiosyncrasies, but if it serves its use 90% of the time then maybe that is good enough.
Some of what I'm proposing could be incorporated into pip (i.e. better requirements) and some could possibly be incorporated into virtualenvwrapper (although I still think that my proposal for handling venvs is just too different from that of virtualenvwrapper to be worth pursuing that course), but one of the main aims is to merge it all into one tool that manages both the venv and the requirements.
There are major advantages in not splitting the Python community between two different sets of tools. We've only recently gotten past easy_install vs. pip and distribute vs. setuptools, which has finally enabled a clean story for everyone who wants to distribute packages to get it right, which has finally started to happen (although there are people still finding and following blog posts that tell them to install distribute or not to use virtualenv because it doesn't play nice with py2app or whatever).
I'm quite sure that this proposal is not going to be accepted without a trial period on pypi, so maybe that will be the test of whether this is useful.
Is this the right place for this, or would distutils-sig be better?
Other people have made the case for both sides of that earlier in the thread and I'm not sure which one is more compelling... Also, the pure pip enhancement of coming up with something better than freeze/-r may belong on distutils-sig while the environment-aware launcher and/or environment-managing tools may belong here. (Notice that Python includes venv and the py launcher, but doesn't include setuptools or pip...)

On Sun, May 31, 2015 at 9:00 PM, Andrew Barnert <abarnert@yahoo.com> wrote:
Just to be clear, I'm not suggesting changing the python executable itself, or any of the other tools already in existence. My proposal is a separate wrapper around existing python, pip and venv which would not change anything about the way they work currently. A dev environment set up using it could still be deployed in the same way it would be now, and there would still be the option of using virtualenvwrapper, or something else, for those that want to. It is obviously way too early to try to get it included in the next python release (apart from anything else, pip would need to be added first), so really this proposal is meant more to gauge interest in the concept, so that if it is popular I can carry on developing it and preparing it for inclusion in the stdlib, or at least a serious discussion about including it, once it is mature.

That said, Andrew's arguments have convinced me that much could be done to improve existing tools before creating a new one, although I still don't believe virtualenvwrapper can be squashed into the shape I'm aiming for without fundamental changes. Also, from the other responses so far it seems that the general feeling is that handling of requirements could definitely be improved, but that anything too prescriptive with venvs would be problematic. Unfortunately for my proposal, if something like what I'm suggesting were officially supported via inclusion in the stdlib it would quickly become, at best, the "strongly recommended" way of working and at worst the One Obvious Way.

With all this in mind, I'll withdraw my proposal, but continue development on my version and see if it goes anywhere. I'll also see how much of its functionality I can put into other tools (specifically pip's requirements handling) instead.

Sorry, I am on mobile, but I'd like to chime in with a concern. David wrote:
This is his response to Stephen's comment. I'd like to point out that personally I haven't found a use case in development where I have trouble with `source bin/activate` and continuing with my life. But in various places, including the #python channel, I would hear helpers strongly advise running /PATH/TO/VIRTUALENV/bin/python, which seems like a great idea, especially in the case of writing an app startup script. So I'm not sure how likeable autoenv and virtualenvwrapper are if they configure something on behalf of users, and when users run into trouble they still have to unfold the docs to find the "Ohhh" moment. What I'm suggesting is that I feel these recommendations are kind of contradictory. Maybe I am not convinced why source activate is bad yet, because I have not really seen the pain with a concrete example, just always someone telling me that it is a bad idea.

Frankly, I never like to poison my source directory with node_modules and have to reconfigure where to save npm modules. So I still don't gain much. But having pip recognize that requirements.txt is there and install from it can be helpful. But helpful has to come with a price... Is it better for a user to learn what they are doing now, or have them enjoy an easy ride and then find a mud hole?

On Sunday, May 31, 2015, David Townshend <aquavitae69@gmail.com> wrote:
-- Sent from Jeff Dean's printf() mobile console

On 31.05.2015 18:19, David Townshend wrote:
If you want to have a system that doesn't require activation, you may want to take a look at what we've done with PyRun: http://www.egenix.com/products/python/PyRun/ It basically takes the "virtual" out of virtualenvs. Instead of creating a local symlinked copy of your host Python installation, you create a completely separate Python installation (which isn't much heavier than a virtualenv due to the way this is done). Once installed, everything works relative to the PyRun binary, so you don't need to activate anything when running code inside your installation: you just need to run the right PyRun binary, and this automatically gives you access to everything else you installed in your environment. In our latest release, we've added requirements.txt support to the installation helper install-pyrun, so that you can run install-pyrun -r requirements.txt . to bootstrap a complete project environment with one command.

-- Marc-Andre Lemburg, eGenix.com

Hi, On 06/01/2015 03:05 AM, M.-A. Lemburg wrote:
If you want to have a system that doesn't require activation, you may want to take a look at what we've done with PyRun:
Virtualenv doesn't require activation either. Activation is a convenience for running repeated commands in the virtualenv context, but all it does is change your shell PATH; you can explicitly specify the virtualenv's python binary and never use activation, if you wish.
Virtualenv doesn't create "a local symlinked copy of your host Python installation." It copies the binary, symlinks a few key stdlib modules that are necessary to bootstrap site.py, and then its custom site.py finds the host Python's stdlib directory and adds it to `sys.path`.
This is exactly how virtualenv (and pyvenv in Python 3.3+) works. Everything is relative to the Python binary in the virtualenv (this behavior is built into the Python executable, actually). You can just directly run the virtualenv's Python binary (or any script with that Python binary in its shebang, which includes all pip- or easy_install-installed scripts in the virtualenv's bin/ dir), without ever activating anything. It seems the main difference between virtualenv and PyRun is in how much of the standard library is bundled with each environment, and that I guess PyRun doesn't come with any convenience activation shell script? But the method by which "activation" actually occurs is identical (at least as far as you've described it here.) Carl
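To make this concrete, a minimal sketch of running an environment's interpreter without any activation step, assuming a venv was created at ./env with `python3 -m venv env` (the path is just an example):

    # run_in_env.py -- demonstrates that invoking the venv's own binary is
    # enough: sys.prefix inside the child process points at the environment.
    import pathlib
    import subprocess

    # On Windows the interpreter lives at env\Scripts\python.exe instead.
    env_python = pathlib.Path("env") / "bin" / "python"
    subprocess.call([str(env_python), "-c", "import sys; print(sys.prefix)"])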

On 01.06.2015 19:58, Carl Meyer wrote:
Ok, I was always under the impression that the activation script also does other magic to have the virtualenv Python find the right settings. That's good to know, thanks.
Well, this is what I call a symlinked copy :-) It still points to the system installed Python for the stdlib, shared mods and include files.
The main difference is that PyRun is a stand-alone Python runtime which doesn't depend on the system Python installation at all. We created it to no longer have to worry about supporting dozens of different Python installation variants on Unix platforms and it turned out to be small enough to just always use instead of virtualenv.
After what you've explained, the sys.path setup is indeed very similar (well, PyRun doesn't really need much of it since almost the whole Python stdlib is baked into the binary). What virtualenv doesn't appear to do is update sysconfig to point to the virtualenv environment instead of the host system.
-- Marc-Andre Lemburg, eGenix.com
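A quick way to inspect the behavior MAL describes is to compare what sys and sysconfig report from inside a given environment; a small sketch, assuming a Python 3.3+ interpreter:

    # Run with an environment's own interpreter (no activation needed).
    import sys
    import sysconfig

    print(sys.prefix)                                # the (virtual) environment
    print(getattr(sys, "base_prefix", sys.prefix))   # the host installation (3.3+)
    print(sysconfig.get_path("stdlib"))              # where sysconfig thinks the stdlib lives
    print(sysconfig.get_path("purelib"))             # where sysconfig thinks site-packages lives

Whether the sysconfig paths point into the environment or back at the host system is exactly the difference being discussed here.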

On Sun, May 31, 2015 at 09:16:57AM +0200, David Townshend wrote:
I don't think this is the right place to discuss either of those ideas. pip is not part of either the Python language or the standard library (apart from the very narrow sense that the most recent versions of Python include a tool to bootstrap pip). I think you should submit them on whatever forum pip uses to discuss feature suggestions. -- Steve

Steven D'Aprano writes:
I don't think this is the right place to discuss either of those ideas.
I think you're missing the point -- this is part of the larger discussion on packaging, as Alexander recognized ("shoot this over to distutils-sig", he said). While technically it may belong elsewhere (distutils-sig, for example), the amount of attention it's attracting from core committers right now suggests that it's a real pain point, and should get discussion from the wider community while requirements are still unclear. While I'm not one for suggesting that TOOWTDI is obvious in advance (and not even if you're Dutch), surely it's worth narrowing down the field by looking at a lot of ideas.

On 31 May 2015 at 23:10, Stephen J. Turnbull <stephen@xemacs.org> wrote:
There are a plethora of environment management options out there, and https://github.com/pypa/python-packaging-user-guide/issues/118 discusses some of them (focusing specifically on the ad hoc environment management side of things rather than VCS linked environment management, though).

The npm model in particular unfortunately gets a lot of its "simplicity" by isolating all the dependencies from each other during component development (including freely permitting duplicates and even different versions of the same component), so you get the excitement of live integration at runtime instead of rationalising your dependency set as part of your design and development process (see https://speakerdeck.com/nzpug/francois-marier-external-dependencies-in-web-a... ).

As developers, we can make our lives *very* easy if we're happy to discount the interests of other folks that are actually tasked with deploying and maintaining our code (either an operations team if we have one, or at the very least future maintainers if we don't). So while there are still useful user experience lessons to be learned from npm, they require careful filtering to ensure they actually *are* a simplification of the overall user experience, rather than cases where the designers of the system have made things easier for developers working on the project itself at the expense of making them harder for operators and end users that just want to install it (potentially as part of a larger integrated system).

Cheers, Nick.

P.S. I've unfortunately never found the time to write up my own packaging system research properly, but https://bitbucket.org/ncoghlan/misc/src/default/talks/2013-07-pyconau/packag... has some rough notes from a couple of years ago, while https://fedoraproject.org/wiki/Env_and_Stacks/Projects/UserLevelPackageManag... looks at the general problem space from an operating system developer experience design perspective.

-- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia

On May 31, 2015 at 11:05:24 AM, Nick Coghlan (ncoghlan@gmail.com) wrote:
One of the things that make npm a lot simpler is that their "virtualenv" is implicit and the default, and you have to go out of your way to get a "global" install. It would be possible to add this to Python by doing something like ``sys.path.append("./.python-modules/")`` (but it also needs to recurse upwards) in the Python startup (and possibly some file you can put in that folder so that it doesn't add the typical site-packages or user-packages to the sys.path). This makes it easier to have isolation be the default; however, it comes with its own problems. It becomes a lot harder to determine what's going to happen when you type ``python``, since you have to inspect the entire directory hierarchy above you looking for a .python-modules directory. There's also the problem that binary scripts tend to get installed into something like .python-modules/bin/ or so in that layout, but that's rarely what people want. The npm community "solved" this by having the actual CLI command be installable on its own, so that it calls into the main program that you have installed per project. --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
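A hedged sketch of the upward-recursing search Donald describes (".python-modules" is his hypothetical directory name, not an existing convention, and this is not how CPython actually starts up):

    # python_modules_path.py -- illustrative only.
    import os
    import sys

    def find_python_modules(start=None):
        # Walk from start (default: cwd) up to the filesystem root,
        # returning the first .python-modules directory found.
        d = os.path.abspath(start or os.getcwd())
        while True:
            candidate = os.path.join(d, ".python-modules")
            if os.path.isdir(candidate):
                return candidate
            parent = os.path.dirname(d)
            if parent == d:  # reached the filesystem root
                return None
            d = parent

    mods = find_python_modules()
    if mods:
        # Put the project-local modules ahead of the usual site-packages.
        sys.path.insert(0, mods)

The cost Donald points out is visible here: what ``python`` will import now depends on every directory between the filesystem root and wherever you happen to be standing.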

participants (12)

- Alexander Walters
- Andrew Barnert
- André Freitas
- Carl Meyer
- David Townshend
- Donald Stufft
- John Wong
- M.-A. Lemburg
- Nick Coghlan
- Stephen J. Turnbull
- Steven D'Aprano
- Wes Turner