Pip and venv have done a lot to improve the accessibility and ease of installing Python packages, but I believe there is still a lot of room for improvement. I only realised how cumbersome I find working with Python packages when I recently spent a lot of time on a JavaScript project using npm. A bit of googling turned up several articles comparing pip, venv and npm, and all of them seemed to say the same thing: pip/venv could learn a lot from npm. My proposal revolves around two issues:
- Setting up and working with virtual environments can be onerous. Creating one is easy enough, but using them means remembering to run `source activate` every time, which also means remembering which venv is used for which project. Not a major issue, but still an annoyance.
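To make the first point concrete, here is the routine as it stands today (commands for Linux/macOS with Python 3.3+; `myenv` is just a placeholder name):

```shell
# One-off: create the environment for this project
python3 -m venv myenv

# Every new shell session: remember to activate the right env
# for the right project before doing anything else
source myenv/bin/activate

# Only now does `python` point inside the venv
python -c 'import sys; print(sys.prefix)'
```

Forgetting the `source` step silently uses the system interpreter instead, which is exactly the kind of per-session bookkeeping the proposal below aims to remove.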
- Managing lists of required packages is not nearly as easy as in npm, since there is no equivalent to `npm install --save ...`. The best that pip offers is `pip freeze`. However, using that is a) an extra step to remember and b) includes all implied dependencies, which is not ideal.
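The contrast looks something like this (using `python3 -m pip`, which is equivalent to invoking `pip` directly):

```shell
# npm records direct dependencies as you install them:
#   npm install --save express    # adds just "express" to package.json

# pip's nearest equivalent snapshots the entire environment instead,
# mixing the packages you asked for with their transitive dependencies:
python3 -m pip freeze > requirements.txt
```

So a `requirements.txt` produced this way cannot distinguish what you actually depend on from what merely came along for the ride.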
My proposal is to use a similar model to npm, where each project has a `venvrc` file which lets Python-related tools know which environment to use. In order to showcase the sort of functionality I'm proposing, I've created a basic example on github (https://github.com/aquavitae/pyle). This is currently py3.4 on linux only and very pre-alpha. Once I've added a few more features that I have in mind (e.g. multiple venvs) I'll add it to PyPI, and if there is sufficient interest I'd be happy to write up a PEP for getting it into the stdlib.
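Purely for illustration (this is a hypothetical format, not necessarily what the linked repo actually uses), a `venvrc` could be as simple as:

```
# venvrc -- lives in the project root; python-related tools read it
# to discover which environment to use, no manual activation needed
venv = ~/.venvs/myproject
```

Tools would search upwards from the current directory for this file, much as npm locates `package.json`, so the right environment follows the project rather than the shell session.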
Does this seem like the sort of tool that would be useful in the stdlib?
Regards
David
_______________________________________________
Python-ideas mailing list
Python-ideas@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/