I love pipx and I'm glad it exists at this point, because it makes installing Python applications in isolated environments easy.
The main issue is that each virtualenv takes space, lots of space.
I have currently 57 apps installed via pipx on my laptop, and the 57
environments take almost 1 GB already.
~ cd .local/pipx/venvs/
~/.l/p/venvs ls
abilian-tools/ concentration/ gitlabber/ pygount/ sphinx/
ansible/ cookiecutter/ httpie/ pyinfra/ tentakel/
assertize/ cruft/ isort/ pylint/ tlv/
autoflake/ cython/ jupyterlab/ pyre-check/ towncrier/
black/ dephell/ lektor/ pytype/ tox/
borgbackup/ docformatter/ md2pdf/ pyupgrade/ twine/
borgmatic/ flake8/ medikit/ radon/ virtualenv/
bpytop/ flit/ mypy/ re-ver/ virtualfish/
check-manifest/ flynt/ nox/ sailboat/ vulture/
clone-github/ gh-clone/ pdoc3/ salvo/
cloneall/ ghtop/ pdocs/ shed/
com2ann/ gitchangelog/ pybetter/ sixer/
~/.l/p/venvs du -sh .
990M .
~/.l/p/venvs ls | wc
57 57 475
There is probably a way to share common packages across environments (e.g.
via symlinks or hard links) and reduce the footprint of these installations.
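For illustration, here is a rough way to estimate how much of that ~1 GB is duplication. This is a sketch, not a pipx feature: the venv root (~/.local/pipx/venvs) and the POSIX lib/pythonX.Y/site-packages layout are assumptions based on a default pipx install, and package directories whose names differ from their dist-info name will be missed.

```python
# Sketch: estimate bytes wasted by packages installed in more than one
# pipx venv. Assumes the default pipx venv root and a POSIX venv layout.
from collections import defaultdict
from pathlib import Path


def dir_size(path: Path) -> int:
    """Total size in bytes of all files under path."""
    return sum(f.stat().st_size for f in path.rglob("*") if f.is_file())


def duplicated_bytes(venvs_root: Path) -> int:
    """Bytes saved if each (package, version) were stored only once."""
    sizes = defaultdict(list)  # "name-version" -> [size of each copy]
    for dist_info in venvs_root.glob("*/lib/*/site-packages/*.dist-info"):
        name_version = dist_info.name.removesuffix(".dist-info")
        # Heuristic: the package's code lives in a directory named after
        # the project (true for many, not all, distributions).
        pkg_dir = dist_info.parent / name_version.split("-")[0]
        if pkg_dir.is_dir():
            sizes[name_version].append(dir_size(pkg_dir))
    # Keeping one copy of each duplicated package saves the rest.
    return sum(sum(c) - max(c) for c in sizes.values() if len(c) > 1)


if __name__ == "__main__":
    root = Path.home() / ".local" / "pipx" / "venvs"
    if root.is_dir():
        print(f"~{duplicated_bytes(root) / 2**20:.0f} MiB duplicated")
```

Running it on a venv tree like the one above gives a lower bound on what symlink-based sharing could reclaim.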
Still, I'm glad that pipx exists as it is now, and that it has been
packaged on Ubuntu 20.04 and later (and probably other distros as well).
Having pipx (or something similar) installed by the distro, with the distro
focused on packaging only the packages it needs for its own sake,
means that we could move past the controversies between the Python community
and the Debian (or other distros') packager community, which are rooted in
different goals and assumptions, such as this one:
https://gist.github.com/tiran/2dec9e03c6f901814f6d1e8dad09528e
S.
On Wed, Feb 24, 2021 at 2:28 AM Paul Bryan
I think it's a classic case of dependency hell.
OS packagers rebundle Python packages as OS packages and express their own OS-package dependency graphs. Then, when you sudo pip install something that has a conflicting dependency, it bypasses OS packaging, and *boom*.
I find tools like pipx https://pipxproject.github.io/pipx/ go a long way to solve this, as they install a Python package and all of its dependencies in its own venv. This is great for Python apps, and (kinda) treats them like apps on platforms like Android, where all app dependencies are bundled and isolated from others.
I think it would be great if OS vendors did something similar to pipx for Python-based apps: bundle the app and all of its dependencies into its own venv.
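The per-app-venv idea can be sketched in a few lines with the standard library. This is a simplified illustration of the approach, not pipx's actual implementation: it assumes a POSIX bin/ layout, the app name is illustrative, and install_app needs network access to reach PyPI.

```python
# Sketch: install an app and all of its dependencies into a venv of its
# own, leaving the system's site-packages untouched (the pipx approach).
import subprocess
import venv
from pathlib import Path


def create_app_venv(app: str, root: Path) -> Path:
    """Create a dedicated venv for `app`; returns its bin/ directory."""
    env_dir = root / app
    venv.EnvBuilder(with_pip=True, clear=True).create(env_dir)
    return env_dir / "bin"  # POSIX layout; Windows uses Scripts/


def install_app(app: str, root: Path) -> None:
    bin_dir = create_app_venv(app, root)
    # Using the venv's own pip confines the app and its dependency
    # graph to this environment.
    subprocess.run([str(bin_dir / "pip"), "install", app], check=True)
    # pipx then symlinks bin_dir / app onto the user's PATH so the app
    # is callable while its libraries stay isolated.
```

Dependency conflicts between two apps become impossible by construction, at the cost of the disk-space duplication discussed above.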
On Tue, 2021-02-23 at 19:45 -0500, Random832 wrote:
I was reading a discussion thread <https://gist.github.com/tiran/2dec9e03c6f901814f6d1e8dad09528e> about various issues with the Debian packaged version of Python, and the following statement stood out for me as shocking:
Christian Heimes wrote:
Core dev and PyPA has spent a lot of effort in promoting venv because we don't want users to break their operating system with sudo pip install.
I don't think sudo pip install should break the operating system. And I think if it does, that problem should be solved rather than merely advising users against using it. And why is it, anyway, that distributions whose package managers can't coexist with pip-installed packages don't ever seem to get the same amount of flak for "damaging python's brand" as Debian is getting from some of the people in the discussion thread? Why is it that this community is resigned to recommending a workaround when distributions decide the site-packages directory belongs to their package manager rather than pip, instead of bringing the same amount of fiery condemnation of that practice as we apparently have for *checks notes* splitting parts of the stdlib into optional packages? Why demand that pip be present if we're not going to demand that it works properly?
I think that installing packages into the actual python installation, both via distribution packaging tools and pip [and using both simultaneously - the Debian model of separated dist-packages and site-packages folders seems like a reasonable solution to this problem] can and should be a supported paradigm, and that virtual environments [or more extreme measures such as shipping an entire python installation as part of an application's deployment] should ideally be reserved for the rare corner cases where that doesn't work for some reason.
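The Debian model mentioned above works because the interpreter simply searches a list of directories, and where pip installs to is a per-interpreter setting. As an illustration, one can ask an interpreter which directory it considers its default install target; the Debian-specific paths in the comments are illustrative, not computed by the snippet.

```python
# Sketch: inspect where `pip install` would place pure-Python libraries
# for this interpreter.
import sysconfig

purelib = sysconfig.get_paths()["purelib"]
print(purelib)
# On a Debian/Ubuntu system Python this typically ends in
# .../dist-packages, while unpatched CPython uses .../site-packages.
# With two distinct directories on sys.path, apt-managed and
# pip-managed packages can coexist without clobbering each other.
```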
How is it that virtual environments have become so indispensable that no one considers installing libraries centrally to be a viable model anymore? Are library maintainers making breaking changes too frequently, reasoning that if someone needs the old version they can just venv it? Is there some other cause?
_______________________________________________
Python-ideas mailing list -- python-ideas@python.org
To unsubscribe send an email to python-ideas-leave@python.org
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at https://mail.python.org/archives/list/python-ideas@python.org/message/KMRNKS...
Code of Conduct: http://python.org/psf/codeofconduct/
_______________________________________________
Python-Dev mailing list -- python-dev@python.org
To unsubscribe send an email to python-dev-leave@python.org
https://mail.python.org/mailman3/lists/python-dev.python.org/
Message archived at https://mail.python.org/archives/list/python-dev@python.org/message/2VYJAJHH...
Code of Conduct: http://python.org/psf/codeofconduct/
--
Stefane Fermigier - http://fermigier.com/ - http://twitter.com/sfermigier - http://linkedin.com/in/sfermigier
Founder & CEO, Abilian - Enterprise Social Software - http://www.abilian.com/
Chairman, National Council for Free & Open Source Software (CNLL) - http://cnll.fr/
Founder & Organiser, PyParis & PyData Paris - http://pyparis.org/ & http://pydata.fr/