I think it's a classic case of dependency hell. OS packagers rebundle Python packages as OS packages and express their own OS-package dependency graphs. Then you sudo pip install something with a conflicting dependency, it bypasses the OS packaging, and *boom*. I find tools like pipx go a long way toward solving this, since they install a Python package and all of its dependencies into a venv of its own. This is great for Python apps, and (kinda) treats them like apps on platforms such as Android, where each app's dependencies are bundled and isolated from other apps. I think it would be great if OS vendors did something similar to pipx for Python-based apps: bundle the app and all of its dependencies into its own venv.
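For concreteness, here is a minimal sketch of what a pipx-style install amounts to (the app name and directory layout are illustrative assumptions, not pipx's actual internals):

    import os
    import subprocess
    import venv

    app = "someapp"  # hypothetical app name, for illustration only
    env_dir = os.path.expanduser(f"~/.local/venvs/{app}")

    # Create an isolated environment with its own pip.
    venv.EnvBuilder(with_pip=True).create(env_dir)

    # Install the app and all of its dependencies into that venv only;
    # the OS-managed site-packages is never touched.
    pip = os.path.join(env_dir, "bin", "pip")  # Scripts\pip.exe on Windows
    subprocess.run([pip, "install", app], check=True)

    # A pipx-like tool would then expose the app's console script on
    # PATH, e.g. by symlinking it into ~/.local/bin.

Nothing in the app's dependency graph can conflict with anything the OS package manager owns, because the two never share a directory.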
On Tue, 2021-02-23 at 19:45 -0500, Random832 wrote:
I was reading a discussion thread https://gist.github.com/tiran/2dec9e03c6f901814f6d1e8dad09528e about various issues with the Debian-packaged version of Python, and the following statement stood out to me as shocking:
Christian Heimes wrote:
Core dev and PyPA has spent a lot of effort in promoting venv because we don't want users to break their operating system with sudo pip install.
I don't think sudo pip install should break the operating system. And I think if it does, that problem should be solved, rather than users merely being advised against it. And why is it, anyway, that distributions whose package managers can't coexist with pip-installed packages never seem to get the same amount of flak for "damaging python's brand" as Debian is getting from some of the people in the discussion thread? Why is it that this community is resigned to recommending a workaround when distributions decide the site-packages directory belongs to their package manager rather than pip, instead of bringing the same amount of fiery condemnation of that practice as we apparently have for *checks notes* splitting parts of the stdlib into optional packages? Why demand that pip be present if we're not going to demand that it works properly?
I think that installing packages into the actual Python installation, both via distribution packaging tools and pip [and using both simultaneously - the Debian model of separate dist-packages and site-packages directories seems like a reasonable solution to this problem], can and should be a supported paradigm, and that virtual environments [or more extreme measures such as shipping an entire Python installation as part of an application's deployment] should ideally be reserved for the rare corner cases where that doesn't work for some reason.
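For concreteness, you can ask any interpreter where it installs packages; the comment below is an assumption about a typical Debian setup rather than a guarantee for every release:

    import sysconfig

    # Under Debian's patched system Python this typically reports a
    # dist-packages directory under /usr; inside a venv it reports the
    # venv's own site-packages, so the two sets of files never collide.
    print(sysconfig.get_path("purelib"))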
How is it that virtual environments have become so indispensable that no-one considers installing libraries centrally to be a viable model anymore? Are library maintainers making breaking changes too frequently, reasoning that if someone needs the old version they can just venv it? Is there some other cause?